
WO2025047451A1 - Information processing device, information processing method, and program - Google Patents


Info

Publication number
WO2025047451A1
Authority
WO
WIPO (PCT)
Prior art keywords
content
user
information
match
calculation unit
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
PCT/JP2024/029079
Other languages
French (fr)
Japanese (ja)
Inventor
竜也 小枝
麦 森國
英二 新谷
陽 野々山
玄 阿部
圭祐 伊藤
良広 清水
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sony Group Corp
Original Assignee
Sony Group Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Group Corp
Publication of WO2025047451A1

Classifications

    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/60: Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor
    • A63F 13/69: Generating or modifying game content by enabling or updating specific game elements, e.g. unlocking hidden features, items, levels or versions
    • G: PHYSICS
    • G06: COMPUTING OR CALCULATING; COUNTING
    • G06Q: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q 50/00: Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q 50/10: Services

Definitions

  • This technology relates to information processing devices, information processing methods, and programs, and in particular to services that provide content related to actual matches.
  • Patent Document 1 describes a system that acquires information on the progress of a match and the details of plays at the match venue in real time, centrally manages it at a center, and provides it to third parties.
  • Patent Document 2 describes a communication processing system capable of sequentially reporting match information of a plurality of matches held in a plurality of stadiums.
Patent Document 1: JP 2002-268806 A. Patent Document 2: Japanese Patent Application Publication No. 8-299525.
  • This disclosure therefore proposes technology for providing services that can enhance the enjoyment of watching a match for spectators.
  • An information processing device according to the present technology includes a calculation unit that performs: a determination process for determining, based on match information (information related to the match), whether or not a target play for which content is to be generated has occurred during a competitive match; a generation process for generating content related to the target play in accordance with the determination that the target play has occurred; and an editing process for generating edited content by editing that content based on user information related to the user to whom the service is provided. For example, during a game of baseball, soccer, or another sport, spectacular or memorable plays are determined to be plays for which content is to be generated, and content is generated in response to the occurrence of such plays. In addition, edited content is generated in which the content is added to or changed based on the user information of the user to whom the service is provided.
  • FIG. 1 is an explanatory diagram of an information providing system according to an embodiment of the present technology.
  • FIG. 2 is a block diagram of the information providing system according to the embodiment.
  • FIG. 3 is an explanatory diagram of a functional configuration of the information providing system according to the embodiment.
  • FIG. 4 is a block diagram of an information processing apparatus according to the embodiment.
  • FIG. 5 is an explanatory diagram of an example screen of the match linkage service according to the embodiment.
  • FIG. 6 is an explanatory diagram of a locker room screen according to the embodiment.
  • FIG. 7 is an explanatory diagram of a key moment notification screen according to the embodiment.
  • FIGS. 8 and 9 are explanatory diagrams of key moment content according to the embodiment.
  • FIG. 10 is an explanatory diagram of post-edit key moment content according to the embodiment.
  • FIG. 11 is an explanatory diagram of an item corresponding to content according to the embodiment.
  • FIGS. 12 to 17 are explanatory diagrams of post-edit key moment content according to the embodiment.
  • FIG. 18 is a flowchart of an example of a content providing process according to the embodiment.
  • FIG. 19 is a flowchart of an example of a key moment determination according to the embodiment.
  • FIGS. 20 to 23 are flowcharts of examples of a content editing process according to the embodiment.
  • FIG. 24 is a flowchart of an example of a content playback process according to the embodiment.
  • In the present disclosure, “image” refers to both moving images and still images.
  • The imaging device 10 shown in Fig. 1 is assumed to mainly capture moving images, but the image displayed on the terminal device 5 may be a moving image or a still image.
  • “Image” also refers to an image that is actually displayed on a screen, while “image” in the signal processing and transmission path before it is displayed on the screen refers to image data.
  • Contents and digital items are mainly provided as images; they may be still images, videos, pseudo-videos made up of a small number of still images, etc. These images may or may not include audio.
  • FIG. 1 shows an overview of an information providing system 1 according to an embodiment.
  • The information providing system 1 includes an imaging device 10, a main server 2, a score data server 3, a sensor 4, and a terminal device 5. These are connected to each other via wired or wireless communication, network communication, or the like.
  • This information provision system 1 is a system that realizes a service (hereinafter referred to as a "match-linked service") that provides a virtual space, digital items, and various contents linked to a match to a user having a terminal device 5.
  • Users of the match-linked service are assumed to be fans of teams or players. When these users watch a match, the virtual space, image content, game content, digital items, and the like that are generated or updated according to the match or the user are provided to the terminal device 5. This allows users to enjoy the entertainment provided by the match-linked service in addition to the entertainment of watching the match itself.
  • the multiple imaging devices 10 in the information provision system 1 capture images of a subject area of a sports field such as a baseball stadium, for example, the state of a game, from various positions. Although multiple imaging devices 10 are shown, it is sufficient that at least one imaging device 10 is provided. In the case of baseball, for example, the imaging device 10 is used to capture images for analyzing the movements of the pitcher, fielders, batters, and runners, and for determining the trajectory, type of ball, speed, rotation, and the like.
  • skeletal capture data of the subject player is extracted from the image captured by the imaging device 10, and the player's movements, such as pitching movements, batting movements, and base running movements, can be determined based on the skeletal capture data.
  • the movements of playing objects such as balls and bats may also be determined.
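  • As one illustration of how skeletal capture data might be used for such movement determination, the sketch below estimates the ball-release instant of a pitching motion as the moment of peak wrist speed. The data layout, units, and speed threshold are assumptions made for explanation and are not specified in this description.

```python
# Minimal sketch: estimate the ball-release instant of a pitching motion
# from skeletal capture data, taken here as timestamped wrist keypoints.
# The data layout, units, and threshold are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class WristSample:
    t: float  # seconds from the start of the captured pitch
    x: float  # wrist position in metres (field coordinates)
    y: float
    z: float

def estimate_release_time(samples: list[WristSample],
                          min_speed: float = 15.0) -> float | None:
    """Return the time of peak wrist speed if it exceeds min_speed (m/s)."""
    best_t, best_v = None, 0.0
    for a, b in zip(samples, samples[1:]):
        dt = b.t - a.t
        if dt <= 0:
            continue
        v = ((b.x - a.x) ** 2 + (b.y - a.y) ** 2 + (b.z - a.z) ** 2) ** 0.5 / dt
        if v > best_v:
            best_t, best_v = (a.t + b.t) / 2, v
    return best_t if best_v >= min_speed else None
```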
  • Such movement information is known as EPTS (Electronic Performance and Tracking Systems) data.
  • The imaging device 10 captures images for obtaining EPTS data such as skeletal capture data. Moreover, the images captured by the imaging device 10 can also be used as actual images of the match or the like.
  • the EPTS data generated based on images captured by the imaging device 10 is transmitted to the main server 2.
  • For example, if an information processing device (not shown) that records the images captured by the plurality of imaging devices 10 and generates EPTS data is provided at a sports stadium such as a baseball stadium, the EPTS data generated by that information processing device is transmitted to the main server 2.
  • the captured image obtained by the imaging device 10 may be transmitted to the main server 2, and the main server 2 may generate the EPTS data.
  • Sensor 4 is a sensor that detects the movements of players, the ball, etc. Specifically, it is assumed that this is a sensor attached to the player or the ball, such as an acceleration sensor, gyro sensor, GPS (Global Positioning System) sensor, altitude sensor, or impact sensor. For example, it may be a wearable sensor attached to the player's uniform or body, or it may be a sensor built into the ball. It may also be a radar, etc.
  • the score data server 3 is an information processing device of an organization that generates and distributes, for example, baseball score data SD.
  • the score data server 3 is a server device that can transmit the score data SD of a game to the main server 2.
  • the score data server 3 may transmit the score data SD to the main server 2 successively during a match, or may transmit all of the score data SD for that match all at once, for example, after the match.
  • The score data SD includes various information about the game. For example, it includes not only information about the game as a whole, such as the score for each inning and the participating players, but also information about each pitch thrown by the pitcher, such as the time of the pitch, the pitcher's name, the batter's name, whether the batter is left-handed or right-handed, whether there are runners on first/second/third base, the runners' names, the ball/strike count, the out count, the batting result (hit, strikeout, swinging miss, foul, ground out, fly out, foul fly, etc.), the ball speed, the type of pitch, the number of rotations, the axis of rotation, the course, and other various information.
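  • One way to picture the per-pitch portion of the score data SD described above is as a structured record. The field names below are illustrative assumptions; the actual schema used by the score data server 3 is not specified here.

```python
# Illustrative shape of one per-pitch record in the score data SD.
# Field names are assumptions made for explanation only.
from dataclasses import dataclass, field

@dataclass
class PitchRecord:
    time: str                        # time of the pitch, e.g. "19:05:12"
    pitcher: str
    batter: str
    batter_handedness: str           # "L" or "R"
    runners: dict[str, str | None] = field(
        default_factory=lambda: {"1B": None, "2B": None, "3B": None})
    balls: int = 0
    strikes: int = 0
    outs: int = 0
    result: str = ""                 # "hit", "strikeout", "foul", "fly out", ...
    speed_kmh: float = 0.0
    pitch_type: str = ""             # "fastball", "slider", ...
    spin_rpm: float = 0.0
    spin_axis_deg: float = 0.0
    course: str = ""                 # location in the strike zone
```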
  • the terminal device 5 is, for example, an information processing device such as a smartphone, tablet terminal, or personal computer, but as described above, it is assumed that this terminal device 5 is a device carried by a general user.
  • the display unit 5a of the terminal device 5 displays key moment content (described below), a virtual space such as the locker room screen 140, and images including digital items through the match linkage service.
  • the main server 2 performs various processes for presenting information on the terminal device 5 as a match-linked service. For example, it performs processes such as determining the status of the match and players based on skeletal capture data of the subject generated from the image captured by the imaging device 10, and providing the terminal device 5 with key moment content, digital items, and images of the virtual space accordingly.
  • the processing for displaying images of the match linkage service on the terminal device 5 may be performed by an information processing device other than the cloud server.
  • an information processing device such as a personal computer installed at the match venue has the function of the main server 2 and performs processes such as acquiring skeletal capture data and generating images to be provided.
  • the terminal device 5 also has the function of the main server 2 and performs processes such as acquiring skeletal capture data and generating images.
  • This EPTS data generation unit 12 can generate EPTS data for the entire game, for example, information that can be used to determine the pitching motion, ball speed, trajectory, batter's behavior in response to the pitch, batting results, base running results, batter's speed and trajectory, etc., for each of the pitchers' pitches during the game.
  • EPTS data is an example of behavioral information about either the people (pitchers, batters, runners, etc.) or playing objects (balls, bats, etc.) associated with a game.
  • the EPTS data generation unit 12 can generate EPTS data from multiple captured images obtained by multiple imaging devices 10, and can also generate EPTS data from multiple captured images obtained by one imaging device 10. Furthermore, the EPTS data generation unit 12 can generate EPTS data from multiple images and sensing data SS from one or multiple sensors, and can also generate EPTS data from one captured image and sensing data SS from one sensor.
  • the EPTS data (skeleton capture data) generated by the EPTS data generation unit 12 is transmitted to the main server 2 .
  • Image data VD captured by the imaging device 10 is also transmitted to the main server 2 via, for example, an EPTS data generating unit 12 .
  • the sensing data SS may also be transmitted to the main server 2 and used directly for processing by the main server 2 .
  • the EPTS data generator 12 may be provided in the main server 2. In that case, for example, the captured image (image data VD) by the imaging device 10 and the sensing data SS of the sensor 4 may be transmitted to the EPTS data generator 12 in the main server 2 via network communication or the like.
  • the score data server 3 sequentially transmits the score data SD to the main server 2.
  • the main server 2 is configured with an information processing device such as a computer device, and is provided with a calculation unit 21 using a microprocessor or the like, a function as a presentation control unit 22, and storage 23 using a storage medium accessible by the microprocessor or the like.
  • the calculation unit 21 acquires information from the data server 30 and executes various processes for executing the match linkage service using the acquired information, such as data acquisition, match analysis, and generation of virtual space images, digital items, and content.
  • the processing functions of the calculation unit 21 will be described later with reference to FIG. 3.
  • the storage 23 represents a storage medium in the main server 2 for storing information such as received image data VD, skeleton capture data BD, score data SD, etc., and for storing various data generated by the calculation unit 21.
  • the storage 23 may be an internal storage medium of the main server 2 , or may be a storage medium separate from the information processing device that constitutes the main server 2 .
  • the presentation control unit 22 controls the display of various images generated by the processing of the calculation unit 21 in a predetermined manner on the terminal device 5. Specifically, the presentation control unit 22 performs processing to display images of virtual spaces, digital items, etc., generated by the calculation unit 21 during execution of the match linkage service on the display unit 5a of the terminal device 5.
  • the calculation unit 21, presentation control unit 22, and storage 23 may be provided in one information processing device, or may be provided separately in multiple information processing devices. There are various possible ways to realize the storage 23 as a storage medium.
  • Such images of the match linkage service or data for generating images processed by the main server 2 are transmitted to the terminal device 5 and displayed on the display unit 5a. This allows a user who owns the terminal device 5 to visually recognize images provided by the match linkage service. Furthermore, information on user interaction during the provision of the match linkage service may be input to the calculation unit 21 from the terminal device 5 .
  • FIG. 3 illustrates information transmitted from the data server 30 to the calculation unit 21, and the processing functions of the calculation unit 21.
  • This figure illustrates the processing functions of the match linkage service function 40, and the processing functions of a shared service that allows results of the match linkage service to be listed.
  • calculation unit 21 in this embodiment has a processing function of the match linkage service function 40.
  • the calculation unit 21 may have a processing function of the shared service function 50, but the shared service function 50 may also be provided by another server device.
  • The information supplied from the data server 30 includes EPTS data, sensing data SS, image data VD, stats data, historical data, analysis data, and the like.
  • The stats data and historical data are based on the score data SD from the score data server 3.
  • stats data is information on the performance of a player or team.
  • stats data for a baseball pitcher include the team name, league, number of wins, number of losses, earned run average, number of games pitched, number of complete games, number of shutouts, number of saves, number of save opportunities, number of innings pitched, number of runs allowed, number of home runs allowed, number of hit by pitch, number of walks allowed, number of intentional walks, number of strikeouts, hit rate, number of runners allowed per inning pitched (WHIP: Walks plus Hits per Inning Pitched), and ground ball/fly out ratio.
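  • Of the stats data items listed above, WHIP has a simple definition (walks plus hits divided by innings pitched); the short helper below illustrates it.

```python
# WHIP (Walks plus Hits per Inning Pitched), one of the stats data items above.
def whip(walks: int, hits: int, innings_pitched: float) -> float:
    """Example: whip(30, 110, 150.0) returns about 0.93."""
    if innings_pitched <= 0:
        raise ValueError("innings_pitched must be positive")
    return (walks + hits) / innings_pitched
```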
  • Historical data is also information about the performance of teams and players, and is included in stats data in a broad sense, but for the sake of explanation here it is shown as information such as past best records, team records, and individual records. For example, it includes information on records that may be achieved in the current game. For example, records of the number of strikeouts in a single game in the past, records of consecutive hits at bats, records of consecutive games played, and records of the number of team hits in a single game. Such historical data is information that can be referenced for editing special digital items in the game linkage service.
  • the analysis data indicates data determined by the function of the data analysis unit 32 regarding the EPTS data, the sensing data SS, and the image data VD.
  • the function of the data analysis unit 32 may be provided in the main server 2, or may be realized by another information processing device.
  • the analysis data include a specific action of a player, a trajectory of a ball, etc., determined from EPTS data, sensing data SS, etc.
  • The analysis data also includes tracking data of the movements of players and playing objects in a game. In baseball, the movements of players include the pitcher's pitching motion, the catcher's catching motion, the batter's batting motion, the fielders' defensive motions, the runners' running motions, and the like.
  • the overall movements of players in various sports can be obtained as tracking data.
  • the EPTS data, sensing data SS, image data VD, stats data, historical data, and analysis data supplied from the data server 30 to the calculation unit 21 are information related to the match, and will be referred to as match information.
  • In terms of content, this match information refers to information about the match itself and information about the players and teams playing in the match, and is distinguished from user information, which will be described later.
  • the information about the game itself includes the game date and time, the venue, the weather, the score, and information about the progress of the game (current inning, out count, etc.).
  • the player information includes information about each individual player, such as the player's performance in the match, past performance, images of the match, details of the play, and information about the player's movements.
  • Team information refers to information about the entire team, such as the team's match results, past results, game images, participating players, and substitute players.
  • the calculation unit 21 can perform processing for the match linkage service function 40.
  • the EPTS data and the analysis data may be generated by the calculation unit 21 based on the image data VD and the sensing data SS.
  • Information input from the terminal device 5 to the calculation unit 21 includes user information and information regarding purchases/listing/sharing/exchanges.
  • For the match information, there are two sources of information: information provided by the score data server 3, and information obtained based on signals from the imaging device 10 and the sensor 4. The content of the information obtained from these two types of information sources may be different or the same.
  • When considering the content types of match information, in this embodiment it is classified into "fixed match information" and "match progress information."
  • Fixed match information is information that does not change as the match progresses.
  • Examples include the venue, the start date and time of the game, the season, the teams, the starting members, weather information such as the weather and temperature at the start of the game, the players on the bench, player and team performance before the game (stats data, historical data), the referees, the type of game (official game, championship game, a game that affects the magic number for clinching the title, etc.), player condition before the game, player profiles (birthday, hometown, etc.), and so on.
  • Match progress information is information obtained as the match progresses. For example, there is the score of the game (game progress/result), innings/time/half in progress, weather changes during the game, player and team performance as the game progresses (stats data, historical data), participating players, information on each play during the game, player behavior during each play (EPTS information/sensing data SS/analysis data), image data VD of the game, achievement records, etc.
  • the user information refers to information that the main server 2 can obtain about the target user (the user who is logged in to the match linkage service function 40).
  • the user information is classified into "user fixed information” and "user status information” in terms of content type.
  • the user situation information is information that indicates the situation of the user with respect to the current match or the match-linked service, and includes, for example, information related to the user's location, and information according to the user's actions and interactions.
  • the information relating to the user's location includes, for example, the user's current location information, information about the facility or store where the user is currently located, etc. It may also include seat information in a stadium, area information, etc.
  • Information corresponding to the user's behavior and interaction includes the user's movement status, vocalizations, actions, cheers from the user (or the entire audience), answers to quizzes provided by the match collaboration service, information entered by the user regarding the current match indicating the team or player they are supporting, and information entered by the user during the match (for example, operation information, participation information, answer information, etc. regarding the quizzes provided by the match collaboration service).
  • information on the user's actions and interaction history may be added sequentially to the user's fixed information as history information.
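  • The four categories described above (fixed match information, match progress information, user fixed information, and user situation information) can be pictured as simple containers passed to the calculation unit 21. The sketch below is explanatory only, and its field names are assumptions rather than a schema defined by this description.

```python
# Explanatory sketch of the four information categories handled by the
# calculation unit 21. Field names are illustrative assumptions.
from dataclasses import dataclass, field

@dataclass
class FixedMatchInfo:            # does not change as the match progresses
    venue: str
    start_time: str
    teams: tuple[str, str]
    starting_members: list[str] = field(default_factory=list)

@dataclass
class MatchProgressInfo:         # obtained as the match progresses
    inning: int = 1
    score: tuple[int, int] = (0, 0)
    last_play: str = ""          # e.g. "home run", "strikeout"

@dataclass
class UserFixedInfo:             # registered attributes of the user
    user_id: str
    favorite_team: str = ""
    favorite_player: str = ""
    birthday: str = ""           # "MM-DD"

@dataclass
class UserSituationInfo:         # the user's situation for the current match
    location: str = "home"       # "stadium", "home", "sports bar", ...
    seat_area: str | None = None
    pressed_watch_button: bool = False
```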
  • the calculation unit 21 inputs the above-mentioned match information (fixed match information, match progress information) and user information (fixed user information, user status information) for processing by the match linkage service function 40.
  • the match analysis unit 43 performs processing to analyze the match based on match information and user information and determine the match situation. In addition, the match analysis unit 43 performs processing to determine the timing to request a user response regarding the service, and the timing to edit digital items and provide content, depending on the determination of the match situation.
  • the quiz generation unit 44 generates quizzes and questionnaires to be provided to the user and transmits them to the terminal device 5.
  • the quiz generation unit 44 can also perform processing to request user interaction in addition to quizzes and questionnaires.
  • the item/content generation unit 45 generates digital items.
  • the item/content generation unit 45 also generates image content such as digest videos of matches, free viewpoint images, and CG (computer graphics) images.
  • the item/content generation unit 45 also generates information that is ultimately associated with digital items.
  • the item/content generation unit 45 also generates and awards points, which will be described later.
  • the determination unit 45a determines that a specific type of play determined based on the match information is a target play.
  • the determination unit 45a can also determine the occurrence of a target play based on the match information and user information.
  • The generating and editing unit 45b performs a generating process to generate content (key moment content) related to a target play in response to a key moment determination. Furthermore, the generating and editing unit 45b performs an editing process for generating edited key moment content by adding to or modifying the content based on user information relating to the user to whom the service is provided.
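  • The division of roles between the determination unit 45a and the generating and editing unit 45b can be sketched as follows. The class and method names, and the dict used as a stand-in for content, are assumptions made for explanation.

```python
# Illustrative sketch of the split between the determination unit 45a and
# the generating/editing unit 45b. Names and signatures are assumptions.
class DeterminationUnit:                       # corresponds to 45a
    TARGET_PLAYS = {"home run", "strikeout", "fine play", "triple play"}

    def is_target_play(self, play: str, players_in_play: set[str],
                       favorite_player: str | None = None) -> bool:
        if play in self.TARGET_PLAYS:          # common key moment
            return True
        # individual key moment: e.g. a play involving the user's favorite player
        return favorite_player is not None and favorite_player in players_in_play

class GenerationEditingUnit:                   # corresponds to 45b
    def generate(self, play: str) -> dict:
        # KM content 300: a dict stands in here for the actual CG clip
        return {"play": play, "overlays": []}

    def edit(self, content: dict, watching: bool, location: str) -> dict:
        edited = {**content, "overlays": list(content["overlays"])}
        if watching or location == "stadium":  # add a spectating icon 301
            edited["overlays"].append("watching_icon")
        return edited                          # edited KM content 300S
```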
  • the storage control unit 46 performs the process of storing and reading information in the storage 23 regarding the match linkage service function 40.
  • the game processing unit 48 is a function that performs processing related to game content provided on the screen of the terminal device 5.
  • This game content is, for example, a game provided by an application program or a browser screen, and the game content types are diverse. However, this game has the functionality to bring about changes linked to actual match information and user information. For example, in-game parameters and game items may be changed depending on match information, user information, and digital items related to the locker room screen 140.
  • the calculation unit 21 of the main server 2 also has a shared service function 50 in addition to the match linkage service function 40 described above.
  • This shared service function 50 may include a function as an NFT (Non-Fungible Token) market.
  • the NFT market function is a service that allows users to list and sell NFT works that they have acquired, or purchase NFT works that are being listed.
  • the digital items generated by the match linking service function 40 are edited according to the progress of the match and user interaction, and ultimately become digital items unique to the user. These can then be put up for sale or purchased as NFT works in the sharing service function 50.
  • the shared service function 50 is provided with a smart contract function 51 and a payment function 52.
  • the smart contract function 51 is a function that automatically executes predetermined buying and selling processing in accordance with predetermined conditions for buying and selling transactions.
  • the settlement function 52 is a function for carrying out settlement processing regarding buying and selling.
  • the sharing service function 53 is a function that allows a user to share items, content, etc. that they own with others free of charge.
  • the exchange service function 54 is a function that allows a user to exchange items, content, etc. owned by the user for items, content, etc. owned by other people.
  • the match linkage service function 40 can offer some kind of digital item to such a sharing service function 50 and make it available for purchase by users.
  • In this way, through the shared service function 50, users of the match linkage service function 40 can sell digital items they have acquired, purchase digital items sold by others, share them with others, and exchange them for other people's digital items.
  • the functional configuration in FIG. 3 is just an example.
  • Some of the functions of the match linkage service function 40 can also be processed on the terminal device 5 side.
  • some or all of the functions of the quiz generation unit 44 , the match analysis unit 43 , the item/content generation unit 45 , and the game processing unit 48 may be executed outside the match linkage service function 40 .
  • some of the functions of the calculation unit 21 may be provided on the side of the terminal device 5.
  • a calculation unit including the item/content generation unit 45 may be provided on the terminal device 5.
  • the following describes the configuration of an information processing device 70 used in the information provision system 1 described above with reference to Figures 1, 2, and 3.
  • the main server 2, the terminal device 5, and the EPTS data generator 12 in Figure 2 can be realized by the information processing device 70 shown in Figure 4.
  • the information processing device 70 can be configured as, for example, a dedicated workstation, a general-purpose personal computer, a mobile terminal device, or the like.
  • the CPU 71 of the information processing device 70 shown in FIG. 4 executes various processes according to programs stored in a ROM 72 or a non-volatile memory unit 74, such as an EEPROM (Electrically Erasable Programmable Read-Only Memory), or programs loaded from a storage unit 79 to a RAM 73.
  • the RAM 73 also stores data necessary for the CPU 71 to execute various processes, as appropriate.
  • the image processing unit 85 is configured as a processor that performs various types of image processing.
  • it is a processor that can perform any of the following: image generation processing for video clips, image analysis processing for captured images, generation processing for animation images and CG images, DB (Data Base) processing, image effect processing, EPTS data generation processing, etc.
  • This image processing unit 85 can be realized, for example, by a CPU separate from the CPU 71, a graphics processing unit (GPU), a general-purpose computing on graphics processing units (GPGPU), an artificial intelligence (AI) processor, or the like.
  • the image processing unit 85 may be provided as a function within the CPU 71 .
  • the CPU 71, ROM 72, RAM 73, non-volatile memory unit 74, and image processing unit 85 are interconnected via a bus 83.
  • the input/output interface 75 is also connected to this bus 83.
  • An input unit 76 including an operator or an operating device is connected to the input/output interface 75.
  • the input unit 76 may be various operators or operating devices such as a keyboard, a mouse, a key, a dial, a touch panel, a touch pad, or a remote controller.
  • An operation by the user is detected by the input unit 76 , and a signal corresponding to the input operation is interpreted by the CPU 71 .
  • the input/output interface 75 is also connected, either integrally or separately, to a display unit 77 such as an LCD (Liquid Crystal Display) or an organic EL (Electro-Luminescence) panel, and an audio output unit 78 such as a speaker.
  • the display unit 77 performs various displays as a user interface.
  • the display unit 77 is, for example, a display device provided in the housing of the information processing device 70, or a separate display device connected to the information processing device 70.
  • the display unit 77 executes various image displays on the display screen based on instructions from the CPU 71.
  • the display unit 77 also displays various operation menus, icons, messages, etc., that is, GUIs (Graphical User Interfaces), based on instructions from the CPU 71.
  • the display unit 77 corresponds to the display unit 5a, and displays a screen using the match linkage service function 40, such as a screen including key moment content and digital items, which will be described later.
  • The input/output interface 75 may also be connected to a storage unit 79 configured with a solid state drive (SSD) or a hard disk drive (HDD), and to a communication unit 80 configured with a modem or the like.
  • The storage unit 79 can be considered as an example of the storage 23 in which information is stored by the storage control unit 46.
  • the communication unit 80 performs communication processing via a transmission path such as the Internet, and communication with various devices via wired/wireless communication, bus communication, and the like.
  • a drive 82 is also connected to the input/output interface 75 as required, and a removable recording medium 81 such as a flash memory, a memory card, a magnetic disk, an optical disk, or a magneto-optical disk is appropriately attached thereto.
  • the drive 82 allows data files such as image files and various computer programs to be read from the removable recording medium 81.
  • the read data files are stored in the storage unit 79, and images and sounds contained in the data files are output on the display unit 77 and the sound output unit 78.
  • the computer programs and the like read from the removable recording medium 81 are installed in the storage unit 79 as necessary.
  • software can be installed via network communication by the communication unit 80 or via a removable recording medium 81.
  • the software may be pre-stored in the ROM 72, the storage unit 79, etc.
  • the software provides the functions of the calculation unit 21 and presentation control unit 22 described above, and the CPU 71 and image processing unit 85 execute the processing of these functions.
  • the match linkage service is a service that provides digital items, image content, game content, virtual space, and the like related to a sport in accordance with match information related to a sport such as baseball.
  • FIG. 5 shows some examples of screens provided by the match linkage service function 40.
  • Examples include a top screen 100, a match screen 110, a cheering screen 120, a mini-game screen 130, and a locker room screen 140.
  • the user can view these screens by transitioning between them as desired.
  • the top screen 100 displays information about the ongoing match, images for screen transitions, images related to the logged-in user, etc.
  • icons 101, 102, 103, 104, and 105 are displayed as selectable items.
  • Icon 101 is an icon for an operation to transition to a locker room screen 140 as a personal page.
  • Icon 102 is an icon for an operation to transition to a cheering screen 120.
  • Icon 103 is an icon for an operation to transition to a match screen 110.
  • Icon 104 is an icon for an operation to transition to a mini-game screen 130.
  • Icon 105 is an icon for an operation to transition to a setting screen (not shown).
  • FIG. 6 shows an example of the locker room screen 140.
  • the icon 101 representing the locker room screen 140 is displayed in a different manner from the other icons.
  • For example, the frame of the icon 101 is displayed thicker than the frames of the other icons for emphasis.
  • the game screen 110 in FIG. 5 is a screen that provides the user with digital items that change according to the progress of the game being watched, a quiz for predicting the game, and the like.
  • Digital items are, for example, items related to a sport, and in the case of baseball, designs of baseball balls, bats, gloves, etc. are conceivable. For example, items may have a design in which a team logo or the like is added to a ball.
  • the match screen 110 displays information about digital items, the match, and the user. For example, team/score information, point information, etc. are displayed.
  • the point information is points for the user, that is, the user who has logged in to the match linkage service and is viewing the provided screen on the terminal device 5.
  • points are awarded to users according to the match situation and the user situation. For example, points are awarded according to the user's input or actions, the user's answers to quizzes, or the performance of players selected by the user.
  • the cheering screen 120 is a screen for fans to cheer on each other and get excited, providing communication between fans and allowing fans to share exciting moments. It is also possible that playback of key moment content, which will be described later, is performed on cheering screen 120.
  • the mini-game screen 130 is a screen that provides game contents to the user.
  • For example, game content such as batting games and pitching games is available, and users can enjoy these games.
  • pitching and batting in the game are set based on tracking data from an actual match, so that users can experience pitching and the like of actual players on the game screen.
  • the locker room screen 140 is a page that virtually displays a locker room as a user's home page.
  • An example of the locker room screen 140 is shown in FIG. 6.
  • the locker room screen 140 is prepared as a home page for each user who subscribes to the match linkage service, for example.
  • the locker room screen 140 in FIG. 6 is a screen provided for a user displayed as the user name 202, and serves as the home page when that user logs in to the match linkage service.
  • the locker 201 is set to, for example, a basic shape, but is displayed in various states for each user.
  • various items 203 can be placed on shelves or the floor inside the locker 201. The user can arbitrarily select an item 203 from among the items 203 he or she owns and place it in an arbitrary location.
  • Items 203 include items used in baseball, such as bats, helmets, hats, cleats, and uniforms. Items 203 also include items unrelated to baseball, such as casual clothing, plants, accessories, and food. There may also be items prepared for fans of teams and players, such as player figures, team character logo goods, team mascot dolls, and medals.
  • the match collaboration service provides items 203 as various virtual items, either for a fee or free of charge, so that each user can obtain them.
  • the user can, for example, purchase the item 203 in a virtual shop, obtain the item 203 as a prize for answering a quiz correctly, or exchange it with another person.
  • the user can place the item 203 obtained in this way in his or her own locker 201.
  • item 203A is an item that indicates key moment content, which will be described later.
  • When the user acquires key moment content, an item 203A corresponding to that key moment content is also given to the user and can be placed in the locker 201.
  • Users may also be able to customize the color of the locker 201 itself and the arrangement of the shelves.
  • a poster 204 is also attached to the locker 201.
  • This poster 204 functions as a display unit for various images. For example, still images and videos of the user's favorite player, and still images and videos posted by the user are displayed on the poster 204.
  • a virtual speaker 206 can be placed inside the locker 201 .
  • the user can obtain a speaker 206 provided as one of the items 203 by purchasing it from the item screen 250 or by having it presented to them based on quiz results, point rankings, match results, etc.
  • audio is output when the locker room screen 140 is viewed.
  • The user can play back sound or music on the terminal device 5 while viewing the locker room screen 140. For example, the theme song of a player is played. Also, if other users are commenting on the game, the user can listen to that commentary.
  • the locker room screen 140 also displays user information 205. For example, the like icon, number of likes, comment icon, number of comments, and a description of the user are displayed.
  • a unique locker room screen 140 is created for each user and placed within the virtual clubhouse. Therefore, a large number of virtual locker room screens 140 will be created by a large number of users.
  • Fig. 7 is an example of a key moment notification screen 121. This is a screen that is displayed when, for example, a key moment occurs while the user is viewing the cheering screen 120 on the terminal device 5 while watching a game, and key moment content 300 (hereinafter abbreviated as "KM content 300") shown in Fig. 8 and the like is generated.
  • the user can play back the edited KM content 300S acquired as item 203A at will.
  • the user can view the edited KM content 300S as shown in FIG. 10 by performing an operation on item 203A. This allows the user to remember the target play, and the watching icon 301 allows the user to recall that he or she was there watching the target play.
  • FIG. 13 shows the edited KM content 300S to which a commemorative icon 302 based on user information has been added. For example, if the day of a match is the user's birthday, a commemorative icon 302 indicating that it is the birthday is added. This allows the user to recall that the target play shown in the edited KM content 300S was a play on their birthday.
  • FIG. 14 shows an example in which information about the user's location when watching a game is used as a commemorative icon 302. For example, if the location of the user watching the game is not the stadium, a commemorative icon 302 indicating watching the game at home, watching the game at a store such as a sports bar, watching the game at a public viewing, etc. can be added. This allows the user to recall the location of the game where the target play was watched.
  • For a user watching the game at the stadium, the watching icon 301 may be an icon indicating on-site (local) viewing, and the commemorative icon 302 may be added to indicate the seat location or seat area (e.g., behind the backstop, infield seats on the first base side, outfield seats on the left field side, etc.).
  • So far, edited KM content 300S to which the spectator icon 301 and the commemorative icon 302 have been added has been described, but the content of the KM content 300 itself may also be changed by editing.
  • For example, the KM content 300 as shown in Fig. 8 is viewed initially, but when it is given to the user, it may be edited to change the viewpoint position.
  • the KM content 300 in Fig. 8 is a CG image from the viewpoint of the infield seats on the first base side, but if the user is watching the game from the infield seats on the third base side, the edited KM content 300S may be a CG image with the viewpoint position set to the infield seats on the third base side, as shown in Fig. 15.
  • the edited KM content 300S may be a CG image from the player's viewpoint.
  • FIGS. 16 and 17 show examples of editing with user-specific information, for example, the user's avatar.
  • FIG. 16 shows an example of edited KM content 300S in which a spectator icon 301 and an avatar 303 are added to the KM content 300.
  • FIG. 17 shows an example of editing the KM content 300, in which a CG image of a player is replaced with an avatar 303 of the user. In this way, it is possible to add information specific to a user, or to change the image contents using the specific information.
  • Such edited KM content 300S can be user specific.
  • <Processing example> Next, examples of processing by the main server 2 for implementing the processing related to the KM content 300 in the match linkage service will be described with reference to Figs. 18 to 23. These processes are examples of processing executed by the match linkage service function 40 of Fig. 3 in the calculation unit 21 of the main server 2, and in particular show processing related to the provision of content by the item/content generation unit 45.
  • The processing of the calculation unit 21 is executed by the CPU 71 of the information processing device 70 in FIG. 4 that functions as the main server 2. However, part of the processing may be performed on the terminal device 5 side.
  • the process of FIG. 18 is performed for each terminal device 5 of a user who is logged in to the match linkage service.
  • the calculation unit 21 monitors the start of the match in step S101 of FIG. 18. Since the KM content 300 is generated in response to key moments in the match, the calculation unit 21 starts processing from step S102 onwards in response to the start of the match.
  • In step S102, the calculation unit 21 monitors the end of the match. If the match is still in progress, the process proceeds to step S103.
  • In step S103, the calculation unit 21 performs a key moment determination.
  • FIG. 19 shows an example of a process for determining a key moment.
  • The calculation unit 21 acquires game progress information in step S150 and performs a play determination, i.e., determines what kind of play is currently occurring. For example, from the pitching and batting results, it determines the type of the individual play, such as a strikeout, walk, hit, double, triple, home run, error, or stolen base, as well as the players involved in that play.
  • In step S151, the calculation unit 21 determines whether or not the recognized play can be evaluated as a common key moment.
  • the common key moment is a moment when a play occurs that can be evaluated as a moving or exciting moment, not limited to an individual user, but generally.
  • an individual key moment which will be described later, is a key moment determined for each individual user according to the attributes of the user.
  • Common key moments can be, for example, plays in scoring scenes, memorable plays such as home runs, strikeouts, and a 2,000th career hit, fine plays, triple plays, and the like.
  • In step S150, not only game progress information but also information regarding the cheers of the spectators in the stadium (volume of cheers, sound of applause, etc.) can be input, and the calculation unit 21 can use the sounds accompanying the plays to determine common key moments.
  • If the play is evaluated as a common key moment, the calculation unit 21 proceeds to step S154 and determines that a key moment has occurred that applies to all users logged in to the match linkage service.
  • If the play is not a common key moment, the calculation unit 21 proceeds to step S152 and checks the user information of the user being processed. For example, it determines the user's favorite team and the player they support. Then, in step S153, the calculation unit 21 determines whether an individual key moment has occurred for the user being processed.
  • If an individual key moment has occurred, the calculation unit 21 proceeds to step S155 and determines that a key moment has occurred for the user being processed.
  • The calculation unit 21 then ends the process of FIG. 19, i.e., the process of step S103 in FIG. 18.
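  • The flow of steps S150 to S155 can be summarized as in the sketch below. The play names, the cheer-volume threshold, and the criterion for an individual key moment are illustrative assumptions.

```python
# Illustrative summary of the key moment determination of FIG. 19
# (steps S150 to S155). Thresholds and play names are assumptions.
COMMON_KEY_PLAYS = {"home run", "strikeout", "fine play", "triple play"}

def determine_key_moment(play: str,
                         cheer_volume_db: float,
                         players_in_play: set[str],
                         favorite_player: str) -> str | None:
    # S150/S151: is this a moment that is exciting for users in general?
    if play in COMMON_KEY_PLAYS or cheer_volume_db > 100.0:
        return "common"          # S154: applies to all logged-in users
    # S152/S153: check the user information of the user being processed
    if favorite_player and favorite_player in players_in_play:
        return "individual"      # S155: key moment for this user only
    return None                  # no key moment for this user
```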
  • If no key moment has occurred, the calculation unit 21 returns from step S104 to step S102 in FIG. 18.
  • If a key moment has occurred, the calculation unit 21 proceeds from step S104 to step S105 and performs a process of generating the KM content 300.
  • the calculation unit 21 creates a CG image according to the content of the target play determined to be a key moment.
  • the calculation unit 21 may generate one type of KM content 300 common to all users for the target play, or may generate multiple types of KM content 300 for the target play for each user or user group.
  • In step S106, the calculation unit 21 notifies the terminal device 5 of the user being processed that the KM content 300 is viewable.
  • For example, the calculation unit 21 controls the terminal device 5 to display the key moment notification screen 121 of FIG. 7. In the case of a common key moment, all logged-in users are notified, and in the case of an individual key moment, only the relevant users among the logged-in users are notified.
  • a user who sees the key moment notification screen 121 can select whether or not to view the KM content 300 by using the view button 124 . If the user does not operate the view button 124, for example, if the user operates the close button 123, the calculation unit 21 returns from step S107 to step S102.
  • In step S108, the calculation unit 21 determines whether the user being processed is a local user. If the user is a local user, i.e., a user watching the game at the stadium, the calculation unit 21 proceeds to step S109 and causes the terminal device 5 to play back the KM content 300 as shown in FIG. 8. In this case, it can be assumed that the user is watching the game in person, so the watch button 310 as shown in FIG. 9 is not displayed.
  • If it is determined from the user's location information that the user is not a local user, the calculation unit 21 proceeds to step S110 and executes playback of the KM content 300. In this case, since it is unclear whether the user is actually watching the game, a watch button 310 as shown in FIG. 9 is displayed.
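  • Steps S101 to S110 described above amount to a per-user monitoring loop, sketched below. The objects match, user, and service and all of their methods are hypothetical stand-ins, not interfaces defined by this description.

```python
# Illustrative per-user flow for FIG. 18 (steps S101 to S110).
# All helper objects and methods are hypothetical stand-ins.
import time

def content_providing_loop(match, user, service, poll_seconds: float = 1.0) -> None:
    while not match.started():                                  # S101
        time.sleep(poll_seconds)
    while not match.finished():                                 # S102
        play = match.latest_play()
        kind = service.determine_key_moment(play, user)         # S103
        if kind is None:                                        # S104: no key moment
            time.sleep(poll_seconds)
            continue
        km_content = service.generate_km_content(play)          # S105
        service.notify(user, km_content)                        # S106: FIG. 7 screen
        if not service.wants_to_view(user):                     # S107: view button
            continue
        if service.is_local_user(user):                         # S108: at the stadium
            service.play(user, km_content, show_watch_button=False)   # S109
        else:
            service.play(user, km_content, show_watch_button=True)    # S110: FIG. 9
```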
  • After the playback of the KM content 300, the calculation unit 21 proceeds to step S120.
  • the KM content 300 is presented to the user as a memento of having been present at the target play (witnessing it directly or indirectly) and having viewed the KM content 300.
  • an edited KM content 300S to be presented to the user is generated according to the user's situation.
  • FIG. 21 shows an example in which the content itself is also changed, as in FIG. 15.
  • the calculation unit 21 sets a spectating icon 301 according to the user's spectating style.
  • the calculation unit 21 changes the content according to the user information. For example, information on the user's seat position and seat area is acquired, and the user's viewpoint position is determined accordingly. Then, the image content of the KM content 300 is changed to an image from the user's viewpoint position. Alternatively, if the user is a fan of a player involved in the play, it is possible to change the image to one that features the player, or to one that looks like the play is seen from the player's viewpoint.
  • FIG. 23 shows an example in which an avatar is used, as in FIG. 16 and FIG. 17.
  • the calculation unit 21 sets a watching icon 301 according to the watching mode of the user.
  • the calculation unit 21 modifies the KM content 300 using the user's avatar. For example, editing is performed by adding an avatar 303 or using the avatar 303 as the image content of the KM content 300.
  • the calculation unit 21 performs a process of adding a spectating icon 301 to the KM content 300 that has been modified using the avatar, and sets the resulting KM content 300S after editing.
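  • The editing variations described above (spectating icon, commemorative icon, viewpoint change, avatar) can be sketched as a single editing function. The dict-based representations of the content and the user information are assumptions made for explanation.

```python
# Illustrative sketch of the editing process that produces the edited KM
# content 300S (cf. FIGS. 20 to 23). Data shapes are assumptions.
def edit_km_content(km_content: dict, user: dict) -> dict:
    edited = {**km_content, "overlays": list(km_content.get("overlays", []))}

    # Spectating icon 301: direct (stadium) vs. indirect (broadcast/streaming) viewing
    if user.get("location") == "stadium":
        edited["overlays"].append("watching_icon:stadium")
    elif user.get("pressed_watch_button"):
        edited["overlays"].append("watching_icon:" + user.get("location", "remote"))

    # Commemorative icon 302: e.g. the match falls on the user's birthday
    match_day = km_content.get("date", "")[-5:]          # "MM-DD"
    if user.get("birthday") and user["birthday"] == match_day:
        edited["overlays"].append("commemorative_icon:birthday")

    # Viewpoint change: render the CG clip from the user's seat area
    if user.get("seat_area"):
        edited["viewpoint"] = user["seat_area"]

    # Avatar: add the user's avatar, or substitute it for a player model
    if user.get("avatar_id"):
        edited["overlays"].append("avatar:" + str(user["avatar_id"]))

    return edited
```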
  • When the match ends, the calculation unit 21 proceeds from step S102 to step S125 and performs a process of granting the edited KM content 300S and the item 203A to the user.
  • This allows the user to obtain the item 203A and the edited KM content 300S related to the play of the key moment that the user witnessed as a spectator.
  • the user can place the acquired item 203A on the locker room screen 140 or view it on a collection screen (not shown).
  • the information processing device 70 serving as the main server 2 in the embodiment includes a calculation unit 21 that performs a determination process for determining whether or not a target play for which KM content is to be generated has occurred during a competitive match, based on match information, which is information related to the match; a generation process for generating KM content 300 related to the target play in accordance with the determination of the occurrence of the target play; and an editing process for generating edited KM content 300S by editing the KM content 300 based on user information related to the user to whom the service is provided.
  • This allows the user to view the KM content 300 related to the play determined to be a key moment.
  • the user can also obtain edited KM content 300S to which content has been added or changed, for example by adding a spectator icon 301.
  • edited KM content 300S that shows the key moment and the user's situation at that time can be obtained.
  • This allows the user to obtain memorable edited KM content 300S that reflects the user's situation as well as the content of the play that moved or excited the user during the game, providing the user with a new way to enjoy watching the game.
  • the determination process, generation process, and editing process may be shared among a plurality of information processing devices.
  • Upon determining that a target play has occurred, the calculation unit 21 generates KM content 300 related to the target play based on match information, and performs processing to notify the terminal device 5 to which the service is provided that the content can be viewed (see FIG. 7 and step S106 in FIG. 18).
  • This allows the user to know the existence of the KM content 300 and view it during or after the match when the KM content 300 is generated for a play determined to be a key moment. For example, the user is notified immediately after a key moment, knows the existence of the KM content 300 for that play, and can enjoy the content about the special moment he or she just watched.
  • the calculation unit 21 generates an item 203A corresponding to the edited KM content 300S to which content has been added or changed based on user information regarding the user to whom the service is provided, and performs a process of associating the edited KM content 300S and the item 203A with the user (see step S125 in Figure 18).
  • This allows the user to acquire the edited KM content 300S and the item 203A.
  • the user can acquire the one and only edited KM content 300S and the item 203A that express his/her own situation and key moment, and use them as mementos of the game he/she watched.
  • the calculation unit 21 performs processing for playing back the edited KM content 300S by operating the item 203A (see FIG. 24). For example, by performing an operation to designate an item 203A, the edited KM content 300S corresponding to that item 203A is played on the terminal device 5. In this way, a user who has acquired the edited KM content 300S and the item 203A can view the edited KM content 300S at any time and recall key moments in a match or his/her own situation.
  • the calculation unit 21 generates the KM content 300 that expresses the movement of a person (player or referee) or a playing object related to a target play in the process of generating the KM content 300.
  • human movements related to the target play include a player's pitching form, batting form, and base running, and an umpire's gestures.
  • a possible example of the movement of a playing object is the trajectory of a ball or the like.
  • the calculation unit 21 generates the KM content 300 as a CG animation expressing these movements based on tracking data of the players, ball, etc.
  • the KM content 300 can be generated as a video, still image, slide show, etc. using captured images of the movements.
  • the KM content 300 may be generated by combining captured images and CG. This makes it possible to provide the user with the KM content 300 that dynamically expresses the target play.
  • for animals participating in a race, such as horses in a horse race, it is also possible to generate content that depicts the movements of the animals.
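  • As a rough sketch of how tracking data could drive such content, the following converts per-frame tracking samples into animation keyframes; the data layout is an assumption for illustration and is not the CG pipeline of the embodiment.

# Illustrative only: turning EPTS-style tracking samples into keyframes that a
# CG renderer could animate. Field names are assumptions.

def tracking_to_keyframes(samples, fps=30.0):
    keyframes = []
    for i, sample in enumerate(samples):
        keyframes.append({
            "t": i / fps,                          # keyframe time in seconds
            "ball_pos": sample["ball_xyz"],        # tracked ball position
            "player_pose": sample.get("skeleton"), # skeletal capture data, if available
        })
    return keyframes

samples = [
    {"ball_xyz": (0.0, 1.8, 18.4), "skeleton": "pose-0"},
    {"ball_xyz": (0.0, 1.7, 12.0), "skeleton": "pose-1"},
    {"ball_xyz": (0.1, 1.5, 5.5),  "skeleton": "pose-2"},
]
print(tracking_to_keyframes(samples))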
  • the calculation unit 21 performs editing processing on the KM content 300 to add an image representing that the user has been watching the target play to the content.
  • a spectator icon 301 indicating that the user has been watching the play is added to the KM content 300. This allows the user to recognize that the play was what he or she saw in real time when viewing the edited KM content 300S later.
  • ownership of the edited KM content 300S serves as a record and proof of the fact that the user witnessed a memorable play, making the edited KM content 300S valuable digital content for the user.
  • the calculation unit 21 performs editing processing on the KM content 300 to add an image indicating whether the user was watching the target play directly or indirectly to the content as an image representing that the user was watching the target play (see Figures 10 and 12).
  • a spectator icon 301 with a different design is added to the KM content 300 depending on whether the user is watching the game directly at the stadium or indirectly via television broadcasting, network distribution, etc.
  • the edited KM content 300S can be a record of the viewing style in which the user watched the game.
  • the calculation unit 21 performs editing processing on the KM content 300 to add an image to the content representing that the user was watching the target play based on the user's location information or in response to the user performing a specified operation related to the content (see Figure 9).
  • the spectator icon 301 is added to the KM content 300. This makes it possible to generate edited KM content 300S that indicates that the user was watching the game.
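  • One conceivable way to make this decision is sketched below: if location information places the user near the stadium, a direct-viewing icon is chosen, and if the user instead performed the specified operation, an indirect-viewing icon is chosen. The threshold, coordinates, and icon names are assumptions for illustration only.

# Sketch under assumptions: selecting a spectator icon from location information
# or from an explicit user operation. Not the actual decision logic of the embodiment.

import math

def distance_m(a, b):
    # Rough planar approximation of the distance between two (lat, lon) points.
    dx = (a[0] - b[0]) * 111_000.0
    dy = (a[1] - b[1]) * 111_000.0 * math.cos(math.radians(a[0]))
    return math.hypot(dx, dy)

def choose_spectator_icon(user_location, stadium_location, declared_watching):
    if user_location is not None and distance_m(user_location, stadium_location) < 500.0:
        return "spectator_icon_direct"     # watching at the stadium
    if declared_watching:
        return "spectator_icon_indirect"   # watching via broadcast or distribution
    return None                            # no icon is added

print(choose_spectator_icon((35.6852, 139.7528), (35.6852, 139.7520), False))
print(choose_spectator_icon(None, (35.6852, 139.7520), True))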
  • the calculation unit 21 performs editing processing on the KM content 300 to add an image representing user information related to the date and time of occurrence of the target play to the content based on user information (see Figure 13). For example, if the day on which the target play occurred is the user's birthday, a commemorative icon 302 indicating the birthday is added to the KM content 300. In this way, the edited KM content 300S can record the date and time on which the user witnessed the target play. For example, it can also evoke memories of watching a game for the user.
  • the calculation unit 21 performs editing processing on the KM content 300 to add an image representing the viewing location of the target play to the content based on the user information (see FIG. 14).
  • a spectator icon 301 that indicates where the user was watching the game when the target play occurred is added to the KM content 300. For example, if the game was watched at a stadium, an image showing the seat position, seat area, etc. is added. Even if the game was watched via broadcast or distribution, an image that shows, for example, watching the game at home, watching the game at a public viewing, or watching the game at a specific establishment such as a sports bar is added. In this way, the edited KM content 300S can record where the user himself was watching the game. For example, it can also be used to evoke memories of watching the game in the user's mind.
  • the calculation unit 21 performs editing processing on the KM content 300 to add an image representing the user to the content (see FIG. 16).
  • for example, an image representing the user, such as the user's avatar 303 or a photographic image, is added to the KM content 300.
  • in addition, the calculation unit 21 may perform editing processing on the KM content 300 to change the content details based on information about the user's viewing location. For example, edited KM content 300S is generated by changing an image of a play in the KM content 300 into an image with a viewpoint set according to the seat position of the user. This allows the user to save the play that he or she actually witnessed as edited KM content 300S with an image from the same viewpoint.
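  • A seat-dependent viewpoint could be derived as in the following sketch, where a virtual camera is placed at the coordinates registered for the seat and aimed at a fixed target point; the seat table, target point, and field of view are assumptions for illustration, not data of the embodiment.

# Hypothetical sketch: deriving a virtual-camera viewpoint from the user's seat position.

SEAT_COORDINATES = {
    # seat id -> (x, y, z) position in an assumed stadium coordinate system
    "A-12-034": (-40.0, 15.0, 8.0),
    "B-03-101": (25.0, 60.0, 12.0),
}
LOOK_AT_POINT = (0.0, 0.0, 0.0)  # e.g. around home plate in this illustration

def viewpoint_for_seat(seat_id):
    return {
        "camera_position": SEAT_COORDINATES[seat_id],
        "look_at": LOOK_AT_POINT,
        "fov_deg": 60,
    }

print(viewpoint_for_seat("A-12-034"))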
  • the calculation unit 21 performs editing processing on the KM content 300 to change the content content so as to replace an image of a player in an image representing a target play with an image representing the user himself (see Figure 17).
  • an image of a player in a play in the KM content 300 is replaced with the user's avatar 303. This allows the user to obtain edited KM content 300S in which the user himself/herself appears to be participating in the play.
  • the various editing processes described in the embodiment can be combined.
  • the content change based on the viewpoint position of the KM content 300, the addition of the spectator icon 301, the addition of the commemorative icon 302, the addition of the avatar 303, and the content change based on the avatar may all be performed, or some of them may be combined. It is also possible to add a plurality of types of commemorative icons 302 or to add a user's own icon.
  • the content change is not limited to a change in viewpoint or avatar, but may also be a change in the expression of the play itself.
  • the calculation unit 21 determines, in the key moment determination process, a specific type of play determined based on the match information as a target play (see step S103 in FIG. 18 and FIG. 19).
  • the type of target play that will trigger the generation of the KM content 300 is set according to the game. In baseball, for example, a home run, a hit, a stolen base, other scoring plays, a strikeout, a double play, a triple play, a fine play, etc. are specific types of plays.
  • the calculation unit 21 then generates the KM content 300 by detecting these plays from the game progress information. This makes it possible to detect appropriate key moments and generate the KM content 300.
  • the calculation unit 21 determines the occurrence of a target play based on the match information and user information in the key moment determination process (see FIG. 19 ). For example, a specific player or team that the user supports is determined as user information. When a play of that player is detected from the game progress information, it is determined as a target play, or when a specific type of play of the user's supported team is detected, it is determined as a target play. This makes it possible to pick up personal key moments for the user and generate KM content 300 (300S) accordingly. The edited KM content 300S will be filled with more memories of watching the game for the user.
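  • The combination of match information and user information in this determination could look like the following sketch; the field names and the set of play types are assumptions used only to make the idea concrete.

# Sketch (assumed names): determining a personal key moment from a play event in the
# match information combined with the user's supported team and players.

PERSONAL_TARGET_TYPES = {"home_run", "stolen_base", "fine_play"}

def is_personal_key_moment(play_event, user_info):
    if play_event.get("player") in user_info.get("favorite_players", []):
        return True
    return (play_event.get("team") == user_info.get("favorite_team")
            and play_event.get("type") in PERSONAL_TARGET_TYPES)

user = {"favorite_team": "Team X", "favorite_players": ["Player 7"]}
print(is_personal_key_moment({"team": "Team Y", "player": "Player 7", "type": "strikeout"}, user))
print(is_personal_key_moment({"team": "Team X", "player": "Player 3", "type": "home_run"}, user))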
  • the game content is such that the movement of a ball within a game space is set based on tracking data of the ball in a baseball or softball game.
  • the movement of the ball in the game content is expressed based on tracking data of the ball as pitched by the pitcher, tracking data of the batted ball, etc. This makes it possible to provide game content that allows the user to experience the actual ball movement in baseball or softball.
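  • As an illustration of the idea, the sketch below maps a tracked real-world ball trajectory (in metres) into game-space coordinates; the scale factor and data layout are assumptions, not the game engine of the embodiment.

# Illustrative sketch: converting tracked ball positions into a game-space trajectory.

GAME_UNITS_PER_METRE = 10.0  # assumed scale between real space and game space

def to_game_trajectory(tracked_positions):
    return [tuple(coord * GAME_UNITS_PER_METRE for coord in point)
            for point in tracked_positions]

pitch_track = [(0.0, 1.8, 18.44), (0.0, 1.6, 9.2), (0.05, 1.2, 0.0)]
print(to_game_trajectory(pitch_track))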
  • the match linkage service has been explained above using the example of linking with a baseball game.
  • the match linkage service can be applied to a variety of sports. Examples include soccer, American football, rugby, basketball, volleyball, tennis, and golf. Ball games are particularly suitable as sports that use the EPTS system. Of course, it can also be used for sports other than ball games. For example, it can be used for gymnastics, skating, track and field, swimming, judo, kendo, boxing, fencing, etc.
  • the judgment of a play as a key moment, the contents of the KM content 300, the edited KM content 300S, and the contents of the item 203A may be considered according to the sport.
  • in soccer, for example, a goal scene, a free kick scene, etc. are set as key moments.
  • Item 203A may be a soccer ball for soccer, a boxing glove for boxing, or a racket for tennis. Match information is also anticipated for each sport.
  • in the case of soccer, for example, the stats data is as follows:
 ・Goals: the number of goals scored in the season
 ・Shots: the number of shots taken
 ・Shots on target: the number of shots on goal
 ・Last passes: the number of passes that resulted in goal-scoring opportunities
 ・One-touch passes: the number of passes made with one touch
 ・Aerial battles: the number of contests for the ball in the air
 ・Tackles: the number of attempts to retrieve the ball from the ball carrier
 ・Distance traveled: the distance traveled during the match for each player
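  • The stats listed above could be held, for example, in a simple per-player record such as the following sketch; the field names are illustrative and do not define a schema of the embodiment.

# A possible container for per-player soccer stats data (illustrative only).

from dataclasses import dataclass

@dataclass
class SoccerPlayerStats:
    goals: int = 0
    shots: int = 0
    shots_on_target: int = 0
    last_passes: int = 0
    one_touch_passes: int = 0
    aerial_battles: int = 0
    tackles: int = 0
    distance_traveled_km: float = 0.0

print(SoccerPlayerStats(goals=2, shots=5, shots_on_target=3, distance_traveled_km=10.8))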
  • key moments include goals, corner kicks, free kicks, penalty kicks, goalkeeper saves, and contact plays.
  • the processing has been mainly described as that of the calculation unit 21 of the main server 2, but all or part of the processing of the calculation unit 21 may be executed by a CPU on the terminal device 5.
  • the match linkage service function 40 of the calculation unit 21 may be installed in the terminal device 5 as an application program, and processing by the item/content generation unit 45, locker processing unit 47, game processing unit 48, etc. in FIG. 3 may be performed on the terminal device 5.
  • the terminal device 5 may be equipped with other functions, such as a match information acquisition unit 41, a user information acquisition unit 42, a match analysis unit 43, a quiz generation unit 44, a memory control unit 46, etc., and processing by these functions may be performed by the terminal device 5.
  • the program of the embodiment is a program that causes, for example, a CPU, a DSP (digital signal processor), an AI processor, or an information processing device 70 including these to execute the processes shown in FIGS.
  • the program of the embodiment is a program that causes an information processing device to execute a determination process for determining whether or not a target play that is to be the subject of generation of KM content 300 has occurred during a competitive match, based on match information, which is information about the match; a generation process for generating KM content 300 related to the target play in accordance with the determination of the occurrence of the target play; and an editing process for generating edited KM content 300S in which content has been added to or changed based on user information about the user to whom the service is provided.
  • the information processing device 70 constituting the information provision system 1 of the embodiment can be realized, for example, in a computer device, a mobile terminal device, or other device capable of executing information processing.
  • Such a program can be recorded in advance in an HDD serving as a recording medium built into a device such as a computer device, or in a ROM within a microcomputer having a CPU.
  • the program may be temporarily or permanently stored (recorded) on a removable recording medium such as a flexible disk, a CD-ROM (Compact Disc Read Only Memory), an MO (Magneto Optical) disk, a DVD (Digital Versatile Disc), a Blu-ray Disc (registered trademark), a magnetic disk, a semiconductor memory, a memory card, etc.
  • a removable recording medium may be provided as so-called package software.
  • such a program can be installed in a personal computer or the like from a removable recording medium, or can be downloaded from a download site via a network such as a LAN (Local Area Network) or the Internet.
  • Such a program is suitable for providing the information processing device 70 constituting the information provision system 1 of the embodiment in a wide range of forms.
  • For example, by downloading the program to mobile terminal devices such as smartphones and tablets, imaging devices, mobile phones, personal computers, game devices, video devices, PDAs (Personal Digital Assistants), and the like, these devices can function as the information processing device 70 constituting the information provision system 1 of the present disclosure.
  • the present technology can also be configured as follows.
(1) An information processing device comprising a calculation unit that performs: a determination process for determining whether or not a target play for which content is to be generated has occurred during a competitive match, based on match information that is information related to the match; a generation process for generating content related to the target play in response to a determination of the occurrence of the target play; and an editing process for generating edited content by editing the content based on user information related to a user who is a service recipient.
(2) The information processing device according to (1) above, in which, in response to determining that the target play has occurred, the calculation unit generates content related to the target play based on the match information and notifies the service destination device that the content can be viewed.
(3) The information processing device according to (1) or (2) above, in which the calculation unit generates a digital item corresponding to the edited content, to which content has been added or changed based on the user information regarding the user to whom the service is provided, and performs a process of associating the edited content and the digital item with the user.
(4) The information processing device according to (3) above, in which the calculation unit further performs a process of playing back the edited content in response to an operation on the digital item.
(5) The information processing device according to any one of (1) to (4) above, in which, in the generation process, the calculation unit generates content expressing a movement of a person, an animal, or a playing object related to the target play.
(6) The information processing device according to any one of (1) to (5) above, in which, in the editing process, the calculation unit edits the content by adding an image representing that the user has been watching the target play.
(7) The information processing device according to any one of (1) to (6) above, in which, in the editing process, the calculation unit adds to the content, as the image representing that the user was watching the target play, an image indicating whether the user was watching the target play directly or indirectly.
(8) The information processing device according to any one of (1) to (7) above, in which, in the editing process, the calculation unit adds the image representing that the user was watching the target play based on the user's location information or in response to the user performing a specified operation related to the content.
(9) The information processing device according to any one of (1) to (8) above, in which, in the editing process, the calculation unit adds to the content, based on the user information, an image expressing user information related to the date and time of occurrence of the target play.
(10) The information processing device according to any one of (1) to (9) above, in which, in the editing process, the calculation unit adds to the content, based on the user information, an image representing the viewing location of the target play.
(11) The information processing device according to any one of (1) to (10) above, in which, in the editing process, the calculation unit adds an image representing the user to the content.
(12) The information processing device according to any one of (1) to (11) above, in which, in the editing process, the calculation unit changes content details based on information about the user's viewing location.
(13) The information processing device according to any one of (1) to (12) above, in which, in the editing process, the calculation unit changes content details so as to replace an image of a player in an image representing the target play with an image representing the user himself/herself.
(14) The information processing device according to any one of (1) to (13) above, in which, in the determination process, a specific type of play determined based on the match information is determined to be the target play.
(15) The information processing device according to any one of (1) to (14) above, in which, in the determination process, the occurrence of the target play is determined based on the match information and the user information.
(16) An information processing method in which an information processing device performs: a determination process for determining whether or not a target play for which content is to be generated has occurred during a competitive match, based on match information that is information related to the match; a generation process for generating content related to the target play in response to a determination of the occurrence of the target play; and an editing process for generating edited content by editing the content based on user information related to a user who is a service recipient.

Landscapes

  • Business, Economics & Management (AREA)
  • Engineering & Computer Science (AREA)
  • Tourism & Hospitality (AREA)
  • Multimedia (AREA)
  • General Health & Medical Sciences (AREA)
  • Human Resources & Organizations (AREA)
  • Marketing (AREA)
  • Primary Health Care (AREA)
  • Strategic Management (AREA)
  • Physics & Mathematics (AREA)
  • General Business, Economics & Management (AREA)
  • General Physics & Mathematics (AREA)
  • Economics (AREA)
  • Theoretical Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

This information processing device is provided with a calculation unit that performs: a determination process for determining whether or not a target play for which content is to be generated has occurred on the basis of game information, which is information relating to a game, during a competition game; a generation process for generating content relating to the target play in accordance with the determination of occurrence of the target play; and an editing process for generating edited content that is obtained by editing the generated content on the basis of user information pertaining to a user to whom a service is to be provided.

Description

Information processing device, information processing method, and program

This technology relates to information processing devices, information processing methods, and programs, and in particular to services that provide content related to actual matches.

In recent years, there have been services that provide various kinds of information during sports games such as baseball, soccer, and basketball.
For example, Patent Document 1 describes a system that acquires information on the progress of a match and the details of plays at the match venue in real time, centrally manages it at a center, and provides it to third parties.
Furthermore, Patent Document 2 describes a communication processing system capable of sequentially reporting match information of a plurality of matches held in a plurality of stadiums.

JP 2002-268806 A; JP H8-299525 A

Many fans visit stadiums as spectators to watch baseball, soccer, basketball and other sports matches. Many also watch live broadcasts on television. In order to develop sports culture, it is desirable to increase spectator satisfaction and broaden the enjoyment of these types of matches.

This disclosure therefore proposes technology for providing services that can enhance the enjoyment of watching a match for spectators.

An information processing device relating to the present technology includes a calculation unit that performs a determination process for determining whether or not a target play for which content is to be generated has occurred during a competitive match, based on match information, which is information related to the match; a generation process for generating content related to the target play in accordance with the determination of the occurrence of the target play; and an editing process for generating edited content by editing the content based on user information related to the user to whom the service is provided.
For example, during a game of baseball, soccer, or other sport, impressive plays or memorable plays are determined as plays for which content is to be generated. Content is generated in response to the occurrence of such plays. In addition, edited content is generated in which content is added or changed based on user information of the service provider.

FIG. 1 is an explanatory diagram of an information providing system according to an embodiment of the present technology.
FIG. 2 is a block diagram of the information providing system according to the embodiment.
FIG. 3 is an explanatory diagram of a functional configuration of the information providing system according to the embodiment.
FIG. 4 is a block diagram of an information processing apparatus according to the embodiment.
FIG. 5 is an explanatory diagram of an example screen of the match linkage service according to the embodiment.
FIG. 6 is an explanatory diagram of a locker room screen according to the embodiment.
FIG. 7 is an explanatory diagram of a key moment notification screen according to the embodiment.
FIG. 8 is an explanatory diagram of key moment content according to the embodiment.
FIG. 9 is an explanatory diagram of key moment content according to the embodiment.
FIG. 10 is an explanatory diagram of edited key moment content according to the embodiment.
FIG. 11 is an explanatory diagram of an item corresponding to content according to the embodiment.
FIG. 12 is an explanatory diagram of edited key moment content according to the embodiment.
FIG. 13 is an explanatory diagram of edited key moment content according to the embodiment.
FIG. 14 is an explanatory diagram of edited key moment content according to the embodiment.
FIG. 15 is an explanatory diagram of edited key moment content according to the embodiment.
FIG. 16 is an explanatory diagram of edited key moment content according to the embodiment.
FIG. 17 is an explanatory diagram of edited key moment content according to the embodiment.
FIG. 18 is a flowchart of an example of a content providing process according to the embodiment.
FIG. 19 is a flowchart of key moment determination according to the embodiment.
FIG. 20 is a flowchart of an example of a content editing process according to the embodiment.
FIG. 21 is a flowchart of an example of a content editing process according to the embodiment.
FIG. 22 is a flowchart of an example of a content editing process according to the embodiment.
FIG. 23 is a flowchart of an example of a content editing process according to the embodiment.
FIG. 24 is a flowchart of an example of a content playback process according to the embodiment.

The embodiments will be described below in the following order.
<1. System configuration>
<2. Screen and content examples of the match linkage service>
<3. Processing example>
<4. Summary and modifications>

In this disclosure, "image" refers to both moving images and still images. For example, the imaging device 10 shown in Fig. 1 is assumed to mainly capture moving images, but the image displayed on the terminal device 5 may be a moving image or a still image. Furthermore, "image" refers to an image that is actually displayed on a screen, but "image" in the signal processing process and transmission path until it is displayed on the screen refers to image data.
In the service of the embodiment, contents and digital items are mainly provided as images, which may be still images, videos, pseudo-videos made up of a small number of still images, etc. These images may or may not include audio.

<1. System configuration>
FIG. 1 shows an overview of an information providing system 1 according to an embodiment.
The information providing system 1 of FIG. 1 includes an imaging device 10, a main server 2, a score data server 3, a sensor 4, and a terminal device 5. These are connected to each other via wired or wireless communication, or network communication, etc.

This information provision system 1 is a system that realizes a service (hereinafter referred to as a "match-linked service") that provides a virtual space, digital items, and various contents linked to a match to a user having a terminal device 5. In this match-linked service, users are assumed to be fans of teams or players, and when these users watch a match, for example, the virtual space, image content, game content, digital items, and the like that are generated or updated according to the match, or according to the match or user are provided to the terminal device 5.
This allows users to enjoy the entertainment provided by the match-linked services in addition to the entertainment of watching the match itself.

The multiple imaging devices 10 in the information provision system 1 capture images of a subject area of a sports field such as a baseball stadium, for example, the state of a game, from various positions. Although multiple imaging devices 10 are shown, it is sufficient that at least one imaging device 10 is provided.
In the case of baseball, for example, the imaging device 10 is used to capture images for analyzing the movements of the pitcher, fielders, batters, and runners, and for determining the trajectory, type of ball, speed, rotation, and the like.

In the information provision system 1, skeletal capture data of the subject player is extracted from the image captured by the imaging device 10, and the player's movements, such as pitching movements, batting movements, and base running movements, can be determined based on the skeletal capture data. In addition to players, the movements of playing objects such as balls and bats may also be determined.

In recent years, for sports such as soccer and basketball, a technology known as EPTS (Electronic Performance and Tracking Systems) has come into use; it estimates the postures and positions of players and referees, as well as the position and rotation of the ball, on a specified field, from images taken by specially installed cameras and from information from sensors (acceleration sensors and GPS sensors) attached to the people (players) and objects (balls) involved in the sport.

Specifically, the image capturing device 10 captures images for obtaining such EPTS data as skeletal capture data.
Moreover, the images captured by the imaging device 10 can also be used as actual images of a match or the like.

The EPTS data generated based on images captured by the imaging device 10 is transmitted to the main server 2. For example, if an information processing device (not shown) that records images captured by a plurality of imaging devices 10 and generates EPTS data is provided at a sports stadium such as a baseball stadium, the EPTS data generated by the information processing device is transmitted to the main server 2.
Alternatively, the captured image obtained by the imaging device 10 may be transmitted to the main server 2, and the main server 2 may generate the EPTS data.

Sensor 4 is a sensor that detects the movements of players, the ball, etc. Specifically, it is assumed that this is a sensor attached to the player or the ball, such as an acceleration sensor, gyro sensor, GPS (Global Positioning System) sensor, altitude sensor, or impact sensor. For example, it may be a wearable sensor attached to the player's uniform or body, or it may be a sensor built into the ball. It may also be a radar, etc.

Information on the movement of the player can also be obtained from the sensing data SS by the sensor 4. Alternatively, the sensing data SS can be used as an auxiliary when obtaining skeletal capture data from an image or when estimating posture, etc. The sensing data SS by the sensor 4 can also be used for the speed, number of rotations, and trajectory of the ball, and the swing speed and trajectory of the bat, etc.
The sensing data SS may be transmitted to the main server 2, or may be input to an information processing device (not shown) that generates EPTS data at a sports stadium such as a baseball field.

The score data server 3 is an information processing device of an organization that generates and distributes, for example, baseball score data SD. Here, the score data server 3 is a server device that can transmit the score data SD of a game to the main server 2.
The score data server 3 may transmit the score data SD to the main server 2 successively during a match, or may transmit all of the score data SD for that match all at once, for example, after the match.

In the case of baseball, the score data SD includes various information about the game, such as:
For example, it includes not only information about the entire game, such as the score for each inning of the game and information about the players participating, but also information about each pitch thrown by the pitcher. For example, it includes information about each pitch, such as the time of the pitch, pitcher's name, batter's name, whether the batter is left-handed or right-handed, whether there are runners on first/second/third base, runners' names, strike/ball count, ball/strike, out count, batting result (hit, strikeout, missed, foul, grounder out, fly out, foul fly, etc.), ball speed, type of ball, number of rotations, axis of rotation, course, and other various information.
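As a concrete illustration, one per-pitch record in the score data SD might look like the following sketch; the keys and values are examples chosen here and do not define the actual data format of the score data server 3.

# Hedged example of one per-pitch record in the score data SD (keys are assumptions).

pitch_record = {
    "pitch_time": "19:42:10",
    "pitcher": "Pitcher A",
    "batter": "Batter B",
    "batter_side": "L",
    "runners": {"1B": None, "2B": "Runner C", "3B": None},
    "count": {"balls": 2, "strikes": 1, "outs": 1},
    "call": "strike",
    "batting_result": "foul",
    "speed_kmh": 148,
    "pitch_type": "slider",
    "spin_rpm": 2350,
    "course": "low-outside",
}
print(pitch_record["pitch_type"], pitch_record["speed_kmh"], "km/h")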

The terminal device 5 is, for example, an information processing device such as a smartphone, tablet terminal, or personal computer, but as described above, it is assumed that this terminal device 5 is a device carried by a general user. The display unit 5a of the terminal device 5 displays key moment content (described below), a virtual space such as the locker room screen 140, and images including digital items through the match linkage service.

The main server 2 performs various processes for presenting information on the terminal device 5 as the match linkage service. For example, it performs processes such as determining the status of the match and players based on skeletal capture data of the subject generated from the image captured by the imaging device 10, and providing the terminal device 5 with key moment content, digital items, and images of the virtual space accordingly.

This main server 2 can be assumed to be an information processing device that performs cloud computing, i.e., a cloud server.

However, the processing for displaying images of the match linkage service on the terminal device 5 may be performed by an information processing device other than the cloud server. For example, it is conceivable that an information processing device such as a personal computer installed at the match venue has the function of the main server 2 and performs processes such as acquiring skeletal capture data and generating images to be provided. Furthermore, it is conceivable that the terminal device 5 also has the function of the main server 2 and performs processes such as acquiring skeletal capture data and generating images.

FIG. 2 shows the configuration of the main server 2 and an example of inputs and outputs related to the main server 2 in the information providing system 1 of FIG. 1.
A portion that supplies data to the main server 2 is indicated as a data server 30 .

Note that the data server 30 in FIG. 2 is shown as a conceptual configuration, and collectively represents the configuration that the main server 2 uses as an information source for executing the match linkage service. For example, the data server 30 collectively refers to the imaging device 10, the score data server 3, the sensor 4, the recording unit 11, and the EPTS data generation unit 12.

The imaging devices 10 and the sensors 4 are arranged at the stadium and on the players and playing objects, as shown in FIG. 1. The score data server 3 may be installed at any place.
Either or both of the recording unit 11 and the EPTS data generation unit 12 may be realized by an information processing device located at the stadium, or either or both may be provided on the main server 2 side.

The imaging devices 10 are configured as digital camera devices having imaging elements such as CCD (Charge Coupled Devices) sensors or CMOS (Complementary Metal-Oxide-Semiconductor) sensors, and obtain captured images as digital data. In this example, each imaging device 10 obtains captured images as video.

For example, in a baseball field, each imaging device 10 is disposed at a predetermined position to capture images of the pitcher, fielders, batters, runners, etc. The number of imaging devices 10 is not particularly specified and is one or more, but in order to generate highly accurate EPTS data, it is advantageous to have as many imaging devices 10 as possible.
The imaging devices 10 capture images as moving images in a synchronized state, and output the captured images.

The recording unit 11 records the images (image data VD) captured by the multiple imaging devices 10, and also supplies each captured image to the EPTS data generation unit 12.

The EPTS data generator 12 performs an analysis process on one or more captured images, generates EPTS data for each image, and then integrates all the individual EPTS data to generate EPTS data as a whole. The EPTS data includes, for example, the positions of players (pitcher, fielder, batter, runner) and the ball at each frame timing, player skeletal capture data and the resulting player posture, and information on the number of rotations and direction of rotation (axis of rotation) of the ball.
In addition, the EPTS data generation unit 12 may generate EPTS data using not only the captured images but also sensing data SS obtained by the sensor 4, such as information from an acceleration sensor embedded in the ball or a GPS sensor attached to the player's uniform.

This EPTS data generation unit 12 can generate, as EPTS data for the entire game, information that can be used to determine, for each of the pitcher's pitches during the game, the pitching motion, the ball speed and trajectory, the batter's behavior in response to the pitch, the batting result, the base running result, the speed and trajectory of the bat, and the like.
Thus, EPTS data is an example of behavioral information about either the people (pitchers, batters, runners, etc.) or playing objects (balls, bats, etc.) associated with a game.
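As a hedged illustration, a single frame of such EPTS data could be organised as in the following sketch; the structure and values are assumptions made here and are not a specification of the EPTS data generation unit 12.

# Illustrative layout of one EPTS data frame (positions, skeletal capture data, ball spin).

epts_frame = {
    "frame_time": 12.533,  # seconds from the start of the play
    "players": {
        "pitcher_18": {"xyz": (0.0, 0.3, 18.44),
                       "skeleton": {"head": (0.0, 1.9, 18.5), "right_wrist": (0.3, 1.4, 18.2)}},
        "batter_7": {"xyz": (0.4, 0.0, 0.0),
                     "skeleton": {"head": (0.4, 1.7, 0.0), "right_wrist": (0.7, 1.0, 0.1)}},
    },
    "ball": {"xyz": (0.1, 1.6, 10.2), "spin_rpm": 2300, "spin_axis": (0.1, 0.9, 0.2)},
}
print(epts_frame["ball"]["spin_rpm"], "rpm")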

The EPTS data generation unit 12 can generate EPTS data from multiple captured images obtained by multiple imaging devices 10, and can also generate EPTS data from multiple captured images obtained by one imaging device 10. Furthermore, the EPTS data generation unit 12 can generate EPTS data from multiple images and sensing data SS from one or multiple sensors, and can also generate EPTS data from one captured image and sensing data SS from one sensor.

The EPTS data (skeleton capture data) generated by the EPTS data generation unit 12 is transmitted to the main server 2 .
Image data VD captured by the imaging device 10 is also transmitted to the main server 2 via, for example, an EPTS data generating unit 12 .
The sensing data SS may also be transmitted to the main server 2 and used directly for processing by the main server 2 .

The EPTS data generation unit 12 may be provided in the main server 2. In that case, for example, the captured image (image data VD) by the imaging device 10 and the sensing data SS of the sensor 4 may be transmitted to the EPTS data generation unit 12 in the main server 2 via network communication or the like.

As explained with reference to FIG. 1, the score data server 3 sequentially transmits the score data SD to the main server 2.

The main server 2 is configured as an information processing device such as a computer device, and is provided with the functions of a calculation unit 21 and a presentation control unit 22 implemented by a microprocessor or the like, and with a storage 23 using a storage medium accessible by the microprocessor or the like.

The calculation unit 21 acquires information from the data server 30 and executes various processes for running the match linkage service using the acquired information, such as data acquisition, match analysis, and generation of virtual space images, digital items, and content. The processing functions of the calculation unit 21 will be described later with reference to FIG. 3.

The storage 23 represents a storage medium in the main server 2 for storing information such as received image data VD, skeleton capture data BD, score data SD, etc., and for storing various data generated by the calculation unit 21.
The storage 23 may be an internal storage medium of the main server 2 , or may be a storage medium separate from the information processing device that constitutes the main server 2 .

The presentation control unit 22 controls the display of the various images generated by the processing of the calculation unit 21 on the terminal device 5 in a predetermined manner. Specifically, the presentation control unit 22 performs processing to display images of virtual spaces, digital items, etc., generated by the calculation unit 21 during execution of the match linkage service on the display unit 5a of the terminal device 5.

The calculation unit 21, presentation control unit 22, and storage 23 described above may be provided in one information processing device, or may be provided separately in multiple information processing devices. There are also various possible ways to realize the storage 23 as a storage medium.

Such images of the match linkage service or data for generating images processed by the main server 2 are transmitted to the terminal device 5 and displayed on the display unit 5a. This allows a user who owns the terminal device 5 to visually recognize images provided by the match linkage service.
Furthermore, information on user interaction during the provision of the match linkage service may be input to the calculation unit 21 from the terminal device 5 .

FIG. 3 illustrates the information transmitted from the data server 30 to the calculation unit 21, and the processing functions of the calculation unit 21. In particular, this figure illustrates the processing functions of the match linkage service function (game linked service function) 40, and the processing functions of a shared service that allows results of the match linkage service to be exhibited.

In addition, the calculation unit 21 in this embodiment has the processing functions of the match linkage service function 40. As shown in the figure, the calculation unit 21 may also have the processing functions of the shared service function 50, but the shared service function 50 may instead be provided by another server device.

The information provided to the calculation unit 21 by the data server 30 described above can be summarized as follows, as shown in the figure: EPTS data, sensing data SS, image data VD, stats data, historical data, analysis data, etc.

The stats data and historical data are score data SD from the score data server 3.
For example, stats data is information on the performance of a player or team. Examples of stats data for a baseball pitcher include the team name, league, number of wins, number of losses, earned run average, number of games pitched, number of complete games, number of shutouts, number of saves, number of save opportunities, number of innings pitched, number of runs allowed, number of home runs allowed, number of hit by pitch, number of walks allowed, number of intentional walks, number of strikeouts, hit rate, number of runners allowed per inning pitched (WHIP: Walks plus Hits per Inning Pitched), and ground ball/fly out ratio.
In addition to the past performance of the team and the players, the stats data also includes information on the performance of the team and each player up to the present time of the game currently in progress, that is, detailed information on the progress of the game, information on the registered players, information on the starting members, information on substitute players, etc. Information on the progress of the game also includes information on each play, such as pitching details, batting results, defense, and base running.

Historical data is also information about the performance of teams and players, and is included in stats data in a broad sense, but for the sake of explanation here it is shown as information such as past best records, team records, and individual records. For example, it includes information on records that may be achieved in the current game, such as the record number of strikeouts in a single game, records of consecutive at-bats with a hit, records of consecutive games played, and records of the number of team hits in a single game. Such historical data is information that can be referenced for editing special digital items in the match linkage service.

The analysis data indicates data determined by the function of the data analysis unit 32 regarding the EPTS data, the sensing data SS, and the image data VD.
The function of the data analysis unit 32 may be provided in the main server 2, or may be realized by another information processing device.

Specific examples of the analysis data include a specific action of a player, a trajectory of a ball, etc., determined from EPTS data, sensing data SS, etc. For example, tracking data of the movement of players and playing objects in a game.
In baseball, the movements of players include the pitcher's pitching motion, the catcher's catching motion, the batter's batting motion, the fielders' defensive motion, the runners' running motion, etc. The overall movements of players in various sports can be obtained as tracking data.

Competition items are objects used in a match, such as balls in ball games, and objects used to hit the ball, such as bats and rackets. Other examples of competition items include bamboo swords in kendo and sabres in fencing. Furthermore, auxiliary objects used in the sport, such as protectors and uniforms, are also included in the category of competition items.
Tracking data of the movements of these playing objects can also be obtained as analysis data.

The EPTS data, sensing data SS, image data VD, stats data, historical data, and analysis data supplied from the data server 30 to the calculation unit 21 are information related to the match, and will be referred to as match information.
This match information, in terms of content, refers to information about the match itself and information about the players and teams playing in the match, and is distinguished from user information, which will be described later.

The information about the game itself includes the game date and time, the venue, the weather, the score, and information about the progress of the game (current inning, out count, etc.).
The player information includes information about each individual player, such as the player's performance in the match, past performance, images of the match, details of the play, and information about the player's movements.
Team information refers to information about the entire team, such as the team's match results, past results, game images, participating players, and substitute players.

It should be noted that it is not necessary for all of the "match information" such as the EPTS data, the sensing data SS, the image data VD, the stats data, the historical data, and the analysis data to be transmitted to the calculation unit 21. For example, by transmitting at least one of these to the calculation unit 21, the calculation unit 21 can perform processing for the match linkage service function 40.
In addition, the EPTS data and the analysis data may be generated by the calculation unit 21 based on the image data VD and the sensing data SS.

Information input from the terminal device 5 to the calculation unit 21 includes user information and information regarding purchases/listing/sharing/exchanges.

Here, the match information and user information will be explained.
Regarding the "match information" referred to in this disclosure, there are two sources of information:
・Information provided by the score data server 3
・Information obtained based on signals from the imaging device 10 and the sensor 4
The content of the information obtained from these two types of information sources may be different or the same.

When considering the content types of match information, in this embodiment it is classified into "fixed match information" and "match progress information."

Fixed match information is information that does not change as the match progresses.
For example, information such as the venue, start date and time of the game, season, team, starting members, weather information such as the weather and temperature at the start of the game, team performance before the game, players on the bench, player and team performance before the game (stats data, historical data), referee, type of game (official game, championship game, magic number game to win, etc.), player condition before the game, player profile (birthday, hometown, etc.), etc.

Match progress information is information obtained as the match progresses.
For example, there is the score of the game (game progress/result), innings/time/half in progress, weather changes during the game, player and team performance as the game progresses (stats data, historical data), participating players, information on each play during the game, player behavior during each play (EPTS information/sensing data SS/analysis data), image data VD of the game, achievement records, etc.

The user information refers to information that the main server 2 can obtain about the target user (the user who is logged in to the match linkage service function 40).
Here, the user information is classified into "user fixed information" and "user status information" in terms of content type.

The user fixed information is user information that does not change depending on the current match (the match that the logged-in user is watching).
For example, the information may include user account information, address information, attribute information, history information related to the match linkage service, information on owned items, etc. The attribute information may include basic information such as age, sex, place of residence, and email address, as well as information indicating a team or player that the user supports in relation to the match linkage service.
In addition, when a game is watched at a stadium, information such as the date and time, the name of the stadium, and seat information is added to the user fixed information successively.
It should be noted that some or all of the user fixed information may be input in advance for a user registered with the match linkage service function 40, and stored in the storage 23 or the like.

 ユーザ状況情報は、今回の試合や試合連携サービスに対するユーザの状況を示す情報である。例えばユーザの位置に関する情報、ユーザの行動やインタラクションに応じた情報等を含む。
 ユーザの位置に関する情報としては、例えばユーザの現在位置情報、ユーザが現在居る施設や店舗の情報等がある。スタジアムにおける座席情報、エリア情報などを含んでもよい。
 ユーザの行動やインタラクションに応じた情報としては、ユーザの移動状況、発声、アクション、ユーザ(又は観客全体)の歓声の情報、試合連携サービスによって提供されたクイズ等に対する回答入力や、今回の試合についてユーザが入力する応援するチームや選手を示す情報、試合中にユーザが入力する情報(例えば試合連携サービスが提供するクイズについての操作情報、参加情報、回答情報等)などがある。
The user situation information is information that indicates the situation of the user with respect to the current match or the match-linked service, and includes, for example, information related to the user's location, and information according to the user's actions and interactions.
The information relating to the user's location includes, for example, the user's current location information, information about the facility or store where the user is currently located, etc. It may also include seat information in a stadium, area information, etc.
Information corresponding to the user's behavior and interaction includes the user's movement status, vocalizations, actions, cheers from the user (or the entire audience), answers to quizzes provided by the match collaboration service, information entered by the user regarding the current match indicating the team or player they are supporting, and information entered by the user during the match (for example, operation information, participation information, answer information, etc. regarding the quizzes provided by the match collaboration service).
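Likewise, only as a hypothetical sketch (the names below are not defined in this disclosure), the two content types of user information could be represented as follows.

```python
from dataclasses import dataclass, field

@dataclass
class UserFixedInfo:
    """User information that does not change with the current match (hypothetical fields)."""
    account_id: str
    attributes: dict                                       # e.g. age, place of residence, supported team
    owned_items: list[str] = field(default_factory=list)
    history: list[dict] = field(default_factory=list)      # service usage / attendance history

@dataclass
class UserStatusInfo:
    """User information describing the user's situation for the current match (hypothetical fields)."""
    current_position: tuple[float, float] | None = None       # e.g. GPS latitude/longitude
    seat: str | None = None                                    # seat or area in the stadium
    interactions: list[dict] = field(default_factory=list)     # quiz answers, cheers, other inputs
```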

Note that information on the history of the user's actions and interactions may be successively added to the user fixed information as history information.

The calculation unit 21 inputs the above match information (fixed match information, match progress information) and user information (user fixed information, user status information) for the processing of the match linkage service function 40.

As processing functions for executing the match linkage service function 40, the calculation unit 21 includes a match information acquisition unit 41, a user information acquisition unit 42, a match analysis unit 43, a quiz generation unit 44, an item/content generation unit 45, a storage control unit 46, a locker processing unit 47, and a game processing unit 48. Note that the calculation unit 21 does not need to have all of these processing functions.

The match information acquisition unit 41 is a function for acquiring match information (fixed match information, match progress information). For example, it performs processing to acquire match information from the data server 30 before the start of the match and sequentially during the match.

The user information acquisition unit 42 is a function for acquiring user information (user fixed information, user status information). For example, it sequentially acquires user information about the logged-in user before the start of the match and during the match. At login, it also performs processing to read out the user fixed information of that user stored in the storage 23.

The match analysis unit 43 performs processing to analyze the match based on the match information and the user information and to determine the match situation. Depending on the determined match situation, the match analysis unit 43 also performs processing to determine the timing at which to request a user response regarding the service, and the timing at which to edit digital items or provide content.

The quiz generation unit 44 generates quizzes and questionnaires to be provided to the user and transmits them to the terminal device 5. The quiz generation unit 44 can also perform processing to request user interaction other than quizzes and questionnaires.

The item/content generation unit 45 generates digital items. The item/content generation unit 45 also performs generation processing for image content such as digest videos of matches, free-viewpoint images, and CG (computer graphics) images. The item/content generation unit 45 also performs processing such as generating information to be ultimately associated with digital items, as well as the generation and awarding of points, which will be described later.

In this embodiment, the item/content generation unit 45 in particular has functions as a determination unit 45a, a generation/editing unit 45b, and a provision processing unit 45c.

The determination unit 45a performs a determination process to determine, during a competitive match, whether a target play that is subject to key moment content generation has occurred, based on the match information. In this disclosure, a "key moment" refers to a moment during a match when a moving play, a memorable play, an exciting play, a play that gets the crowd excited, a play that achieves some kind of record, or the like occurs. Key moment content is image content, using CG or captured images, of the play at a key moment. The processing of the determination unit 45a can also be regarded as a process of determining the occurrence of a key moment. A play determined to be a key moment is referred to as a "target play."

For example, the determination unit 45a determines that a specific type of play, determined based on the match information, is a target play. The determination unit 45a can also determine the occurrence of a target play based on the match information and the user information.

The generation/editing unit 45b performs generation processing to generate content related to the target play (key moment content) in response to a key moment determination.
The generation/editing unit 45b also performs editing processing to generate edited key moment content in which content has been added or changed based on the user information of the user to whom the service is provided.

The provision processing unit 45c performs processing related to the provision of content. For example, when key moment content has been generated, the provision processing unit 45c performs processing to notify the terminal device 5 of the service destination that the content is available for viewing.
The provision processing unit 45c also generates a digital item corresponding to the edited key moment content, to which content has been added or changed based on the user information of the user to whom the service is provided, and performs processing to associate the edited key moment content and the digital item with the user.
The provision processing unit 45c also performs processing to play back the edited content in response to an operation on the digital item.
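As an illustrative sketch only of how the three sub-functions of the item/content generation unit 45 described above could be organized in software (the class, method, and field names are hypothetical and not part of this disclosure):

```python
class ItemContentGenerator:
    """Hypothetical sketch of the determination (45a), generation/editing (45b),
    and provision (45c) roles of the item/content generation unit 45."""

    def detect_target_play(self, match_progress: dict) -> dict | None:
        """45a: return the play that qualifies as a key moment, or None."""
        play = match_progress.get("last_play", {})
        if play.get("type") in {"home_run", "strikeout", "fine_play"}:
            return play
        return None

    def generate_km_content(self, target_play: dict) -> dict:
        """45b (generation): build key moment content (e.g. a CG clip spec) from the play."""
        return {"play": target_play, "clip": f"cg_{target_play.get('type', 'play')}"}

    def edit_for_user(self, km_content: dict, user_status: dict) -> dict:
        """45b (editing): add or change content based on the user information."""
        edited = dict(km_content)
        edited["overlay"] = "at_stadium" if user_status.get("in_stadium") else "remote"
        return edited

    def provide(self, edited_content: dict, user_id: str) -> dict:
        """45c: create a digital item and associate the edited content with the user."""
        return {"user_id": user_id, "item": {"kind": "moment_ball"}, "content": edited_content}
```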

The storage control unit 46 performs processing to store information in and read information from the storage 23 in relation to the match linkage service function 40.

The locker processing unit 47 is a function that performs processing related to providing a clubhouse as a virtual space including images of virtual lockers of general users and players (the locker room screen 140 described later). It performs processing for setting the locker provided to the user as the locker room screen 140 and for changing its appearance.

The game processing unit 48 is a function that performs processing related to game content provided on the screen of the terminal device 5. This game content is, for example, a game provided by an application program or a browser screen, and the types of game content are diverse.
However, this game has a function of producing changes linked to actual match information and user information.
For example, in-game parameters and game items may be changed depending on the match information, the user information, and also digital items related to the locker room screen 140.

For the calculation unit 21 of the main server 2, a shared service function 50 is also shown in addition to the match linkage service function 40 described above.
This shared service function 50 may include a function as an NFT (Non-Fungible Token) market.
The NFT market function is a service that allows users to list and sell NFT works they have acquired, or to purchase NFT works that have been listed.
For example, in this embodiment, the digital items generated by the match linkage service function 40 are edited according to the progress of the match and to user interaction, and ultimately become digital items unique to the user. These can then be listed or purchased as NFT works on the shared service function 50.

For this reason, the shared service function 50 is provided with a smart contract function 51 and a settlement function 52.
The smart contract function 51 is a function that automatically executes predetermined buying and selling processing for a transaction in accordance with predetermined conditions.
The settlement function 52 is a function that performs settlement processing for buying and selling.
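Purely as a hypothetical sketch of the kind of condition-based automatic execution performed by the smart contract function 51 and settled by the settlement function 52 (all names and rules below are assumptions for illustration):

```python
from dataclasses import dataclass

@dataclass
class Listing:
    item_id: str
    seller: str
    price: int

def execute_sale(listing: Listing, buyer: str, offered_price: int, buyer_balance: int) -> dict | None:
    """Hypothetical smart-contract-style rule: when the predetermined conditions are met
    (the offer matches the listed price and the buyer can pay), the sale is executed
    automatically; otherwise nothing happens."""
    if offered_price == listing.price and buyer_balance >= listing.price:
        return {
            "item_id": listing.item_id,
            "from": listing.seller,
            "to": buyer,
            "settled_amount": listing.price,   # amount that the settlement processing would handle
        }
    return None
```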

The shared service function 50 may also include a sharing service function 53 and an exchange service function 54.

The sharing service function 53 is a function that allows a user to share items, content, and the like that the user owns with others free of charge.

The exchange service function 54 is a function that allows a user to exchange items, content, and the like owned by the user for items, content, and the like owned by other people.

The match linkage service function 40 can list digital items of some kind on such a shared service function 50 so that users can purchase them.

Users can also use the match linkage service function 40 to list digital items they have acquired, to purchase digital items listed by others, to share them with others, or to exchange them for other users' digital items.

Note that the functional configuration in FIG. 3 is just an example.
Some of the functions of the match linkage service function 40 can also be processed on the terminal device 5 side.
For example, some or all of the functions of the quiz generation unit 44, the match analysis unit 43, the item/content generation unit 45, and the game processing unit 48 may be executed outside the match linkage service function 40.
In other words, some of the functions of the calculation unit 21 may be provided on the terminal device 5 side. For example, a calculation unit including the item/content generation unit 45 may be provided in the terminal device 5.

The shared service function 50 also does not have to be provided. For example, processing in which the match linkage service function 40 itself sells the generated digital items and the like to users is also conceivable.

The configuration of the information processing device 70 used in the information provision system 1 described above with reference to FIGS. 1, 2, and 3 will now be described. For example, the main server 2, the terminal device 5, and the EPTS data generator 12 in FIG. 2 can each be realized by the information processing device 70 shown in FIG. 4.
The information processing device 70 can be configured as, for example, a dedicated workstation, a general-purpose personal computer, a mobile terminal device, or the like.

The CPU 71 of the information processing device 70 shown in FIG. 4 executes various processes according to a program stored in the ROM 72 or in a non-volatile memory unit 74 such as an EEP-ROM (Electrically Erasable Programmable Read-Only Memory), or a program loaded from the storage unit 79 into the RAM 73. The RAM 73 also stores, as appropriate, data and the like necessary for the CPU 71 to execute the various processes.

The image processing unit 85 is configured as a processor that performs various types of image processing. For example, it is a processor that can perform any of image generation processing for video clips and the like, image analysis processing for captured images and the like, generation processing for animation images and CG images, DB (Data Base) processing, image effect processing, EPTS data generation processing, and so on.

This image processing unit 85 can be realized by, for example, a CPU separate from the CPU 71, a GPU (Graphics Processing Unit), a GPGPU (General-purpose computing on graphics processing units), an AI (artificial intelligence) processor, or the like.
Note that the image processing unit 85 may be provided as a function within the CPU 71.

The CPU 71, the ROM 72, the RAM 73, the non-volatile memory unit 74, and the image processing unit 85 are interconnected via a bus 83. An input/output interface 75 is also connected to this bus 83.

An input unit 76 consisting of operators and operating devices is connected to the input/output interface 75. For example, various operators and operating devices such as a keyboard, a mouse, keys, a dial, a touch panel, a touch pad, and a remote controller are assumed as the input unit 76.
A user operation is detected by the input unit 76, and a signal corresponding to the input operation is interpreted by the CPU 71.

A display unit 77 such as an LCD (Liquid Crystal Display) or organic EL (Electro-Luminescence) panel, and an audio output unit 78 such as a speaker, are also connected to the input/output interface 75, either integrally or as separate units.

The display unit 77 performs various displays as a user interface. The display unit 77 is configured by, for example, a display device provided in the housing of the information processing device 70, or a separate display device connected to the information processing device 70.
The display unit 77 executes display of various images on the display screen based on instructions from the CPU 71. The display unit 77 also displays various operation menus, icons, messages, and the like, that is, a GUI (Graphical User Interface), based on instructions from the CPU 71.

For example, when the information processing device 70 is considered as the terminal device 5, the display unit 77 corresponds to the display unit 5a and performs screen display by the match linkage service function 40, such as screens including the key moment content and digital items described later.

A storage unit 79 composed of an SSD (Solid State Drive), an HDD (Hard Disk Drive), or the like, and a communication unit 80 composed of a modem or the like, may also be connected to the input/output interface 75.
For example, when the information processing device 70 is considered as the main server 2, the storage unit 79 can be considered as an example of the storage 23 in which information is stored by the storage control unit 46.
The communication unit 80 performs communication processing via a transmission path such as the Internet, as well as communication with various devices by wired/wireless communication, bus communication, and the like.

A drive 82 is also connected to the input/output interface 75 as required, and a removable recording medium 81 such as a flash memory, a memory card, a magnetic disk, an optical disc, or a magneto-optical disc is mounted as appropriate.
The drive 82 can read data files such as image files and various computer programs from the removable recording medium 81. The read data files are stored in the storage unit 79, and images and audio contained in the data files are output by the display unit 77 and the audio output unit 78. Computer programs and the like read from the removable recording medium 81 are installed in the storage unit 79 as necessary.

In this information processing device 70, software can be installed via network communication by the communication unit 80 or via the removable recording medium 81. Alternatively, the software may be stored in advance in the ROM 72, the storage unit 79, or the like.

For example, when the information processing device 70 is considered as the main server 2, the functions of the calculation unit 21 and the presentation control unit 22 described above are provided by software, and the processing of these functions is executed by the CPU 71 and the image processing unit 85.

The sensor unit 84 is shown as a part that outputs some kind of sensing data SS, such as an image sensor (camera), a gyro sensor, an acceleration sensor, or a position sensor. For example, when the information processing device 70 is realized as the terminal device 5, the sensor unit 84 is provided so that the user's actions, behavior, position, and the like can be detected.

<2. Screen and content examples of the match linkage service>
Examples of screens on the terminal device 5 realized in the match linkage service will be described.
The match linkage service is a service that provides digital items, image content, game content, virtual spaces, and the like related to the sport, in accordance with match information related to matches of a sport such as baseball.

FIG. 5 shows some examples of screens provided by the match linkage service function 40. For example, there are a top screen 100, a match screen 110, a cheering screen 120, a mini-game screen 130, a locker room screen 140, and so on. The user can view these screens by transitioning between them as desired.

The top screen 100 displays information on the match being played, images for screen transitions, images related to the logged-in user, and the like.

On the top screen 100 and the other screens described later, icons 101, 102, 103, 104, and 105 are displayed as selectable items.

The icon 101 is an icon for an operation to transition to the locker room screen 140 serving as the user's personal page. The icon 102 is an icon for an operation to transition to the cheering screen 120. The icon 103 is an icon for an operation to transition to the match screen 110. The icon 104 is an icon for an operation to transition to the mini-game screen 130. The icon 105 is an icon for an operation to transition to a setting screen (not shown).

These icons also indicate the current screen. For example, FIG. 6 shows an example of the locker room screen 140; in this case, the icon 101 corresponding to the locker room screen 140 is displayed in a manner different from the other icons. For example, in FIG. 6, the frame of the icon 101 is drawn thicker than the frames of the other icons for emphasis.

The match screen 110 in FIG. 5 is a screen on which digital items that change according to the progress of the match being watched, quizzes for predicting the match, and the like are provided to the user.
The digital items are, for example, items related to the sport; in the case of baseball, designs of a baseball, a bat, a glove, and the like are conceivable. For example, an item may have a design in which a team logo or the like is added to a ball.

The match screen 110 also displays the digital items, information about the match, and information about the user. For example, team/score information, point information, and the like are displayed. The point information is points for the user, that is, the user who has logged in to the match linkage service and is viewing the provided screens on the terminal device 5.
In the match linkage service, points are awarded to the user according to the match situation and the user situation. For example, points are awarded according to the user's inputs and actions, according to the results of the user's answers to quizzes, or according to the performance of players selected by the user.
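As an illustrative sketch only, the point awarding described above could be expressed as a simple rule-based calculation; the point values and field names below are hypothetical assumptions.

```python
def award_points(user_status: dict, match_progress: dict) -> int:
    """Hypothetical point calculation based on the match situation and the user situation."""
    points = 0
    # Points for user interactions (e.g. cheering inputs, quiz participation).
    points += 5 * len(user_status.get("interactions", []))
    # Bonus points for correct quiz answers.
    points += 20 * user_status.get("correct_quiz_answers", 0)
    # Points tied to the performance of the player the user selected.
    selected = user_status.get("selected_player")
    for play in match_progress.get("plays", []):
        if play.get("player") == selected and play.get("type") in {"hit", "home_run", "strikeout"}:
            points += 10
    return points
```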

The cheering screen 120 is a screen for fans to get excited cheering together; it provides communication between fans and allows them to share exciting moments.
Playback of the key moment content described later may also be performed on the cheering screen 120.

The mini-game screen 130 is a screen that provides game content to the user.
For example, game content such as a batting game and a pitching game is prepared, and the user can enjoy these games. In particular, in this case, the pitches and hits in the game are set based on tracking data of the actually played match, so that the user can experience, on the game screen, the pitches and the like of actual players.

The locker room screen 140 is a page on which a locker room is virtually provided as the user's home page. FIG. 6 shows the locker room screen 140.
The locker room screen 140 is prepared as a home page for each user who subscribes to the match linkage service, for example.
For example, the locker room screen 140 in FIG. 6 is a screen provided for the user displayed as the user name 202, and serves as the home page when that user logs in to the match linkage service.

On this locker room screen 140, a virtual locker 201 is displayed.
The locker 201 has, for example, a set basic shape, but is displayed in a different state for each user.
For example, various items 203 can be placed on the shelves, the floor, and the like inside the locker 201. The user can arbitrarily select an item 203 from among the items 203 the user owns and place it in an arbitrary location.

The items 203 include articles used in baseball, such as a bat, a helmet, a cap, spikes, and a uniform. The items 203 also include articles unrelated to baseball, such as casual clothes, plants, accessories, and food. There may also be articles prepared for fans of teams and players, such as player figures, team character logo goods, team mascot dolls, and medals. The match linkage service provides the items 203 as these various virtual articles, either for a fee or free of charge, so that each user can obtain them.

The user can, for example, purchase an item 203 in a virtual shop, obtain an item 203 as a prize for a correct quiz answer, or exchange items with other people. The items 203 obtained in this way can be placed in the user's own locker 201.

Here, the item 203A is an item representing the key moment content described later. When key moment content is acquired, the item 203A corresponding to that key moment content is also given to the user and can be placed in the locker 201.

The user may also be able to customize the color of the locker 201 itself, the arrangement of the shelves, and the like.

A poster 204 is also attached to the locker 201. This poster 204 functions as a display area for various images. For example, still images and videos of the user's favorite players, still images and videos posted by the user, and the like are displayed as the poster 204.

A virtual speaker 206 can also be placed inside the locker 201.
For example, the user can obtain the speaker 206, which is prepared as one of the items 203, by purchasing it from the item screen 250, or by having it presented according to quiz results, point rankings, match results, or the like.
When the user places the obtained speaker 206 in the locker 201, audio output is performed when that locker room screen 140 is viewed.

For example, if the user places the speaker 206 in the locker 201, audio or music can be played back and output on the terminal device 5 while the user is viewing that locker room screen 140.
For example, a player's theme song is played. Also, when another user is giving live commentary on the match, that commentary and analysis can be heard.

The locker room screen 140 also displays user information 205. For example, a like icon, the number of likes, a comment icon, the number of comments, a self-introduction of the user, and the like are displayed for that user.

In this way, a unique locker room screen 140 is formed for each user and placed within the virtual clubhouse.
Accordingly, a large number of virtual locker room screens 140 are formed by a large number of users.

Each logged-in user can also transition from the locker room screen 140 that is his or her own home page to the locker room screens 140 of other users or of players and view them.

Some of the items 203 placed on the locker room screen 140 can also be used in the game content. For example, when the user owns a special bat as an item 203 and uses that item 203 in the batting game, a batting ability parameter is increased in the game content.
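A minimal sketch, assuming a hypothetical "special bat" item identifier and boost factor, of how an owned locker item could be linked to an in-game parameter as just described:

```python
def batting_power(base_power: float, owned_items: list[str]) -> float:
    """Hypothetical example of linking an owned locker item to an in-game parameter:
    owning and using a special bat item raises the batting ability parameter."""
    boost = 1.2 if "special_bat" in owned_items else 1.0   # assumed boost factor
    return base_power * boost

# Usage sketch: a user who owns the special bat gets a stronger swing in the batting game.
print(batting_power(100.0, ["special_bat", "team_cap"]))  # -> 120.0
```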

Key moment content will now be described.
FIG. 7 is an example of a key moment notification screen 121. This is a screen displayed when, for example, a key moment occurs while the user is viewing the cheering screen 120 on the terminal device 5 while watching a match, and key moment content 300 (hereinafter abbreviated as "KM content 300") as shown in FIG. 8 and elsewhere has been generated.

Note that the key moment notification screen 121 may be displayed even when the current screen is a screen other than the cheering screen 120. Also, rather than taking over the entire screen, a notification may be given in a pop-up window or the like when a key moment occurs, and the key moment notification screen 121 as in FIG. 7 may be displayed when the user transitions to the cheering screen 120. These notification modes are merely examples.

On the key moment notification screen 121, a notification message 122, a close button 123, a view button 124, and the like are displayed.
The notification message 122 informs the user that a key moment has been added, that is, that KM content 300 representing the key moment has been generated and is available for viewing. The user can choose whether or not to view it.
If the user does not wish to view it, the user can simply close the key moment notification screen 121 with the close button 123. If the user wishes to view it, the user operates the view button 124. This starts playback of the KM content 300 on the terminal device 5.

FIGS. 8 and 9 show examples of KM content 300. These are examples in which the KM content 300 is image content expressing the target play of a key moment.

For example, FIG. 8 shows KM content 300 generated with a home run as the target play when the moment a certain player hits a home run is determined to be a key moment. For example, it is CG content that displays the trajectory of the home run ball in the stadium, the ball speed, the distance traveled, the date and time and inning of the play, the batter's name, and the like. For example, content is assumed in which, as a CG video, the batted ball lands in the stands while tracing its trajectory.

FIG. 9 shows KM content 300 generated with a strikeout pitch as the target play when a certain pitcher's strikeout scene is determined to be a key moment. The trajectory of the ball, the batter's swing, and the like are expressed as a CG video.
In this example, a watch button 310 is also displayed. The watch button 310 is an operator for the user to indicate that he or she was watching the target play of this KM content 300.

KM content expressing a target play determined to be a key moment in this way is generated, for example, by the calculation unit 21 of the main server 2, using the function of the item/content generation unit 45, based on data related to the target play in the match information. Furthermore, the calculation unit 21 can edit this KM content 300 by adding to or changing its content using user information. For the sake of explanation, KM content 300 that has been edited in this way is referred to as "edited KM content 300S."

For example, FIG. 10 shows an example of edited KM content 300S. This is an example in which the KM content 300 as in FIG. 8 has been edited by adding a spectator icon 301. For example, this is an edit corresponding to a user situation in which the user is watching the match at the stadium, and the icon in this example places the words "I watched this" on a design imitating a fan-shaped baseball field.
For example, for edited KM content 300S provided to a user who is presumed, from the position information in the user status information, to be at the stadium, a spectator icon 301 indicating that the user was actually watching on site is added in this way.

For example, such edited KM content 300S is made obtainable by the user, for example so that the user can collect it as an electronic possession on the match linkage service. In that case, a digital item is associated with the edited KM content 300S. For example, the calculation unit 21 generates an item 203A corresponding to the edited KM content 300S as shown in FIG. 11. The item 203A clearly indicates which target play its edited KM content 300S corresponds to, for example "Player XX's 12th home run." For example, the item 203A is a moment ball on which the content of the key moment is written together with a ball design, so that the user can tell what kind of key moment it represents.

The calculation unit 21 associates this item 203A and the edited KM content 300S with the user, placing them in a state of having been acquired by the user.
The calculation unit 21 then enables the user to display the item 203A on the locker room screen 140 that is the user's home page, on an item collection screen (not shown), or the like. The locker room screen 140 in FIG. 6 shows an example in which the item 203A is placed in the locker 201.

The user can play back the edited KM content 300S acquired as the item 203A at any time. For example, by operating the item 203A, the user can view the edited KM content 300S as shown in FIG. 10. This allows the user to recall the target play and, by means of the spectator icon 301, to remember that he or she was at the venue watching the target play.

The edited KM content 300S may be edited in the same way for all users to whom the content is provided, or may be edited in a way common to users with a certain attribute. Furthermore, the edited KM content 300S may be edited uniquely for each individual user.

Also, for example, the edited KM content 300S may be shown to the user from the first viewing, when the view button 124 in FIG. 7 is pressed; alternatively, the user may first be shown the KM content 300, after which the edited KM content 300S is generated according to the user's reaction and situation, and the user can then view the edited KM content 300S.
A condition for granting the edited KM content 300S to the user after the user first views the KM content 300 may also be set. For example, the edited KM content 300S may be granted to the user depending on whether the user performs a predetermined operation, or depending on what the user's situation is.
The user may also be allowed to acquire both the KM content 300 and the edited KM content 300S.
Various procedures for providing the KM content 300 and the edited KM content 300S to the user are conceivable.

Further examples of edited KM content 300S are shown below.
FIG. 12 is an example in which a spectator icon 301 has been added, for the case where the user was watching the match indirectly, such as via a broadcast or a distribution. For example, the icon in this example places the words "I watched this" on a design imitating a monitor screen.
For example, for edited KM content 300S provided to a user who is presumed, from the position information in the user status information, to be outside the stadium, a spectator icon 301 indicating such indirect viewing is added.

Note that the spectator icon 301 of FIG. 10 or FIG. 12 may be given to a user who has indicated his or her intention by operating, for example, the watch button 310 as in FIG. 9.
Alternatively, for example, when a user watching directly at the stadium can be determined from GPS (Global Positioning System) information, beacon information, ticket information, or the like, the spectator icon 301 as in FIG. 10 may be added without any operation by the user, while for other users the spectator icon 301 as in FIG. 12 may be added when the watch button 310 is operated.
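A minimal sketch of the icon selection logic described above, assuming hypothetical field names and icon identifiers:

```python
def choose_spectator_icon(user_status: dict, pressed_watch_button: bool) -> str | None:
    """Hypothetical sketch: a user identified as being at the stadium (by GPS, beacon,
    or ticket information) gets the on-site icon automatically; other users get the
    indirect-viewing icon only after pressing the watch button 310."""
    at_stadium = (
        user_status.get("gps_in_stadium")
        or user_status.get("beacon_in_stadium")
        or user_status.get("has_ticket_for_match")
    )
    if at_stadium:
        return "icon_watched_at_stadium"      # FIG. 10 style icon (assumed identifier)
    if pressed_watch_button:
        return "icon_watched_remotely"        # FIG. 12 style icon (assumed identifier)
    return None                               # no spectator icon is added
```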

FIG. 13 shows edited KM content 300S to which a commemorative icon 302 based on user information has been further added. For example, when the day of the match is the user's birthday, a commemorative icon 302 indicating the birthday is added. This allows the user to recall that the target play shown in that edited KM content 300S was a play on his or her birthday.

The commemorative icon 302 may be anything that can be determined from the user information, and may correspond to information input by the user to the match linkage service. For example, if the user has provided birthday information to the match linkage service, a commemorative icon 302 indicating the birthday can be added by referring to it.
In that sense, if the user information includes information on some anniversary meaningful to the user, for example a wedding anniversary or a child's birthday, a commemorative icon 302 corresponding to it can be added.

FIG. 14 is an example in which information on the user's location at the time of watching is used as the commemorative icon 302. For example, when the viewing location is not the stadium, a commemorative icon 302 indicating watching at home, watching at a venue such as a sports bar, watching at a public viewing, or the like is added. This allows the user to recall where he or she watched the target play.
In the case of watching at the stadium, the spectator icon 301 may be the on-site icon, and the commemorative icon 302 may additionally indicate the seat position or seat area (for example, behind the backstop, infield seats on the first base side, outfield seats on the left field side, and so on).

So far, edited KM content 300S to which the spectator icon 301 or the commemorative icon 302 has been added has been shown, but editing that changes the content of the KM content 300 itself may also be performed. For example, the KM content 300 as in FIG. 8 is shown at first, but when it is granted to the user, it is edited into a version with a changed viewpoint position. For example, while the KM content 300 in FIG. 8 is a CG image from the viewpoint of the infield seats on the first base side, if the user was watching from the infield seats on the third base side, the edited KM content 300S may be a CG image whose viewpoint position is the infield seats on the third base side, as in FIG. 15.
Also, for example, if the player of the target play is a player the user supports, the edited KM content 300S may be a CG image from that player's viewpoint.

FIGS. 16 and 17 show examples of editing with information unique to the user, for example using the user's avatar.
FIG. 16 is an example of edited KM content 300S in which a spectator icon 301 and an avatar 303 are added to the KM content 300.
FIG. 17 is an example of editing the content of the KM content 300 itself, in which the CG image of a player is replaced with the user's avatar 303.
In this way, it is also conceivable to add information unique to the user, or to change the image content using such unique information.
Such edited KM content 300S can be made unique to the user.

<3. Processing example>
Examples of processing by the main server 2 for realizing the processing related to the KM content 300 in the match linkage service as described above will be described with reference to FIGS. 18 to 23. These are examples of processing executed by the match linkage service function 40 of FIG. 3 in the calculation unit 21 of the main server 2, and in particular show processing related to the provision of content by the item/content generation unit 45.

The processing of the calculation unit 21 is processing executed by the CPU 71 when the information processing device 70 of FIG. 4 is configured as the main server 2.
However, a part of the processing may be performed on the terminal device 5 side. That processing is executed by the CPU 71 when the information processing device 70 of FIG. 4 is configured as the terminal device 5.
The processing of FIG. 18 is performed for each terminal device 5 of a user who is logged in to the match linkage service.

In processing the KM content 300, the calculation unit 21 monitors the start of the match in step S101 of FIG. 18. Since the KM content 300 is generated in response to key moments in the match, the calculation unit 21 starts processing from step S102 onward in response to the start of the match.

In step S102, the calculation unit 21 monitors the end of the match. While the match continues, the process proceeds to step S103.
In step S103, the calculation unit 21 performs key moment determination.
FIG. 19 shows an example of the key moment determination processing.

In the key moment determination, the calculation unit 21 acquires match progress information in step S150 and performs play determination, that is, determines what kind of play has just occurred.
For example, as pitching results and batting results, it recognizes the type of each individual play, such as a strikeout, a walk or hit-by-pitch, a hit, a double, a triple, a home run, an error, or a stolen base, as well as the players involved in that play.

In step S151, the calculation unit 21 determines whether or not the recognized play can be evaluated as a common key moment.
Here, a common key moment is a moment at which a play occurs that can generally be evaluated as a moving or exciting moment, not just for an individual user.
On the other hand, an individual key moment, described later, is a key moment determined for each individual user according to the user's attributes.

As common key moments, for example, plays in scoring scenes, memorable plays such as home runs, strikeouts, and the achievement of 2,000 career hits, fine plays, triple plays, and the like may be set in advance. Alternatively, in step S150, not only the match progress information but also information on the cheers of the spectators in the stadium (the loudness of the cheers, the sound of applause, etc.) may be input, and the calculation unit 21 may use the sound at the time of the play to determine a common key moment.

If the play is determined to be a common key moment, the calculation unit 21 proceeds to step S154 and determines that a key moment targeting all users logged in to the match linkage service has occurred.

If the play is not determined to be a common key moment, the calculation unit 21 proceeds to step S152 and checks the user information of the user being processed, for example determining the user's favorite team and the players the user supports. Then, in step S153, the calculation unit 21 determines whether or not an individual key moment for the user being processed has occurred.

For example, if there is user information indicating that the user supports player A, all of player A's at-bats may be treated as individual key moments for that user.
Also, if the user is a fan of team X, a scoring play or a victory-deciding play by team X can be evaluated as an individual key moment for that user.
If such an individual key moment can be determined, the calculation unit 21 proceeds to step S155 and determines that a key moment for the user being processed has occurred.

If it is determined that the immediately preceding play is neither a common key moment nor an individual key moment, the calculation unit 21 ends the processing of FIG. 19, that is, the processing of step S103 in FIG. 18, from step S153.
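A minimal sketch of the determination flow of FIG. 19 described above, assuming simplified play and user records (the play types, field names, and cheer threshold are hypothetical):

```python
COMMON_KEY_PLAYS = {"home_run", "strikeout", "fine_play", "triple_play", "scoring_play"}

def determine_key_moment(play: dict, user_info: dict, cheer_level: float = 0.0) -> str | None:
    """Return "common", "individual", or None, following steps S150 to S155."""
    # Step S151: common key moment, judged from the play type or the crowd reaction.
    if play.get("type") in COMMON_KEY_PLAYS or cheer_level > 0.8:
        return "common"        # step S154: key moment for all logged-in users
    # Steps S152/S153: individual key moment, judged from the user's attributes.
    if play.get("player") in user_info.get("supported_players", []):
        return "individual"    # step S155: e.g. any at-bat of a supported player
    if play.get("team") == user_info.get("favorite_team") and play.get("type") == "win_deciding":
        return "individual"    # step S155: e.g. a victory-deciding play by the favorite team
    return None                # neither: the play is not treated as a key moment
```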

When it is determined by the above key moment determination that no key moment has occurred, the calculation unit 21 returns from step S104 of FIG. 18 to step S102.
When it is determined by the key moment determination that a key moment has occurred, the calculation unit 21 proceeds from step S104 to step S105 and performs processing to generate the KM content 300. For example, the calculation unit 21 creates a CG image corresponding to the content of the target play determined to be a key moment.
At this time, the calculation unit 21 may generate one type of KM content 300 common to all users for the target play, or may generate a plurality of types of KM content 300 for the target play for each user or group of users.

In step S106, the calculation unit 21 notifies the terminal device 5 of the user being processed that the KM content 300 is viewable. For example, it performs control to display the key moment notification screen 121 of FIG. 7 on the terminal device 5.
In the case of a common key moment, all logged-in users are notified; in the case of an individual key moment, the relevant users among the logged-in users are notified.

A user who sees the key moment notification screen 121 can select whether or not to view the KM content 300 with the view button 124.
If the user does not operate the view button 124, for example if the user operates the close button 123, the calculation unit 21 returns from step S107 to step S102.

If the user operates the view button 124, the calculation unit 21 proceeds to step S108 and determines whether or not the user being processed is an on-site user. In the case of an on-site user, that is, a user watching at the stadium, the calculation unit 21 proceeds to step S109 and causes the terminal device 5 to play back the KM content 300 as in FIG. 8. In this case, since it can be presumed that the user is watching the match directly, the watch button 310 as in FIG. 9 is not displayed.

If it is determined from the user's position information that the user is not an on-site user, the calculation unit 21 proceeds to step S110 and causes the KM content 300 to be played back. In this case, since it is unclear whether or not the user is watching the match, the watch button 310 as in FIG. 9 is displayed.

If the KM content 300 has been played back in step S109, or if the user operates the watch button 310 within a predetermined time after the KM content 300 is played back in step S110, the calculation unit 21 proceeds to step S120. This is the case where it can be presumed that the user was watching the key moment.
In this case, the KM content 300 is presented to the user as a memento of having been present at the target play (having witnessed it directly or indirectly) and of having viewed the KM content 300. Specifically, edited KM content 300S to be presented to the user is generated according to the user's situation.

An example of the generation process of the edited KM content 300S will be described with reference to FIG. 20 to FIG. 23.
FIG. 20 shows an example in which a spectator icon 301 is added as in FIG. 10.
In step S201, the calculation unit 21 sets a spectator icon 301 according to the user's watching style. That is, it determines whether the user is watching the game directly at the stadium or indirectly by viewing a broadcast or the like. Then, depending on the determination, it sets a spectator icon 301 as shown in FIG. 10 or a spectator icon 301 as shown in FIG. 12.

In step S202, the calculation unit 21 performs a process of adding a spectator icon 301 to the KM content 300, and sets the result as edited KM content 300S.
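
A compact sketch of steps S201 and S202 of FIG. 20 might look as follows; the icon file names and the dictionary-based content record are assumptions for illustration only.

```python
# Pick an icon for direct or indirect watching and attach it to the content record.
def select_spectator_icon(watched_at_stadium: bool) -> str:
    return "icon_stadium.png" if watched_at_stadium else "icon_broadcast.png"

def add_spectator_icon(km_content: dict, watched_at_stadium: bool) -> dict:
    edited = dict(km_content)                        # keep the original content untouched
    edited["spectator_icon"] = select_spectator_icon(watched_at_stadium)
    return edited                                    # corresponds to the edited KM content

if __name__ == "__main__":
    base = {"play": "home_run", "cg": "CG animation"}
    print(add_spectator_icon(base, watched_at_stadium=True))
    print(add_spectator_icon(base, watched_at_stadium=False))
```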

FIG. 21 shows an example in which the content details are also changed, as in FIG. 15.
In step S201, the calculation unit 21 sets a spectator icon 301 according to the user's watching style. In step S210, the calculation unit 21 changes the content details according to the user information. For example, information on the user's seat position and seat area is acquired, and the user's viewpoint position is determined accordingly. Then, the image content of the KM content 300 is changed to an image from the user's viewpoint position. Alternatively, if the user is a fan of a player involved in the play, it is possible to change the image to one that features the player, or to one that shows the play as seen from the player's viewpoint.

In step S202, the calculation unit 21 performs a process of adding a spectator icon 301 to the KM content 300 whose contents have been changed, and sets the result as the edited KM content 300S.
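
The viewpoint change of step S210 could be sketched as a lookup from seat area to camera position, as below; the seat-to-viewpoint table and the render_from helper are hypothetical stand-ins for a real CG renderer.

```python
# Map a seat area to a camera viewpoint and re-render the content from it.
SEAT_VIEWPOINTS = {
    "infield_1b": (30.0, 10.0, 12.0),   # (x, y, z) camera position, arbitrary units
    "infield_3b": (-30.0, 10.0, 12.0),
    "outfield":   (0.0, 110.0, 20.0),
}

def render_from(viewpoint: tuple[float, float, float]) -> str:
    return f"CG rendered from camera at {viewpoint}"

def personalize_viewpoint(km_content: dict, seat_area: str) -> dict:
    edited = dict(km_content)
    viewpoint = SEAT_VIEWPOINTS.get(seat_area)
    if viewpoint is not None:                        # only change when the seat is known
        edited["cg"] = render_from(viewpoint)
    return edited

if __name__ == "__main__":
    base = {"play": "home_run", "cg": "default CG"}
    print(personalize_viewpoint(base, "outfield"))
```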

FIG. 22 shows an example in which a commemorative icon 302 like that shown in FIG. 13 and FIG. 14 is added.
In step S201, the calculation unit 21 sets a spectator icon 301 according to the user's watching style. In step S220, the calculation unit 21 checks the user information and sets a commemorative icon 302. For example, a commemorative icon 302 based on a date or time, such as a birthday, or a commemorative icon 302 based on a location, such as the watching location, may be set. Depending on the user information, no commemorative icon 302 may be set.
In step S202, the calculation unit 21 performs a process of adding the spectator icon 301 and the commemorative icon 302 to the KM content 300, and sets the result as the edited KM content 300S.
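
One possible sketch of the anniversary check in step S220 is shown below; the anniversary dictionary and the icon names are assumptions, and a real implementation could of course draw on richer user information.

```python
from datetime import date

# Compare the play date with anniversaries held as user information and pick an icon.
def select_commemorative_icon(play_date: date, anniversaries: dict[str, date]) -> str | None:
    for label, day in anniversaries.items():
        if (day.month, day.day) == (play_date.month, play_date.day):
            return f"icon_{label}.png"
    return None                                     # no commemorative icon for this user

if __name__ == "__main__":
    user_days = {"birthday": date(1990, 8, 15), "wedding": date(2015, 6, 1)}
    print(select_commemorative_icon(date(2024, 8, 15), user_days))  # icon_birthday.png
    print(select_commemorative_icon(date(2024, 8, 16), user_days))  # None
```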

FIG. 23 shows an example in which an avatar is used, as in FIG. 16 and FIG. 17.
In step S201, the calculation unit 21 sets a spectator icon 301 according to the user's watching style. In step S230, the calculation unit 21 modifies the KM content 300 using the user's avatar. For example, editing is performed by adding an avatar 303 or by using the avatar 303 as the image content of the KM content 300.
In step S202, the calculation unit 21 performs a process of adding a spectator icon 301 to the KM content 300 that has been modified using the avatar, and sets the result as the edited KM content 300S.
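
The avatar editing of step S230 could be sketched as follows; the field names (overlay_avatar, scene_actor) are hypothetical and stand in for actual CG compositing.

```python
# Either overlay the user's avatar on the content or use it in place of a player.
def apply_avatar(km_content: dict, avatar_id: str, replace_player: bool = False) -> dict:
    edited = dict(km_content)
    if replace_player:
        edited["scene_actor"] = avatar_id           # avatar stands in for the player
    else:
        edited["overlay_avatar"] = avatar_id        # avatar added alongside the play
    return edited

if __name__ == "__main__":
    base = {"play": "home_run", "scene_actor": "Player A"}
    print(apply_avatar(base, "avatar_u123"))
    print(apply_avatar(base, "avatar_u123", replace_player=True))
```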

The calculation unit 21 performs the processes shown in FIGS. 20 to 23 in step S120 of FIG. 18.
In step S121, the calculation unit 21 sets the item 203A corresponding to the edited KM content 300S.

The calculation unit 21 performs the above steps S103 to S121 for each user during the match. Therefore, every time a key moment occurs during the match, the KM content 300 is provided, and edited KM content 300S is generated to be presented to the user in response to the user's viewing, etc.

When the end of the match is detected, the calculation unit 21 proceeds from step S102 to step S125, and performs a process of granting the edited KM content 300S and the item 203A to the user. This allows the user to obtain the item 203A and the edited KM content 300S related to the play of the key moment that the user witnessed as a spectator.
The user can place the acquired item 203A on the locker room screen 140 or view it on a collection screen (not shown).

FIG. 24 shows an example of processing in response to an operation on item 203A.
When an operation on the item 203A is detected, the calculation unit 21 proceeds from step S301 to step S302, and identifies the edited KM content 300S corresponding to the user's operation. Then, in step S303, the calculation unit 21 performs a playback process so that the edited KM content 300S is played back on the terminal device 5 of the user who performed the operation.
This allows the user to view the acquired edited KM content 300S at will.
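
A minimal sketch of the item-to-content lookup behind FIG. 24 is shown below; the in-memory registry and the grant/on_item_operated names are assumptions used only to illustrate the association between the item 203A and the edited KM content 300S.

```python
# Associate granted items with edited content and trigger playback on operation.
ITEM_TO_CONTENT: dict[str, dict] = {}               # item id -> {owner, content}

def grant(user_id: str, item_id: str, edited_content: dict) -> None:
    ITEM_TO_CONTENT[item_id] = {"owner": user_id, "content": edited_content}

def on_item_operated(user_id: str, item_id: str) -> str:
    entry = ITEM_TO_CONTENT.get(item_id)
    if entry is None or entry["owner"] != user_id:
        return "nothing to play"
    return f"playing {entry['content']['play']} for {user_id}"   # stands in for real playback

if __name__ == "__main__":
    grant("u1", "item_home_run_1", {"play": "home_run", "cg": "CG"})
    print(on_item_operated("u1", "item_home_run_1"))
    print(on_item_operated("u2", "item_home_run_1"))
```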

4. Summary and Modifications
The above embodiment provides the following advantages.

The information processing device 70 serving as the main server 2 in the embodiment includes a calculation unit 21 that performs a determination process for determining whether or not a target play for which KM content is to be generated has occurred during a competitive match, based on match information, which is information related to the match; a generation process for generating KM content 300 related to the target play in accordance with the determination of the occurrence of the target play; and an editing process for generating edited KM content 300S by editing the KM content 300 based on user information related to the user to whom the service is provided.
This allows the user to view the KM content 300 related to the play determined to be a key moment. The user can also obtain edited KM content 300S to which content has been added or changed, for example by adding a spectator icon 301. In other words, edited KM content 300S that shows the key moment and the user's situation at that time can be obtained. This allows the user to obtain memorable edited KM content 300S that reflects the user's situation as well as the content of the play that moved or excited the user during the game, providing the user with a new way to enjoy watching the game.
The determination process, generation process, and editing process may be shared among a plurality of information processing devices.

In the embodiment, an example has been described in which, upon determining that a target play has occurred, the calculation unit 21 generates KM content 300 related to the target play based on match information, and performs processing to notify the service destination terminal device 5 that the content can be viewed (see step S106 in Figures 7 and 18).
This allows the user to know the existence of the KM content 300 and view it during or after the match when the KM content 300 is generated for a play determined to be a key moment. For example, the user is notified immediately after a key moment, knows the existence of the KM content 300 for that play, and can enjoy the content about the special moment he or she just watched.

In the embodiment, an example is given in which the calculation unit 21 generates an item 203A corresponding to the edited KM content 300S to which content has been added or changed based on user information regarding the user to whom the service is provided, and performs a process of associating the edited KM content 300S and the item 203A with the user (see step S125 in Figure 18).
This allows the user to acquire the edited KM content 300S and the item 203A. For example, the user can acquire the one and only edited KM content 300S and the item 203A that express his/her own situation and key moment, and use them as mementos of the game he/she watched.

In the embodiment, an example has been given in which the calculation unit 21 performs processing for playing back the edited KM content 300S by operating the item 203A (see FIG. 24).
For example, by performing an operation to designate an item 203A, the edited KM content 300S corresponding to that item 203A is played on the terminal device 5. In this way, a user who has acquired the edited KM content 300S and the item 203A can view the edited KM content 300S at any time and recall key moments in a match or his/her own situation.

In the embodiment, an example has been given in which the calculation unit 21 generates the KM content 300 that expresses the movement of a person (player or referee) or a playing object related to a target play in the process of generating the KM content 300.
For example, human movements related to the target play include a player's pitching form, batting form, and base running, and an umpire's gestures. A possible example of the movement of a playing object is the trajectory of a ball or the like. The calculation unit 21 generates the KM content 300 as a CG animation expressing these movements based on tracking data of the players, ball, etc. Alternatively, the KM content 300 can be generated as a video, still image, slide show, etc. using captured images of the movements. Furthermore, the KM content 300 may be generated by combining captured images and CG. This makes it possible to provide the user with the KM content 300 that dynamically expresses the target play.
When there are animals participating in a race, such as horses in a horse race, it is possible to generate content that depicts the movements of the animals.
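
As an illustration of building a motion-based CG clip from tracking data, the following sketch expands sparse tracked ball positions into animation keyframes; the sampling and the linear interpolation are simplifying assumptions, not the data format of any particular EPTS system.

```python
# Turn sparse tracked positions into a denser list of animation frames.
def interpolate(p0, p1, t):
    return tuple(a + (b - a) * t for a, b in zip(p0, p1))

def keyframes(track: list[tuple[float, float, float]], steps_between: int = 3):
    """Linearly interpolate between consecutive tracked positions."""
    frames = []
    for p0, p1 in zip(track, track[1:]):
        for i in range(steps_between):
            frames.append(interpolate(p0, p1, i / steps_between))
    frames.append(track[-1])
    return frames

if __name__ == "__main__":
    tracked_ball = [(0.0, 0.0, 1.8), (9.0, 0.2, 1.5), (18.4, 0.1, 0.9)]  # metres
    for frame in keyframes(tracked_ball):
        print(frame)
```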

In the embodiment, an example has been given in which the calculation unit 21 performs editing processing on the KM content 300 to add an image representing that the user has been watching the target play to the content.
For example, for a user who is watching a game at a stadium or a user who is watching a game via television broadcast or network distribution, a spectator icon 301 indicating that the user has been watching the play is added to the KM content 300.
This allows the user to recognize, when viewing the edited KM content 300S later, that the play is one that he or she watched in real time. Furthermore, ownership of the edited KM content 300S serves as a record and proof of the fact that the user witnessed a memorable play, making the edited KM content 300S valuable digital content for the user.

In the embodiment, an example is given in which the calculation unit 21 performs editing processing on the KM content 300 to add an image indicating whether the user was watching the target play directly or indirectly to the content as an image representing that the user was watching the target play (see Figures 10 and 12).
For example, a spectator icon 301 with a different design is added to the KM content 300 depending on whether the user is watching the game directly at the stadium or indirectly via television broadcasting, network distribution, or the like. In this way, the edited KM content 300S can serve as a record of the manner in which the user watched the game.

In the embodiment, an example is given in which the calculation unit 21 performs editing processing on the KM content 300 to add an image to the content representing that the user was watching the target play based on the user's location information or in response to the user performing a specified operation related to the content (see Figure 9).
For example, when it can be estimated from the user's location information that the user was in the stadium when the target play occurred, or when the user indicates that he or she was watching the game by, for example, pressing the watch button 310 on the content, the spectator icon 301 is added to the KM content 300. This makes it possible to generate edited KM content 300S indicating that the user was watching the game.

In the embodiment, an example was given in which the calculation unit 21 performs editing processing on the KM content 300 to add an image representing user information related to the date and time of occurrence of the target play to the content based on user information (see Figure 13).
For example, if the day on which the target play occurred is the user's birthday, a commemorative icon 302 indicating the birthday is added to the KM content 300. In this way, the edited KM content 300S can record the date and time on which the user witnessed the target play. For example, it can also evoke memories of watching a game for the user.
As user information related to the date and time of the target play, if various anniversaries such as wedding anniversaries and children's birthdays can be determined as user information, it is also possible to add commemorative icons 302 corresponding to those anniversaries and the times of the commemorations.

In the embodiment, an example has been given in which the calculation unit 21 performs editing processing on the KM content 300 to add an image representing the viewing location of the target play to the content based on the user information (see FIG. 14).
A spectator icon 301 that indicates where the user was watching the game when the target play occurred is added to the KM content 300. For example, if the game was watched at a stadium, an image showing the seat position, seat area, etc. is added. Even if the game was watched via broadcast or distribution, an image that shows, for example, watching the game at home, watching the game at a public viewing, or watching the game at a specific establishment such as a sports bar is added. In this way, the edited KM content 300S can record where the user himself was watching the game. For example, it can also be used to evoke memories of watching the game in the user's mind.

In the embodiment, an example has been given in which the calculation unit 21 performs editing processing on the KM content 300 to add an image representing the user to the content (see FIG. 16).
For example, a user's avatar 303, a photographic image, etc. are added to the KM content 300. This makes the edited KM content 300S unique, including the user's own image, and can increase its value to the user.

In the embodiment, an example has been given in which the calculation unit 21 performs editing on the KM content 300 to change the content details based on information about the user's viewing location (see FIG. 15).
For example, edited KM content 300S is generated by changing an image of a play in KM content 300 into an image with a viewpoint set according to the seat position of the user. This allows the user to save the play that he or she actually witnessed as edited KM content 300S with an image from the same viewpoint.

In the embodiment, an example was given in which the calculation unit 21 performs editing processing on the KM content 300 to change the content so as to replace an image of a player in an image representing the target play with an image representing the user himself or herself (see FIG. 17).
For example, an image of a player in a play in the KM content 300 is replaced with the user's avatar 303. This allows the user to obtain edited KM content 300S in which the user himself/herself appears to be participating in the play.

The various editing processes described in the embodiment can be combined. For example, the content change based on the viewpoint position of the KM content 300, the addition of the spectator icon 301, the addition of the commemorative icon 302, the addition of the avatar 303, and the content change based on the avatar may all be performed, or some of them may be combined.
It is also possible to add a plurality of types of commemorative icons 302 or to add a user's own icon. The content change is not limited to a change in viewpoint or avatar, but may also be a change in the expression of the play itself.

In the embodiment, an example has been given in which the calculation unit 21 determines, in the key moment determination process, a specific type of play determined based on the match information as a target play (see step S103 in FIG. 18 and FIG. 19).
For example, the types of target play that trigger the generation of the KM content 300 are set according to the sport. In baseball, for example, a home run, a timely (run-scoring) hit, a stolen base, other scoring plays, a strikeout, a double play, a triple play, a fine play, and the like are set as the specific types of play. The calculation unit 21 then generates the KM content 300 by detecting these plays from the game progress information. This makes it possible to detect appropriate key moments and generate the KM content 300.

In the embodiment, an example has been given in which the calculation unit 21 determines the occurrence of a target play based on the match information and user information in the key moment determination process (see FIG. 19 ).
For example, a specific player or team that the user supports is determined from the user information. When a play by that player is detected from the game progress information, it is determined to be a target play, or when a specific type of play by the user's supported team is detected, it is determined to be a target play. This makes it possible to pick up key moments that are personally meaningful to the user and generate KM content 300 (300S) accordingly. For the user, the edited KM content 300S then becomes even more closely tied to the memories of watching the game.
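
A sketch of a determination that combines match information with user information might look like the following; the play-type set and the favourite-player/team fields are illustrative assumptions, not part of the embodiment.

```python
from dataclasses import dataclass

# A play becomes a key moment either because of its type (common) or because it
# involves the user's favourite player or team (individual).
@dataclass
class PlayEvent:
    kind: str
    player: str
    team: str

COMMON_KEY_PLAYS = {"home_run", "timely_hit", "double_play", "fine_play"}

def classify_key_moment(play: PlayEvent,
                        fav_players: set[str],
                        fav_teams: set[str]) -> str | None:
    if play.kind in COMMON_KEY_PLAYS:
        return "common"
    if play.player in fav_players or play.team in fav_teams:
        return "individual"                          # personally meaningful play
    return None                                      # not a key moment for this user

if __name__ == "__main__":
    p = PlayEvent(kind="stolen_base", player="Player B", team="Home")
    print(classify_key_moment(p, fav_players={"Player B"}, fav_teams=set()))  # individual
    print(classify_key_moment(p, fav_players=set(), fav_teams=set()))         # None
```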

In the embodiment, an example has been given in which the game content is such that the movement of a ball within a game space is set based on tracking data of the ball in a baseball or softball game.
The movement of the ball in the game content is expressed based on tracking data of the ball as pitched by the pitcher, tracking data of the batted ball, and the like. This makes it possible to provide game content that lets the user experience the ball movement of baseball or softball.

Although the match linkage service has been explained using the example of linking with a baseball game, the match linkage service can be applied to a variety of sports. Examples include soccer, American football, rugby, basketball, volleyball, tennis, and golf. Ball games are particularly suitable as sports that use the EPTS system. Of course, it can also be used for sports other than ball games. For example, it can be used for gymnastics, skating, track and field, swimming, judo, kendo, boxing, fencing, etc.

The judgment of a play as a key moment, the contents of the KM content 300, the edited KM content 300S, and the contents of the item 203A may be considered according to the sport. In soccer, it is assumed that a goal scene, a free kick scene, etc. are set as a key moment.
Item 203A may be a soccer ball for soccer, a boxing glove for boxing, or a racket for tennis.
Match information is likewise defined according to the sport.

As for match information, an example of baseball stats data has been given above, but the data of course differs depending on the sport. For soccer, the stats data includes the following:
・Goals: the number of goals scored during the season
・Shots: the number of shots taken
・Shots on target: the number of shots on goal (within the goal frame)
・Last passes: the number of passes that resulted in goal-scoring opportunities
・One-touch passes: the number of one-touch passes made
・Aerial duels: the number of contests for the ball in the air
・Tackles: the number of attempts to take the ball from the ball carrier
・Distance covered: the distance each player ran during the match

In the case of soccer, key moments include goals, corner kicks, free kicks, penalty kicks, goalkeeper saves, and contact plays.

In the embodiment, the processing has been mainly described as that of the calculation unit 21 of the main server 2, but all or part of the processing of the calculation unit 21 may be executed by a CPU on the terminal device 5. For example, the match linkage service function 40 of the calculation unit 21 may be installed in the terminal device 5 as an application program, and processing by the item/content generation unit 45, locker processing unit 47, game processing unit 48, etc. in FIG. 3 may be performed on the terminal device 5.
The terminal device 5 may be equipped with other functions, such as a match information acquisition unit 41, a user information acquisition unit 42, a match analysis unit 43, a quiz generation unit 44, a memory control unit 46, etc., and processing by these functions may be performed by the terminal device 5.

The program of the embodiment is a program that causes, for example, a CPU, a DSP (digital signal processor), an AI processor, or an information processing device 70 including these to execute the processes shown in FIGS. 18 to 24.
In other words, the program of the embodiment is a program that causes an information processing device to execute a determination process for determining whether or not a target play that is to be the subject of generation of KM content 300 has occurred during a competitive match, based on match information, which is information about the match; a generation process for generating KM content 300 related to the target play in accordance with the determination of the occurrence of the target play; and an editing process for generating edited KM content 300S in which content has been added to or changed based on user information about the user to whom the service is provided.

With such a program, the information processing device 70 constituting the information provision system 1 of the embodiment can be realized, for example, in a computer device, a mobile terminal device, or other device capable of executing information processing.

Such a program can be recorded in advance in an HDD serving as a recording medium built into a device such as a computer device, or in a ROM within a microcomputer having a CPU.
Alternatively, the program may be temporarily or permanently stored (recorded) on a removable recording medium such as a flexible disk, a CD-ROM (Compact Disc Read Only Memory), an MO (Magneto Optical) disk, a DVD (Digital Versatile Disc), a Blu-ray Disc (registered trademark), a magnetic disk, a semiconductor memory, a memory card, etc. Such removable recording media may be provided as so-called package software.
Furthermore, such a program can be installed in a personal computer or the like from a removable recording medium, or can be downloaded from a download site via a network such as a LAN (Local Area Network) or the Internet.

Furthermore, such a program is suitable for a wide range of provision of the information processing device 70 constituting the information provision system 1 of the embodiment. For example, by downloading the program to mobile terminal devices such as smartphones and tablets, imaging devices, mobile phones, personal computers, game devices, video devices, PDAs (Personal Digital Assistants), etc., these devices can function as the information processing device 70 constituting the information provision system 1 of the present disclosure.

Note that the effects described in this specification are merely examples and are not limiting, and other effects may also be present.

The present technology can also be configured as follows.
(1)
a determination process for determining whether or not a target play for which content is to be generated has occurred during a competitive match, based on match information that is information related to the match;
a generation process for generating content related to the target play in response to a determination of the occurrence of the target play;
an editing process for generating edited content by editing the content based on user information related to a user who is a service recipient;
An information processing device comprising a calculation unit that performs the above.
(2)
The information processing device according to (1) above, wherein, in response to determining that the target play has occurred, the calculation unit generates content related to the target play based on match information and performs a process of notifying a service destination device that the content can be viewed.
(3)
The information processing device according to (1) or (2) above, wherein the calculation unit generates a digital item corresponding to edited content to which content has been added or changed based on user information regarding the user to whom the service is provided, and performs a process of associating the edited content and the digital item with the user.
(4)
The information processing device according to (3) above, wherein the calculation unit performs a process of playing back the edited content in response to an operation on the digital item.
(5)
The information processing device according to any one of (1) to (4) above, wherein the generation process generates content expressing a movement of a person, an animal, or a playing object related to the target play.
(6)
The information processing device according to any one of (1) to (5) above, wherein the editing process includes editing the content by adding an image representing that the user has been watching a target play.
(7)
The information processing device described in any one of (1) to (6) above, wherein the editing process includes editing the content by adding an image indicating whether the user was watching the target play directly or indirectly to the content as an image representing that the user was watching the target play.
(8)
The information processing device according to any one of (1) to (7) above, wherein the editing process includes editing the content by adding an image representing that the user was watching the target play, based on the user's location information or in response to the user performing a predetermined operation related to the content.
(9)
The information processing device according to any one of (1) to (8) above, wherein the editing process performs editing to add an image expressing user information related to a date and time of occurrence of a target play to the content based on user information.
(10)
The information processing device according to any one of (1) to (9) above, wherein the editing process performs editing to add an image representing a viewing location of a target play to the content based on user information.
(11)
The information processing device according to any one of (1) to (10) above, wherein the editing process includes editing for adding an image representing the user to the content.
(12)
The information processing device according to any one of (1) to (11) above, wherein the editing process performs editing to change content details based on information about a user's viewing location.
(13)
The information processing device according to any one of (1) to (12) above, wherein the editing process involves editing to change content details so as to replace an image of a player in an image representing a target play with an image representing the user himself/herself.
(14)
The information processing device according to any one of (1) to (13) above, wherein in the determination process, a specific type of play determined based on match information is determined to be the target play.
(15)
The information processing device according to any one of (1) to (14) above, wherein the determination process determines the occurrence of the target play based on match information and user information.
(16)
An information processing device,
a determination process for determining whether or not a target play for which content is to be generated has occurred during a competitive match, based on match information that is information related to the match;
a generation process for generating content related to the target play in response to a determination of the occurrence of the target play;
an editing process for generating edited content by editing the content based on user information related to a user who is a service recipient;
An information processing method for performing the above.
(17)
a determination process for determining whether or not a target play for which content is to be generated has occurred during a competitive match, based on match information that is information related to the match;
a generation process for generating content related to the target play in response to a determination of the occurrence of the target play;
an editing process for generating edited content by editing the content based on user information related to a user who is a service recipient;
A program for causing an information processing device to execute the above.

REFERENCE SIGNS LIST
1 Information provision system
2 Main server
3 Score data server
5 Terminal device
10 Imaging device
21 Calculation unit
22 Presentation control unit
45 Item/content generation unit
45a Determination unit
45b Generation/editing unit
45c Provision processing unit
70 Information processing device
71 CPU
203, 203A Item
300 Key Moment Content (KM Content)
300S Edited Key Moment Content (Edited KM Content)
301 Spectator icon
302 Commemorative icon
303 Avatar
310 Watch button

Claims (17)

a determination process for determining whether or not a target play for which content is to be generated has occurred during a competitive match, based on match information that is information related to the match;
a generation process for generating content related to the target play in response to a determination of the occurrence of the target play;
an editing process for generating edited content by editing the content based on user information related to a user who is a service recipient;
An information processing device comprising a calculation unit that performs the above.
The information processing device according to claim 1, wherein, in response to determining that the target play has occurred, the calculation unit generates content relating to the target play based on match information and performs a process of notifying a service destination device that the content can be viewed.
The information processing device according to claim 1, wherein the calculation unit generates a digital item corresponding to edited content in which content has been added or changed based on user information relating to the user to whom the service is provided, and performs a process of associating the edited content and the digital item with the user.
The information processing device according to claim 3, wherein the calculation unit performs a process of playing back the edited content in response to an operation on the digital item.
The information processing device according to claim 1 , wherein the generation process generates content expressing a movement of a person, an animal, or a playing object related to the target play.
The information processing device according to claim 1 , wherein the editing process includes editing the content by adding an image representing that the user has been watching a target play.
The information processing device according to claim 1 , wherein the editing process includes editing the content by adding an image indicating whether the user is directly watching or indirectly watching the target play as an image representing that the user is watching the target play.
The information processing device according to claim 1 , wherein the editing process includes editing the content by adding an image representing that the user was watching a target play based on the user's location information or in response to the user performing a predetermined operation related to the content.
The information processing device according to claim 1 , wherein the editing process includes editing for adding an image expressing user information related to a date and time of occurrence of a target play to the content based on user information.
The information processing device according to claim 1 , wherein the editing process includes editing for adding an image representing a viewing location of a target play to the content based on user information.
The information processing apparatus according to claim 1 , wherein the editing process includes editing for adding an image representing the user to the content.
The information processing device according to claim 1 , wherein the editing process includes editing for changing the content based on information about a viewing location of the user.
The information processing device according to claim 1 , wherein the editing process performs editing to change content details so as to replace an image of a player in an image representing a target play with an image representing the user himself/herself.
The information processing device according to claim 1 , wherein in the determination process, a specific type of play determined based on match information is determined to be the target play.
The information processing device according to claim 1 , wherein the determination process determines the occurrence of the target play based on match information and user information.
An information processing device,
a determination process for determining whether or not a target play for which content is to be generated has occurred during a competitive match, based on match information that is information related to the match;
a generation process for generating content related to the target play in response to a determination of the occurrence of the target play;
an editing process for generating edited content by editing the content based on user information related to a user who is a service recipient;
An information processing method for performing the above.
a determination process for determining whether or not a target play for which content is to be generated has occurred during a competitive match, based on match information that is information related to the match;
a generation process for generating content related to the target play in response to a determination of the occurrence of the target play;
an editing process for generating edited content by editing the content based on user information related to a user who is a service recipient;
A program for causing an information processing device to execute the above.
PCT/JP2024/029079 2023-08-30 2024-08-15 Information processing device, information processing method, and program Pending WO2025047451A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2023140071 2023-08-30
JP2023-140071 2023-08-30

Publications (1)

Publication Number Publication Date
WO2025047451A1 true WO2025047451A1 (en) 2025-03-06

Family

ID=94818926

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2024/029079 Pending WO2025047451A1 (en) 2023-08-30 2024-08-15 Information processing device, information processing method, and program

Country Status (1)

Country Link
WO (1) WO2025047451A1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007163568A (en) * 2005-12-09 2007-06-28 Nippon Telegr & Teleph Corp <Ntt> Digest scene information input device, input method, program for the method, and recording medium recording the program
JP2019501550A (en) * 2015-11-04 2019-01-17 ゴルフゾン カンパニー リミテッド Golf video information automatic generation system and golf video information automatic generation method
JP2020061729A (en) * 2018-10-04 2020-04-16 エヌシーソフト・コーポレイションNcsoft Corporation Method and apparatus for highlighting sports competition
JP2023070149A (en) * 2021-11-03 2023-05-18 狂點軟體開發股▲ふん▼有限公司 Location-based "metaverse" community system of real/virtual interaction constituted by combining real world and multiple virtual worlds


Similar Documents

Publication Publication Date Title
EP4636678A1 (en) Information processing device, information processing method, and program
EP4586180A1 (en) Information processing device, information processing method, and program
US20250209560A1 (en) Smart-venue wagering system and method for live events
JP2023054333A (en) Athletic training system and method
US9033781B2 (en) Designing a real sports companion match-play crowdsourcing electronic game
US8834303B2 (en) Arena baseball game system
US20120202594A1 (en) Simulated sports events utilizing authentic event information
KR101976354B1 (en) A system for recording and broadcasting sports games using a smart device
KR20140103033A (en) Augmented reality for live events
US8973083B2 (en) Phantom gaming in broadcast media system and method
JP6218139B2 (en) GAME MANAGEMENT DEVICE, GAME SYSTEM, AND PROGRAM
US11570511B2 (en) Composite video competition
JP6870143B1 (en) Support support system and support method
US9210473B2 (en) Phantom gaming in a broadcast media, system and method
US9399170B2 (en) Systems, methods, and computer program products for objective fantasy sporting contests
JP6140846B2 (en) Mini soccer game support system that combines online and offline
JP7010444B1 (en) Play recording video creation system
US20120142421A1 (en) Device for interactive entertainment
US20020107059A1 (en) Method of proceeding with a game while incorporating other elements into the actual game
WO2025047451A1 (en) Information processing device, information processing method, and program
WO2025047450A1 (en) Information processing device, information processing method, and program
US20020092027A1 (en) Automatic image retrieval system
JP2002101400A (en) Method for fetching other element to actual game and conducting game
WO2007013151A1 (en) Play log creating device and program for creating play log
WO2021183697A1 (en) Electronic video sports game

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 24859487

Country of ref document: EP

Kind code of ref document: A1