WO2020075594A1 - Information processing device and content file playback method - Google Patents
Information processing device and content file playback method
- Publication number
- WO2020075594A1 (PCT/JP2019/038958)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- event
- content file
- information
- unit
- line
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/45—Controlling the progress of the video game
- A63F13/49—Saving the game status; Pausing or ending the game
- A63F13/497—Partially or entirely replaying previous game actions
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/85—Providing additional services to players
- A63F13/86—Watching games played by other players
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/431—Generation of visual interfaces for content selection or interaction; Content or additional data rendering
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/47—End-user applications
- H04N21/472—End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content
- H04N21/4728—End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content for selecting a Region Of Interest [ROI], e.g. for requesting a higher resolution version of a selected region
Definitions
- The present disclosure relates to techniques for generating a content file and/or playing back the content file.
- Patent Document 1 discloses an information processing device that allows a user to select the type of sharing processing for game image data.
- This information processing device presents the user with a plurality of option GUIs for sharing image and sound data.
- One option GUI specifies a process of uploading recorded image/sound data of a played game from the auxiliary storage device to a video distribution server.
- Another option GUI specifies a process of live-streaming the image/sound data of the game being played via the video distribution server.
- Patent Document 2 discloses an information processing device for sharing feedback of user contents.
- This information processing device includes an activity acquisition unit that acquires an activity of a first user in a first scene of content, a scene extraction unit that extracts a second scene similar to the first scene from the same or other content, and a feedback diversion unit that reuses, in the second scene provided to a second user, the feedback generated in the first scene by the first user's activity.
- A large amount of content is uploaded to image sharing sites, so viewers cannot easily determine which content is interesting.
- A distribution user who uploads images to an image sharing site wants many viewers to watch the content, but the content is buried among a large number of other items and is not easily found. It is therefore desirable to realize a mechanism that automatically associates information of interest to viewers with the content, so that viewers can grasp the highlights of the content at a glance.
- The present disclosure therefore aims to provide a technique for generating event information included in a content file and/or a technique for presenting points of interest in the content to a viewing user.
- An information processing apparatus according to one aspect of the present disclosure includes: a playback unit that plays back a content file; an indicator processing unit that displays a playback position indicator in which the time length of the content file is represented by the length of a line from a start point to an end point and the playback position of the content file is represented by the position of a pointer on the line, so that the user can specify the playback position by moving the pointer along the line; and a storage unit that stores an index file including event information contained in the content file. Based on the index file, the indicator processing unit displays event marks representing the occurrence of events along the line.
- Another aspect of the present disclosure is a method of playing back a content file, including the steps of: playing back the content file; displaying a playback position indicator in which the time length of the content file is represented by the length of a line from a start point to an end point and the playback position is represented by the position of a pointer on the line, allowing the user to specify the playback position by moving the pointer along the line; and displaying, along the line and based on an index file containing event information included in the content file, event marks representing the occurrence of events.
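- Purely as an illustrative sketch (the disclosure specifies no implementation), the mapping between playback time and pointer/event-mark position on the indicator line described above could be computed as follows; all names are hypothetical:

```python
def pointer_x(playback_time_s, duration_s, line_start_x, line_end_x):
    """Map a playback time to a pointer position on the indicator line.

    The line's length represents the content file's total duration, so the
    pointer sits at the same fraction of the line as the playback position."""
    fraction = min(max(playback_time_s / duration_s, 0.0), 1.0)
    return line_start_x + fraction * (line_end_x - line_start_x)


def event_mark_positions(event_times_s, duration_s, line_start_x, line_end_x):
    """Return an x coordinate along the line for each event occurrence time."""
    return [pointer_x(t, duration_s, line_start_x, line_end_x)
            for t in event_times_s]
```

The inverse mapping (pointer x back to playback time) would let the user seek by dragging the pointer.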
- The information processing system 1 includes an information processing apparatus 10 operated by user A, who is a distributor, and information processing apparatuses 12b and 12c operated by users B and C, who are viewing users (hereinafter referred to as "information processing device 12" unless otherwise distinguished), and one or more servers, which are connected via a network 3 such as the Internet.
- the one or more servers include the management server 5, the first server 13, and the second server 16.
- The management server 5 and the first server 13 may be managed by one operating entity that provides a network service, while the second server 16 may be managed by another operating entity, for example, one whose business is video distribution.
- the information processing device 12 of the viewing user may be the same device as the information processing device 10 of the distribution user.
- the configuration around the information processing device 12 is the same as the configuration around the information processing device 10. Therefore, the configuration around the information processing device 10 will be representatively described below.
- The access point (hereinafter referred to as "AP") 8 has the functions of a wireless access point and a router; the information processing device 10 connects to the AP 8 wirelessly or by wire and is thereby communicably connected to the management server 5, the first server 13, and the second server 16 on the network 3.
- the input device 6 operated by the user A is connected to the information processing device 10 in a wireless or wired manner and outputs operation information by the user to the information processing device 10.
- When the information processing device 10 receives operation information from the input device 6, it reflects the information in the processing of the system software or application software, and the output device 4 outputs the processing result.
- In the embodiment, the application software is game software, the information processing device 10 is a game device that executes the game software, and the input device 6 is a device such as a game controller that supplies the user's operation information to the information processing device 10.
- The input device 6, which is a game controller, includes a plurality of input units such as push-type operation buttons, analog sticks capable of inputting analog amounts, and a rotary button.
- The auxiliary storage device 2 is a large-capacity storage device such as an HDD (hard disk drive) or flash memory, and may be an external storage device connected to the information processing device 10 by USB (Universal Serial Bus) or the like, or a built-in storage device.
- the output device 4 may be a television having a display that outputs images and a speaker that outputs audio.
- the output device 4 may be connected to the information processing device 10 by a wired cable or may be wirelessly connected.
- the camera 7 is a stereo camera and takes a picture of the space around the output device 4.
- Although FIG. 1 shows an example in which the camera 7 is attached to the upper part of the output device 4, it may instead be arranged beside the output device 4; in either case, it is placed at a position where it can photograph user A playing the game in front of the output device 4.
- The information processing device 10 uploads the image/sound file (content file) of the game played by user A to the moving image distribution server 17, which is the second server 16.
- the viewing users, users B and C can access the moving image distribution server 17 from the information processing device 12 to view the content file uploaded by the user A.
- the information processing system 1 functions as a content file distribution system.
- the management server 5 provides a network service for the information processing device 10 and the information processing device 12.
- the management server 5 manages a network account for identifying the user, and the user uses the network account to sign in to the network service.
- Each user can register game save data and virtual prizes (trophies) obtained during game play in the management server 5, and can send event information that occurred during game play to the first server 13.
- The video distribution server 17, which is the second server 16 in the embodiment, is operated by an entity other than the provider of the network service, but viewing users can watch content files uploaded to the video distribution server 17 via the network service.
- As will be described later, viewing users have the advantage of being able to use the index file generated by the first server 13 when viewing a content file provided from the video distribution server 17.
- the information processing device 10 detects the activity of the user A who is viewing the content and transmits it as event information to the first server 13.
- Here, user A viewing the content while playing the game means that user A is watching the game images and listening to the game sounds of his or her own play.
- the activity of the user A may be, for example, utterance during the game play, a large body movement, or a large movement of the input device 6.
- The activity automatically detected by the information processing apparatus 10 is set to an action from which it can be inferred that user A was excited during the game play.
- the first server 13 collects the activity of the user A as event information during the game play.
- the first server 13 generates an index file that summarizes the event information collected during the game play.
- The information processing device 10 may transmit to the first server 13 not only user A's activities but also other event information from which excitement in the game play can be inferred.
- the event to be transmitted may be a game event set for the game.
- FIG. 2 shows an external configuration of the upper surface of the input device.
- the user grips the left grip 78b with the left hand and the right grip 78a with the right hand to operate the input device 6.
- a direction key 71, analog sticks 77a and 77b, and four types of operation buttons 76 are provided on the upper surface of the housing of the input device 6.
- the ○ button 72 is marked with a red circle
- the × button 73 is marked with a blue cross
- the □ button 74 is marked with a purple square
- the △ button 75 is marked with a green triangle
- a function button 80 is provided between the two analog sticks 77a and 77b.
- the function button 80 is used to turn on the power of the input device 6 and simultaneously activate the communication function for connecting the input device 6 and the information processing device 10.
- When the function button 80 is pressed while the main power of the information processing device 10 is off, the information processing device 10 accepts the connection request transmitted from the input device 6 also as an instruction to turn on the main power, and the main power of the information processing device 10 is turned on.
- the function button 80 is also used to cause the information processing device 10 to display the home screen.
- a touch pad 79 is provided in a flat area between the direction key 71 and the operation button 76.
- The touch pad 79 also functions as a push-down button that sinks when the user presses it and returns to its original position when released.
- the SHARE button 81 is provided between the touch pad 79 and the direction key 71.
- the SHARE button 81 is used to input an instruction from the user to the OS or system software in the information processing device 10.
- the OPTIONS button 82 is provided between the touch pad 79 and the operation button 76.
- the OPTIONS button 82 is used to input an instruction from the user to an application (game) executed in the information processing device 10.
- the input device 6 has a motion sensor including a triaxial angular velocity sensor and a triaxial acceleration sensor.
- the input device 6 transmits the sensor data and operation information of various input units to the information processing device 10 in a predetermined cycle.
- FIG. 3 shows a hardware configuration of the information processing device 10.
- The information processing device 10 includes a main power button 20, a power ON LED 21, a standby LED 22, a system controller 24, a clock 26, a device controller 30, a media drive 32, a USB module 34, a flash memory 36, a wireless communication module 38, a wired communication module 40, a subsystem 50, and a main system 60.
- the main system 60 includes a main CPU (Central Processing Unit), a memory and a memory controller that are main storage devices, and a GPU (Graphics Processing Unit).
- the GPU is mainly used for arithmetic processing of game programs. These functions may be configured as a system on chip and formed on one chip.
- the main CPU has a function of starting the OS and executing the application installed in the auxiliary storage device 2 under the environment provided by the OS.
- the subsystem 50 includes a sub CPU, a memory that is a main storage device, a memory controller, and the like, and does not include a GPU.
- the sub CPU operates even while the main CPU is in the standby state, and its processing function is limited in order to keep the power consumption low.
- the sub CPU and the memory may be formed on separate chips.
- The main power button 20 is an input unit through which the user inputs an operation; it is provided on the front surface of the housing of the information processing device 10 and is operated to turn on or off the power supply to the main system 60 of the information processing device 10.
- the main power supply being in the ON state means that the main system 60 is in the active state
- the main power supply being in the OFF state means that the main system 60 is in the standby state.
- the power-on LED 21 lights up when the main power button 20 is turned on, and the standby LED 22 lights up when the main power button 20 is turned off.
- The system controller 24 detects pressing of the main power button 20 by the user. When the main power button 20 is pressed while the main power is off, the system controller 24 acquires the pressing operation as an "on instruction"; when the main power button 20 is pressed while the main power is on, the system controller 24 acquires the pressing operation as an "off instruction".
- the main CPU has a function of executing the game program installed in the auxiliary storage device 2 or the ROM medium 44, while the sub CPU does not have such a function.
- the sub CPU has a function of accessing the auxiliary storage device 2 and a function of transmitting / receiving data to / from the management server 5.
- the sub CPU is configured to have only such a limited processing function, and thus can operate with less power consumption than the main CPU.
- These functions of the sub CPU are executed when the main CPU is in the standby state. Since the subsystem 50 operates while the main system 60 is in the standby state, the information processing apparatus 10 according to the embodiment remains always signed in to the network service provided by the management server 5.
- the clock 26 is a real-time clock that generates current date and time information and supplies it to the system controller 24, the subsystem 50, and the main system 60.
- The device controller 30 is configured as an LSI (Large-Scale Integrated Circuit) that transfers information between devices, like a south bridge. As illustrated, devices such as the system controller 24, the media drive 32, the USB module 34, the flash memory 36, the wireless communication module 38, the wired communication module 40, the subsystem 50, and the main system 60 are connected to the device controller 30. The device controller 30 absorbs differences in the electrical characteristics and data transfer rates of the devices and controls the timing of data transfers.
- The media drive 32 is a drive device into which a ROM medium 44 recording application software such as games and license information is mounted; it drives the ROM medium 44 and reads programs and data from it.
- the ROM medium 44 is a read-only recording medium such as an optical disc, a magneto-optical disc, or a Blu-ray disc.
- the USB module 34 is a module that connects to an external device with a USB cable.
- the USB module 34 may be connected to the auxiliary storage device 2 and the camera 7 with a USB cable.
- the flash memory 36 is an auxiliary storage device that constitutes an internal storage.
- the wireless communication module 38 wirelessly communicates with the input device 6 using a communication protocol such as a Bluetooth (registered trademark) protocol or an IEEE 802.11 protocol.
- the wired communication module 40 performs wired communication with an external device and connects to the network 3 via the AP 8, for example.
- FIG. 4 shows functional blocks of the information processing device 10 that operates as a distribution device that uploads a content file to the video distribution server 17.
- the information processing device 10 includes a processing unit 100, a communication unit 102, and a reception unit 104.
- the processing unit 100 includes a game execution unit 110, a sound providing unit 112, an image processing unit 114, a recording processing unit 116, an activity acquisition unit 120, an activity detection unit 130, an event information generation unit 140, and an upload processing unit 142.
- Each element described as a functional block that performs various processes can be configured, in terms of hardware, by circuit blocks, memories, and other LSIs, and is realized, in terms of software, by system software or a game program loaded into memory. Therefore, it is understood by those skilled in the art that these functional blocks can be realized in various forms by hardware only, software only, or a combination thereof, and the present invention is not limited to any of them.
- the communication unit 102 receives the operation information that the user has operated the input unit of the input device 6, and also transmits the event information generated by the processing unit 100 to the first server 13 and the content file to the second server 16.
- the communication unit 102 is expressed as a configuration having the functions of the wireless communication module 38 and the wired communication module 40 shown in FIG.
- the reception unit 104 is provided between the communication unit 102 and the processing unit 100, and transmits data or information between the communication unit 102 and the processing unit 100. Upon receiving the operation information of the input unit provided in the input device 6 via the communication unit 102, the reception unit 104 supplies the operation information to the game execution unit 110 and / or the upload processing unit 142 of the processing unit 100. The reception unit 104 may supply the operation information of the input device 6 and the sensor data to the first motion acquisition unit 122.
- the game execution unit 110 executes game software (hereinafter, also simply referred to as “game”) to generate image data and sound data of the game.
- the function shown as the game execution unit 110 is realized by system software, game software, hardware such as GPU, and the like.
- the game is an example of an application, and the game execution unit 110 may execute an application other than the game.
- the game execution unit 110 performs a calculation process of moving the player character in the virtual space based on the operation information input by the user A to the input device 6.
- the game execution unit 110 includes a GPU (Graphics Processing Unit) that executes a rendering process, etc., and receives a calculation processing result in the virtual space to generate game image data from a viewpoint position (virtual camera) in the virtual space.
- the game execution unit 110 also generates game sound data in the virtual space.
- the image processing unit 114 provides the play image data to the output device 4, and the audio providing unit 112 provides the play audio data to the output device 4, so that the output device 4 outputs the play image and the play audio.
- the user A operates the input device 6 to play the game while viewing the content (that is, the play image and the play sound) output from the output device 4.
- the recording processing unit 116 records the play image data provided by the image processing unit 114 and the play audio data provided by the audio providing unit 112 in association with the recording time.
- the recording time may be an absolute time.
- the play image data and the play audio data are collectively referred to as “content data”.
- At a predetermined timing, for example when user A's play ends, the recording processing unit 116 generates a content file in which the recorded content data is associated with the recording time.
- the recording processing unit 116 may generate the content file in a predetermined compression format.
- the recording processing unit 116 records the content data in the auxiliary storage device 2 together with the recording time based on the explicit instruction from the user A or automatically.
- a ring buffer is used to record the content data, and when the recorded content data exceeds a predetermined capacity or a predetermined time, old data may be sequentially overwritten with new data.
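- As a purely illustrative sketch (not part of the disclosure), a fixed-capacity ring buffer of the kind described above, in which old data is overwritten by new data, could look like this; the class and method names are hypothetical:

```python
from collections import deque


class ContentRingBuffer:
    """Keeps only the most recent `capacity` content samples;
    the oldest samples are discarded as new ones arrive."""

    def __init__(self, capacity):
        # deque with maxlen silently drops the oldest entry when full.
        self.buffer = deque(maxlen=capacity)

    def record(self, timestamp, data):
        self.buffer.append((timestamp, data))

    def dump(self):
        """Return the retained (timestamp, data) pairs, oldest first."""
        return list(self.buffer)
```

With a capacity sized to, say, the last few minutes of play, only the most recent content data survives, matching the overwrite behavior described above.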
- the activity acquisition unit 120 acquires the activity of the user A who is viewing the content.
- The activity acquired by the activity acquisition unit 120 is set to an action from which it can be inferred that user A is excited, such as speaking during game play.
- the activity acquisition unit 120 includes a first motion acquisition unit 122, a second motion acquisition unit 124, and a voice acquisition unit 126.
- the activity detection unit 130 detects whether or not the activity acquisition unit 120 has acquired the activity of the user.
- In the embodiment, acquisition of an activity means the occurrence of an event. Since an activity is defined as an action from which user A's excitement can be inferred, the activity detection unit 130 does not detect an activity when, for example, user A is playing the game while remaining still.
- the activity detection unit 130 has a first motion detection unit 132, a second motion detection unit 134, and a voice detection unit 136.
- the first motion acquisition unit 122 periodically acquires the data of the motion sensor of the input device 6 from the reception unit 104.
- the data of the motion sensor is data obtained by detecting the motion of the input device 6.
- the first motion detection unit 132 calculates the amount of movement of the input device 6 from the acquired sensor data, and detects whether the amount of movement of the input device 6 exceeds a predetermined threshold value.
- the calculated motion amount may include the acceleration, moving speed, moving amount, etc. of the input device 6.
- The amount of movement is an index for evaluating whether the input device 6 is being moved vigorously or significantly; if it exceeds a predetermined threshold, the first motion detection unit 132 detects that the input device 6 is being moved vigorously or significantly. When the first motion detection unit 132 detects that the amount of movement of the input device 6 exceeds the predetermined threshold, it stores the occurrence time of a first motion event. When it detects that the amount of movement has fallen below the predetermined threshold, it stores the end time of the first motion event. After the first motion event ends, the first motion detection unit 132 supplies its occurrence time and end time to the event information generation unit 140.
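- Purely as an illustrative sketch (the disclosure defines no algorithm), the threshold-crossing logic above, which records an event's occurrence time when the movement amount rises above the threshold and its end time when it falls back below, might look like this; names are hypothetical:

```python
def detect_motion_events(samples, threshold):
    """samples: list of (time, movement_amount) in time order.
    Returns (occurrence_time, end_time) pairs for spans where the
    movement amount exceeds the threshold."""
    events, start = [], None
    for t, amount in samples:
        if amount > threshold and start is None:
            start = t                    # event occurrence time
        elif amount <= threshold and start is not None:
            events.append((start, t))    # event end time
            start = None
    if start is not None:                # event still open at end of data
        events.append((start, samples[-1][0]))
    return events
```

The same scheme would apply to the second motion event (head movement) and, with audio energy in place of movement amount, to voice events.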
- the second motion acquisition unit 124 periodically acquires captured image data of the camera 7.
- The second motion acquisition unit 124 identifies user A in the captured image data and analyzes user A's movement by image analysis.
- the second motion detection unit 134 calculates the amount of movement from the analyzed movement of the user A, and detects whether the amount of movement of the user A exceeds a predetermined threshold value.
- the calculated amount of movement may include the acceleration, moving speed, moving amount, etc. of a specific part of the user A.
- the specific part may be the head
- the second motion acquisition unit 124 analyzes the movement of the head of the user A
- the second motion detection unit 134 calculates the amount of movement of the head.
- the amount of movement of the head is an index for evaluating whether or not the user A is moving violently or significantly, and if the amount of movement exceeds a predetermined threshold, the second motion detection unit 134 causes the head to be vigorous or large. Detect that it is moving.
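- As an illustrative sketch only (the disclosure does not specify how the movement amount is computed), a per-frame speed derived from successive head positions obtained by image analysis could serve as such an index; the function and names are hypothetical:

```python
def head_movement_amount(positions, dt):
    """positions: successive (x, y) head positions from image analysis,
    one per captured frame; dt: seconds between frames.
    Returns the per-step speed (distance per second) as a movement amount."""
    speeds = []
    for (x0, y0), (x1, y1) in zip(positions, positions[1:]):
        dist = ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5
        speeds.append(dist / dt)
    return speeds
```

Each returned speed could then be compared against the predetermined threshold to detect the second motion event.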
- When detecting that the amount of movement of the user's head exceeds the predetermined threshold, the second motion detection unit 134 stores the occurrence time of a second motion event.
- When detecting that the amount of movement of the user's head has fallen below the predetermined threshold, the second motion detection unit 134 stores the end time of the second motion event. After the second motion event ends, the second motion detection unit 134 supplies its occurrence time and end time to the event information generation unit 140.
- the voice acquisition unit 126 acquires the voice of the user A input to the microphone 9.
- the voice acquisition unit 126 acquires the user voice data.
- The voice data acquired by the voice acquisition unit 126 is supplied to the recording processing unit 116, which records the voice data uttered by user A together with the content data. At this time, the recording processing unit 116 records the content data and the user voice data acquired by the voice acquisition unit 126 in the auxiliary storage device 2 in association with the recording time. Then, at a predetermined timing, the recording processing unit 116 generates a content file in which the content data and the user voice data are associated with the recording time. After the game play ends, user A can upload the generated content file to the moving image distribution server 17.
- the voice detection unit 136 detects whether or not the user voice is acquired by the voice acquisition unit 126.
- The voice detection unit 136 may be a module that has a VAD (Voice Activity Detection) function.
- the voice acquisition unit 126 may cancel the game voice output from the output device 4 with an echo canceller to obtain the user voice data. Thereby, the voice detection unit 136 can efficiently detect the presence or absence of the user voice.
- When the voice detection unit 136 detects that the user's voice has been acquired, it stores the occurrence time of a voice event. When the voice detection unit 136 detects that the user's voice is no longer being acquired, it stores the end time of the voice event. After the voice event ends, the voice detection unit 136 supplies its occurrence time and end time to the event information generation unit 140.
- The voice detection unit 136 may also detect the voice volume during the voice event.
- In that case, the voice detection unit 136 may supply the voice volume information to the event information generation unit 140 together with the occurrence time and end time of the voice event.
- the recording processing unit 116 may record the chat voice data of the friend together with the content data.
- the voice detection unit 136 may detect whether or not the chat voice of the friend is present, and supply the occurrence time and the end time of the chat voice event to the event information generation unit 140.
- text data may be acquired by performing voice recognition on the detected voice.
- the voice recognition process may be performed by the information processing device 10 or may be performed by an external server.
- the voice detection unit 136 supplies the event information generation unit 140 with the text data that is the voice recognition result in addition to the occurrence time and the end time of the voice event.
- the activity detection unit 130 may detect activities other than the above.
- the first motion acquisition unit 122 may acquire the operation information of the input unit in addition to the sensor data of the input device 6 to acquire the movement of the player character during the game.
- the first motion detection unit 132 may detect an activity when the first motion acquisition unit 122 acquires a combination of a plurality of key operations.
- the first motion detection unit 132 may detect the activity when the input unit of the input device 6 is operated frequently.
- the first motion acquisition unit 122 acquires the operation information of the input unit, and the first motion detection unit 132 may detect an activity when it detects that the number of button operations or analog stick operations per unit time exceeds a predetermined number.
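The rate-based detection just described (activity fires when operations per unit time exceed a threshold) can be sketched with a sliding window. The window length and threshold values here are assumptions for illustration, not taken from the specification.

```python
from collections import deque

class OperationRateDetector:
    """Detects an activity when the number of input operations per unit time
    exceeds a predetermined number (a sketch; parameters are assumed)."""

    def __init__(self, window_seconds=1.0, threshold=10):
        self.window = window_seconds
        self.threshold = threshold
        self.timestamps = deque()

    def on_operation(self, t):
        """Record a button or analog-stick operation at time t (seconds).
        Returns True when this operation pushes the rate over the threshold."""
        self.timestamps.append(t)
        # drop operations that fell out of the sliding window
        while self.timestamps and t - self.timestamps[0] > self.window:
            self.timestamps.popleft()
        return len(self.timestamps) > self.threshold
```

A detector like this would sit in the first motion detection unit 132 and report an activity to the event information generation unit 140 whenever it returns `True`.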
- the event information generation unit 140 generates event information including the acquisition start time and acquisition end time of the user activity detected by the activity detection unit 130.
- the event information includes at least information indicating the type of activity (event type information).
- the voice event information may include a voice volume and a voice recognition result.
- the game maker may set a game event according to the play situation during the game, for example, an event in which the player character fights a boss, or an event in which a goal is scored in a soccer game.
- when a game event occurs during the game, a predetermined sound effect or image is played to liven up the game.
- the event information generation unit 140 of the embodiment is notified of the occurrence and end of the game event from the game execution unit 110, and generates event information associated with the occurrence time and the end time.
- the game execution unit 110 may notify the event information generation unit 140 of the occurrence time and the end time of the vibration signal when it outputs the vibration signal for driving the vibration element of the input device 6.
- FIG. 5 shows an example of event information generated by the event information generation unit 140.
- the event information generation unit 140 generates event information based on the detection result output from the activity detection unit 130 and the game event notified from the game execution unit 110.
- recording of the content data by the recording processing unit 116 is started from time 14:23:08.
- the event information generation unit 140 generates event information with at least a combination of event type information and event time information.
- the event type information of the first motion event is defined as E_ID1
- the event type information of the second motion event is defined as E_ID2
- the event type information of the audio event is defined as E_ID3
- the event type information of the game event is defined as E_ID4.
- at time t5, the first motion detection unit 132 notifies the event information generation unit 140 that a first motion event occurred between time t1 and time t5. Upon receiving this notification, the event information generation unit 140 generates event information combining the event type information (E_ID1) and the event time information (time t1 to time t5). Likewise, at time t10, the first motion detection unit 132 notifies the event information generation unit 140 that a first motion event occurred between time t9 and time t10, and the event information generation unit 140 generates event information combining the event type information (E_ID1) and the event time information (time t9 to time t10).
- at time t4, the second motion detection unit 134 notifies the event information generation unit 140 that a second motion event occurred between time t3 and time t4. Upon receiving this notification, the event information generation unit 140 generates event information combining the event type information (E_ID2) and the event time information (time t3 to time t4). Likewise, at time t13, the second motion detection unit 134 notifies the event information generation unit 140 that a second motion event occurred between time t12 and time t13, and the event information generation unit 140 generates event information combining the event type information (E_ID2) and the event time information (time t12 to time t13).
- at time t6, the voice detection unit 136 notifies the event information generation unit 140 that a voice event occurred between time t3 and time t6. Upon receiving this notification, the event information generation unit 140 generates event information combining the event type information (E_ID3) and the event time information (time t3 to time t6). Likewise, at time t11, the voice detection unit 136 notifies the event information generation unit 140 that a voice event occurred between time t8 and time t11, and the event information generation unit 140 generates event information combining the event type information (E_ID3) and the event time information (time t8 to time t11). Note that, if the voice volume information and the text data of the voice recognition result are also notified from the voice detection unit 136, the event information generation unit 140 includes them in the event information as well.
- the game execution unit 110 notifies the event information generation unit 140 that a game event has occurred at time t1 and that the game event has ended at time t7. Upon receiving this notification, the event information generation unit 140 generates event information that is a combination of the event type information (E_ID4) and the event time information (time t1 to time t7). The game execution unit 110 also notifies the event information generation unit 140 that a game event has occurred at time t12 and that the game event has ended at time t14. Upon receiving this notification, the event information generation unit 140 generates event information that is a combination of event type information (E_ID4) and event time information (time t12 to time t14).
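The event information described above is, at minimum, a combination of event type information (E_ID1 through E_ID4) and event time information, optionally extended with voice volume and recognized text. A minimal sketch of such a record, with field names assumed for illustration:

```python
# Event type identifiers as used in the embodiment.
E_ID1 = "E_ID1"  # first motion event
E_ID2 = "E_ID2"  # second motion event
E_ID3 = "E_ID3"  # voice event
E_ID4 = "E_ID4"  # game event

def make_event_info(event_type, start, end, **extra):
    """Combine event type information with event time information.

    `extra` carries optional fields such as `volume` or recognized `text`
    for voice events (these field names are assumptions, not taken from
    the specification).
    """
    info = {"event_type": event_type, "start": start, "end": end}
    info.update(extra)
    return info
```

For instance, the voice event from time t3 to time t6 would be represented as `make_event_info(E_ID3, "t3", "t6", volume=..., text=...)` before being sent to the collection server 14.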
- the event information generation unit 140 supplies the event information to the communication unit 102 each time the event information is generated, and the communication unit 102 transmits the event information to the collection server 14. That is, the communication unit 102 transmits the event information to the collection server 14 while the user A is viewing the content. At this time, the event information may be provided with information (content ID) for identifying the content data.
- the collection server 14 stores the event information transmitted from the communication unit 102 in association with the information (device ID) that identifies the information processing apparatus 10.
- FIG. 6 shows the functional blocks of the collection server 14.
- the collection server 14 includes a communication unit 202, a processing unit 210, and an event information storage unit 220.
- the processing unit 210 has an event information acquisition unit 212 and an event information provision unit 214.
- the communication unit 202 receives the event information from the information processing device 10, and the event information acquisition unit 212 acquires the event information.
- the event information acquisition unit 212 stores the acquired event information in the event information storage unit 220 in association with the device ID of the information processing device 10 together with the content ID.
- the communication unit 102 sequentially transmits the generated event information while the recording processing unit 116 is recording the content data.
- the event information acquisition unit 212 acquires a large amount of event information regarding the content.
- the event information acquired by the event information acquisition unit 212 is used as material for conveying the exciting points of the content to viewing users.
- when the user A finishes the game play, the recording of the content data and the user voice data by the recording processing unit 116 ends.
- the recording processing unit 116 generates a content file in which the recorded content data and user voice data are associated with the recording time.
- the recording processing unit 116 may create the content file in a predetermined compression format. Even after the user A finishes the game play, the functions of the activity acquisition unit 120, the activity detection unit 130, and the event information generation unit 140 may continue to be executed. After the game play ends, the second motion acquisition unit 124 and the voice acquisition unit 126 in the activity acquisition unit 120 and the second motion detection unit 134 and the voice detection unit 136 in the activity detection unit 130 remain active, and the event information generation unit 140 may continue to generate event information.
- the image data sharing process is started when the user A operates the SHARE button 81 provided on the input device 6, and the upload processing unit 142 generates an input image indicating options regarding image data sharing.
- FIG. 7 shows an example of an input screen showing options for sharing processing.
- the upload processing unit 142 generates an input image indicating options for sharing processing, and causes the image processing unit 114 to display the input image on the output device 4.
- three options regarding image sharing are shown.
- “Upload a video clip” is a GUI for designating uploading of a content file recorded in the auxiliary storage device 2 to the moving image distribution server 17, and “Upload screenshot” is a GUI for designating uploading of a screenshot image.
- the user A operates the input device 6 to move the selection frame 200, selects one of the GUIs, and presses the enter button to execute the selected image sharing process.
- the "Upload video clip" GUI is selected. After that, the user selects the content file to be uploaded to start the upload process.
- the upload processing unit 142 instructs the communication unit 102 to transmit the selected content file. As a result, the communication unit 102 uploads the content file to the video distribution server 17.
- the communication unit 102 transmits the time information indicating the start time and end time of the content file uploaded to the video distribution server 17 to the collection server 14 together with the device ID.
- the communication unit 102 may also transmit a content ID that identifies the content of the content file. In this case, the communication unit 102 transmits the device ID, the content ID, and the time information indicating the start time and the end time of the content file to the collection server 14.
- the event information providing unit 214 extracts, from the event information storage unit 220, the event information that is associated with the device ID and the content ID and whose time falls between the start time and the end time of the content file.
- the event information providing unit 214 provides the extracted event information to the index generation server 15.
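The extraction step above, selecting stored event records by device ID, content ID, and the uploaded file's time window, can be sketched as a simple filter. The flat record layout is an assumption; the actual storage format is not specified.

```python
def extract_event_info(store, device_id, content_id, start, end):
    """Extract, from stored event records, those associated with the given
    device ID and content ID and contained between the start time and the
    end time of the uploaded content file (record layout is assumed)."""
    return [
        e for e in store
        if e["device_id"] == device_id
        and e["content_id"] == content_id
        and start <= e["start"] and e["end"] <= end
    ]
```

The extracted records are what the event information providing unit 214 would hand to the index generation server 15.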
- FIG. 8 shows the functional blocks of the index generation server 15.
- the index generation server 15 includes a communication unit 232, a processing unit 240, and an index file storage unit 250.
- the processing unit 240 has an event information acquisition unit 242, an index generation unit 244, and an index provision unit 246.
- the communication unit 232 receives the event information from the collection server 14, and the event information acquisition unit 242 acquires the event information regarding the content file uploaded to the video distribution server 17.
- the event information acquisition unit 242 passes the acquired event information to the index generation unit 244.
- the index generation unit 244 generates an index file including the event information included in the content file uploaded to the video distribution server 17.
- the collection server 14 of the embodiment has a role of collecting event information from the information processing apparatus 10 (or the information processing apparatus 12) at all times, not only during the user's game play.
- the index generation server 15 is in charge of generating the index file in the first server 13, but the collection server 14 may generate the index file. That is, the first server 13 may be composed of a single server.
- the first server 13 and the second server 16 are operated by different operating entities, but they may be operated by the same operating entity. If both servers are operated by the same operating entity, the information processing device 10 need not transmit the time information indicating the start time and end time of the uploaded content file to the first server 13 when it uploads the content file to the second server 16; instead, the second server 16 may transmit the time information indicating the start time and end time of the content file to the first server 13.
- the content file is uploaded to the moving image distribution server 17, and the index generation server 15 generates the index file related to the content file.
- the viewing user can access the video distribution server 17 from the information processing device 12 to view the content file.
- FIG. 9 shows a functional block of the information processing device 12 that downloads and reproduces a content file from the video distribution server 17.
- the information processing device 12 includes a processing unit 100, a communication unit 102, and a reception unit 104.
- the processing unit 100 includes a voice providing unit 112, an image processing unit 114, a download processing unit 150, a reproduction unit 152, and an indicator processing unit 154.
- the hardware configuration of the information processing device 12 is the same as the hardware configuration of the information processing device 10 shown in FIG. 3, and FIG. 9 shows the functions used by the viewing user. Therefore, although not shown, the information processing device 12 has the functional blocks of the information processing device 10 shown in FIG. 4, and the information processing device 10 has the functional blocks of the information processing device 12 shown in FIG. 9.
- each element described as a functional block that performs various processes can be configured, in terms of hardware, by circuit blocks, memories, and other LSIs, and is realized, in terms of software, by system software and game programs loaded in memory. Therefore, it is understood by those skilled in the art that these functional blocks can be realized in various forms by hardware only, software only, or a combination thereof, and the present invention is not limited to any of them.
- the download processing unit 150 accesses the video distribution server 17 via the network service provided by the management server 5. By passing through the network service, the management server 5 can sort the captured images of the content stored in the moving image distribution server 17 and provide the sorted information to the information processing device 12 as a list screen.
- the image processing unit 114 displays on the output device 11 a selection screen for the content file uploaded to the video distribution server 17.
- FIG. 10 shows a list screen of uploaded contents.
- the management server 5 manages the content information uploaded in the video distribution server 17, and the download processing unit 150 acquires the content information from the management server 5 and generates a list screen.
- the content information may include a captured image of the game, a game title, and information identifying a distributor.
- the management server 5 may generate the content list screen by referring to the index files generated by the index generation server 15. For example, by referring to the index files, the contents may be arranged in descending order of the number of utterances of the distributor. The ordering is not limited to utterances; the contents may be arranged in descending or ascending order of the number of events included in the content. Further, when the recognition results of the user voice are included in the index files, the management server 5 may extract and arrange the contents that include a predetermined word. At this time, the management server 5 may extract the contents when the viewing user inputs the predetermined word as a search word.
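The two list-ordering strategies just described, ranking contents by the number of voice events in their index files and filtering by a search word in the recognized text, can be sketched as follows. The index-file layout (a per-content list of event records with `event_type` and optional `text`) is an assumption for illustration.

```python
def sort_contents_by_utterances(contents, index_files):
    """Order content entries by the number of voice events (E_ID3) recorded
    in each content's index file, most utterances first (layout assumed)."""
    def utterance_count(content):
        events = index_files.get(content["content_id"], [])
        return sum(1 for e in events if e["event_type"] == "E_ID3")
    return sorted(contents, key=utterance_count, reverse=True)

def search_contents_by_word(contents, index_files, word):
    """Extract contents whose voice-recognition results contain a search word."""
    hits = []
    for c in contents:
        events = index_files.get(c["content_id"], [])
        if any(word in e.get("text", "") for e in events):
            hits.append(c)
    return hits
```

A list screen built this way surfaces the most talked-over recordings first, which is the "descending order of utterances" arrangement mentioned above.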
- the viewing user operates the input device 6 to move the selection frame 300 onto the content information to be viewed.
- the selection frame 300 is placed on the content information of the game title “ABC soccer” uploaded by the user A.
- a download request designating the content file is transmitted to the video distribution server 17.
- the video distribution server 17 distributes the specified content file to the information processing device 12 of the viewing user.
- the management server 5 instructs the index generation server 15 to transmit the index file corresponding to the content file to the information processing device 12.
- the index providing unit 246 in the index generation server 15 reads the index file corresponding to the content file of the user A's “ABC soccer” from the index file storage unit 250, and transmits it via the communication unit 232 to the information processing device 12 of the viewing user.
- the download processing unit 150 downloads the content file from the video distribution server 17 and stores it in the auxiliary storage device 2. Further, the download processing unit 150 downloads the index file from the index generation server 15 and stores it in the auxiliary storage device 2.
- the content file may be distributed by streaming from the moving picture distribution server 17.
- the reproducing unit 152 reproduces the content file.
- the sound providing unit 112 outputs the reproduced sound from the output device 11, and the image processing unit 114 outputs the reproduced game image from the output device 11.
- the reproduced sound includes not only the game sound but also the sound uttered by the user.
- the image processing unit 114 secures an area on the screen of the output device 11 for displaying a playback position indicator for notifying a viewing user of the playback position.
- the indicator processing unit 154 displays a reproduction position indicator in the secured area.
- FIG. 11 shows an example of a game image reproduction screen.
- the image processing unit 114 displays the reproduced game image in the main display area, and displays the reproduction position indicator 310 indicating the reproduction position in the indicator display area 302.
- the reproduction position indicator 310 is displayed as a seek bar.
- the pointer 306 is moved along the line 304, and has a role of relatively indicating the reproduction position of the content file.
- the indicator processing unit 154 represents the time length of the content file by the line length from the start point to the end point, and also represents the reproduction position of the content file by the position of the pointer 306 with respect to the line 304.
- the reproduction position indicator 310 is a GUI that can be operated by the user, and the user can specify the reproduction position of the content file by moving the position of the pointer 306 along the line 304.
- the indicator processing unit 154 refers to the event information included in the index file and displays an event mark 308 representing the occurrence of the event along the line 304.
- the indicator processing unit 154 determines the position of each event mark 308 according to the time information included in the event information. Since the display area of the line 304 is finite, in order to arrange the event marks 308 along the time direction, the time axis needs to be divided into predetermined time units. This unit may be set depending on the length of the content file; for example, if the content length is 60 minutes, the time unit may be set to about 5 seconds, in which case an event mark 308 is displayed for each 5-second unit. The event mark 308 is also used as a jump point for reproduction.
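The bucketing step above, dividing the time axis into fixed units so the finite line 304 can host event marks, can be sketched as follows. This is an illustrative grouping, not the specification's algorithm; the event record layout is assumed.

```python
def bucket_events(events, content_length_seconds, unit_seconds=5):
    """Group event occurrence times into fixed time units along the seek bar.

    For a 60-minute file with 5-second units this yields 720 buckets; each
    bucket that contains at least one event would get an event mark 308,
    which also serves as a jump point for reproduction.
    """
    n_buckets = -(-content_length_seconds // unit_seconds)  # ceiling division
    buckets = {}
    for e in events:
        idx = int(e["start"] // unit_seconds)
        if 0 <= idx < n_buckets:
            buckets.setdefault(idx, []).append(e)
    return buckets
```

Each non-empty bucket maps to one horizontal position on the line 304; the events collected in a bucket are the candidates for the stacked marks described next.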
- the indicator processing unit 154 treats the above time unit (5 seconds) as one time zone, and when the same time zone includes a plurality of event occurrence periods, displays a plurality of event marks 308 side by side in a predetermined direction with respect to the line.
- FIG. 12 shows the details of the overlapped event marks 308a.
- the indicator processing unit 154 displays a plurality of event marks 308b, 308c, 308d, and 308e side by side in a predetermined direction with respect to the line 304 when a plurality of event occurrence periods are included in the same time period.
- the event mark 308 is stacked on the line 304 in a direction perpendicular to the line 304.
- the indicator processing unit 154 sets the display mode of the event mark 308 according to the type of event.
- the display mode may be, for example, a pattern or a color. In any case, it is preferable that the viewing user can recognize which type of event has occurred by the display mode of the event mark 308.
- all the event marks 308b, 308c, 308d, and 308e have different display modes, so the viewing user can recognize that four types of events occurred simultaneously during this time zone.
- here the heights of the event marks 308 in the stacking direction are equal, but weights may be set for the events so that the heights of their event marks 308 differ. Even for the same event type, the height of the event mark 308 may differ depending on the content of the event.
- since volume information is attached to a voice event, a tall event mark 308 may be assigned to a loud voice event and a short event mark 308 to a quiet one. Further, based on the voice recognition result, a tall event mark 308 may be assigned to a voice event that includes a word the viewing user is interested in.
- the viewing user can thus find the exciting points simply by looking at the accumulated height of the event marks 308.
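The weighted-height idea, volume-scaled marks for voice events and a base height for other types, can be sketched as follows. The concrete pixel heights and the linear scaling are assumptions for illustration.

```python
def mark_heights(events_in_zone, base_height=8, max_height=24):
    """Assign a display height to each event mark stacked over one time zone.

    Voice events (E_ID3) are scaled by their volume so louder utterances get
    taller marks; other event types keep the base height. Heights are in
    pixels (values assumed for illustration).
    """
    heights = []
    for e in events_in_zone:
        if e["event_type"] == "E_ID3" and "volume" in e:
            # linear interpolation between base and max height by volume (0..1)
            h = base_height + (max_height - base_height) * min(e["volume"], 1.0)
        else:
            h = base_height
        heights.append(h)
    return heights
```

Summing the returned heights gives the accumulated stack height that signals an exciting point at a glance.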
- when the cursor is placed on the event mark 308 of a voice event, the text data uttered by the user may be displayed.
- the viewing user can imagine the event that occurred during this time period by viewing the user's uttered text, and can immediately view the content image by moving the pointer 306 to that position.
- FIG. 13 shows another example of the game image reproduction screen.
- the image processing unit 114 displays the reproduced game image in the main display area, and displays the reproduction position indicator 310 indicating the reproduction position in the indicator display area 322 provided in a part of the main display area.
- the playback position indicator 330 is displayed as a circular seek bar.
- FIG. 14 is an enlarged view of the reproduction position indicator 330.
- the pointer 326 is moved along the circular line 324 and has a role of relatively indicating the reproduction position of the content file.
- the indicator processing unit 154 represents the time length of the content file by the line length from the start point to the end point, and also represents the reproduction position of the content file by the position of the pointer 326 with respect to the line 324.
- the reproduction position indicator 330 is a GUI that can be operated by the user, and the user can specify the reproduction position of the content file by moving the position of the pointer 326 along the line 324.
- the indicator processing unit 154 represents the reproduction position of the content by the position of the pointer 326 on the circular line 324, and sets the time advancing direction to the clockwise direction.
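The circular mapping just described, pointer position on the line 324 with time advancing clockwise, comes down to converting a playback position into an angle. A minimal sketch, assuming the start point sits at 12 o'clock and a y-down screen coordinate system (both assumptions, as the figure details are not given here):

```python
import math

def playback_angle(position_seconds, length_seconds):
    """Map a playback position to a pointer angle on the circular line.
    The angle grows from 0 to 360 degrees over the length of the file,
    so the time advancing direction is clockwise."""
    return 360.0 * (position_seconds / length_seconds)

def pointer_xy(position_seconds, length_seconds, radius, cx=0.0, cy=0.0):
    """Screen coordinates of the pointer 326 on a circle of the given radius
    centered at (cx, cy), with the y axis pointing down (screen-space
    convention; an assumption)."""
    theta = math.radians(playback_angle(position_seconds, length_seconds))
    # clockwise from 12 o'clock in a y-down coordinate system
    return (cx + radius * math.sin(theta), cy - radius * math.cos(theta))
```

With this mapping, a position one quarter of the way through the file places the pointer at the 3 o'clock position, consistent with clockwise advancement.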
- the indicator processing unit 154 refers to the event information included in the index file and displays the event mark 308 representing the occurrence of the event along the line 324.
- the indicator processing unit 154 determines the position of the event mark 308 according to the time information included in the event information.
- the arrangement method of the event mark 308 is the same as that of the reproduction position indicator 310.
- when the analog stick 77 is tilted, the indicator processing unit 154 may display the text data included in the event information located in the corresponding direction of the reproduction position indicator 330.
- FIG. 15 shows a display example of text. Since the indicator processing unit 154 displays the text in association with the event mark 308, the viewing user can easily recognize the event that has occurred.
- the indicator processing unit 154 may display an icon indicating the meaning expressed by the text in association with the event mark 308 instead of displaying the text. For example, the skull icon may indicate negative utterance content, and the heart icon may indicate positive utterance content.
- the indicator processing unit 154 may provide a UI for searching the text included in the event information.
- FIG. 16 shows a text search box.
- the indicator processing unit 154 may search the text included in the event information and display it in association with the event mark 308.
- an enlarged display function of the reproduction position indicator 330 may be provided.
- the portion of the line 324 in the direction in which the analog stick is tilted may be enlarged and displayed.
- the enlargement ratio may be changed by a predetermined button operation.
- the present disclosure has been described based on the embodiments. It should be understood by those skilled in the art that this embodiment is an exemplification, and that various modifications can be made to the combinations of the respective constituent elements and the respective processing processes, and that such modifications are also within the scope of the present disclosure.
- although a game is shown as the content in the embodiment, the content may include moving images other than games.
- although the first server 13 and the second server 16 are described in the embodiment as being managed by different operating entities, both servers may be managed by a single operating entity.
- the present disclosure can be used in the field of generating and / or reproducing content files.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Human Computer Interaction (AREA)
- Signal Processing (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Databases & Information Systems (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Information Transfer Between Computers (AREA)
- Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
In this information processing device according to the present invention, a reproduction unit (152) reproduces a content file. An indicator processing unit (154) represents the time length of the content file as a line length from a start point to an end point, represents the reproduction position of the content file as the position of a pointer with respect to the line, and displays a reproduction position indicator that allows a user to perform an operation to specify the reproduction position of the content file by moving the position of the pointer along the line. An auxiliary storage device (2) stores an index file including event information included in the content file. The indicator processing unit (154) uses the index file as a basis for displaying an event mark representing the occurrence of an event along the line.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2018-190877 | 2018-10-09 | ||
| JP2018190877A JP2020061629A (ja) | 2018-10-09 | 2018-10-09 | 情報処理装置およびコンテンツファイルの再生方法 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2020075594A1 true WO2020075594A1 (fr) | 2020-04-16 |
Family
ID=70163804
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2019/038958 Ceased WO2020075594A1 (fr) | 2018-10-09 | 2019-10-02 | Dispositif de traitement de l'information et procédé de lecture de fichiers de contenu |
Country Status (2)
| Country | Link |
|---|---|
| JP (1) | JP2020061629A (fr) |
| WO (1) | WO2020075594A1 (fr) |
Cited By (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2022097604A1 (fr) * | 2020-11-06 | 2022-05-12 | 株式会社ソニー・インタラクティブエンタテインメント | Dispositif de traitement d'informations, et procédé de prise en charge de création de rapport |
Families Citing this family (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2022079789A1 (fr) * | 2020-10-13 | 2022-04-21 | 任天堂株式会社 | Système, programme et procédé de traitement d'informations |
| KR102569358B1 (ko) * | 2021-06-11 | 2023-08-22 | (주)크래프톤 | 게임 결과 요약 제공 방법 및 이를 적용한 디바이스 |
| JP2023104017A (ja) * | 2022-01-17 | 2023-07-28 | シャープ株式会社 | 情報処理装置及び情報処理システム |
Citations (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2001292402A (ja) * | 2000-04-07 | 2001-10-19 | Sony Corp | デジタルビデオ再生方法、デジタルビデオ再生装置及びデジタルビデオ録画再生装置 |
| JP2012008877A (ja) * | 2010-06-25 | 2012-01-12 | Konami Digital Entertainment Co Ltd | 操作受付装置、操作判別方法、および、プログラム |
- 2018-10-09: JP JP2018190877A patent/JP2020061629A/ja (active, Pending)
- 2019-10-02: WO PCT/JP2019/038958 patent/WO2020075594A1/fr (not_active, Ceased)
Patent Citations (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2001292402A (ja) * | 2000-04-07 | 2001-10-19 | Sony Corp | デジタルビデオ再生方法、デジタルビデオ再生装置及びデジタルビデオ録画再生装置 |
| JP2012008877A (ja) * | 2010-06-25 | 2012-01-12 | Konami Digital Entertainment Co Ltd | 操作受付装置、操作判別方法、および、プログラム |
Cited By (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2022097604A1 (fr) * | 2020-11-06 | 2022-05-12 | 株式会社ソニー・インタラクティブエンタテインメント | Dispositif de traitement d'informations, et procédé de prise en charge de création de rapport |
| JP2022075304A (ja) * | 2020-11-06 | 2022-05-18 | 株式会社ソニー・インタラクティブエンタテインメント | 情報処理装置および報告作成支援方法 |
| JP7431143B2 (ja) | 2020-11-06 | 2024-02-14 | 株式会社ソニー・インタラクティブエンタテインメント | 情報処理装置および報告作成支援方法 |
Also Published As
| Publication number | Publication date |
|---|---|
| JP2020061629A (ja) | 2020-04-16 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| JP7461174B2 (ja) | 共有インターフェースを介してアクセスされるミニゲーム | |
| US11565187B2 (en) | Method for sharing a portion of gameplay of a video game | |
| JP6665224B2 (ja) | 情報処理装置およびユーザをアプリケーションに参加させる方法 | |
| US10532290B2 (en) | Sharing recorded gameplay to a social graph | |
| CN103886009B (zh) | 基于所记录的游戏玩法自动产生为云游戏建议的小游戏 | |
| WO2020075594A1 (fr) | Dispositif de traitement de l'information et procédé de lecture de fichiers de contenu | |
| CN113852767B (zh) | 视频编辑方法、装置、设备及介质 | |
| US20170105029A1 (en) | Information processing device, information processing system, content image generating method, and content data generating method | |
| JP7078743B2 (ja) | 情報処理システム、情報処理装置およびコンテンツファイルの生成方法 | |
| JP7162730B2 (ja) | 情報処理装置 | |
| JP7091289B2 (ja) | 情報処理装置および入力装置 | |
| JP7374257B2 (ja) | 処理方法およびプログラム | |
| JP6779190B2 (ja) | ゲームコントローラ | |
| WO2017090096A1 (fr) | Système de commande de production créative | |
| JP2020010394A (ja) | 情報処理装置および検索結果取得方法 |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 19870097 Country of ref document: EP Kind code of ref document: A1 |
|
| NENP | Non-entry into the national phase |
Ref country code: DE |
|
| 122 | Ep: pct application non-entry in european phase |
Ref document number: 19870097 Country of ref document: EP Kind code of ref document: A1 |