
WO2014162757A1 - Information processing apparatus, tagging method, and program - Google Patents

Information processing apparatus, tagging method, and program

Info

Publication number
WO2014162757A1
WO2014162757A1 (PCT/JP2014/050829)
Authority
WO
WIPO (PCT)
Prior art keywords
content
position information
information
processing apparatus
control unit
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/JP2014/050829
Other languages
English (en)
Japanese (ja)
Inventor
淳己 大村
淳也 小野
誠司 鈴木
健太郎 木村
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sony Corp
Original Assignee
Sony Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Corp filed Critical Sony Corp
Publication of WO2014162757A1 publication Critical patent/WO2014162757A1/fr
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • G PHYSICS
    • G11 INFORMATION STORAGE
    • G11B INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B 27/00 Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B 27/10 Indexing; Addressing; Timing or synchronising; Measuring tape travel
    • G11B 27/34 Indicating arrangements
    • G PHYSICS
    • G11 INFORMATION STORAGE
    • G11B INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B 27/00 Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B 27/02 Editing, e.g. varying the order of information signals recorded on, or reproduced from, record carriers
    • G11B 27/031 Electronic editing of digitised analogue information signals, e.g. audio or video signals
    • G11B 27/034 Electronic editing of digitised analogue information signals, e.g. audio or video signals on discs
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N 21/432 Content retrieval operation from a local storage medium, e.g. hard-disk
    • H04N 21/4325 Content retrieval operation from a local storage medium, e.g. hard-disk by playing back content from the storage medium
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N 21/433 Content storage operation, e.g. storage operation in response to a pause request, caching operations
    • H04N 21/4334 Recording operations
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/45 Management operations performed by the client for facilitating the reception of or the interaction with the content or administrating data related to the end-user or to the client device itself, e.g. learning user preferences for recommending movies, resolving scheduling conflicts
    • H04N 21/4508 Management of client data or end-user data
    • H04N 21/4524 Management of client data or end-user data involving the geographical location of the client
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/80 Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N 21/83 Generation or processing of protective or descriptive data associated with content; Content structuring
    • H04N 21/84 Generation or processing of descriptive data, e.g. content descriptors
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/80 Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N 21/83 Generation or processing of protective or descriptive data associated with content; Content structuring
    • H04N 21/845 Structuring of content, e.g. decomposing content into time segments
    • H04N 21/8456 Structuring of content, e.g. decomposing content into time segments by decomposing the content in the time domain, e.g. in time segments
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 9/00 Details of colour television systems
    • H04N 9/79 Processing of colour television signals in connection with recording
    • H04N 9/80 Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback
    • H04N 9/82 Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback the individual colour picture signal components being recorded simultaneously only
    • H04N 9/8205 Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback the individual colour picture signal components being recorded simultaneously only involving the multiplexing of an additional signal and the colour video signal

Definitions

  • the present disclosure relates to an information processing apparatus, a tagging method, and a program.
  • Patent Document 1 discloses a technique in which metadata including the event occurrence time, event type, content, and the like of each scene included in a moving image that is content is acquired, and the metadata is assigned to the corresponding scene in the moving image as an event time tag.
  • Patent Document 2 discloses a technique in which a plurality of comments posted on a moving image via a network such as the Internet are collected, a scene of interest is extracted from the moving image based on the number of posted comments, and a keyword included in the comments is tagged to that scene of interest.
  • the present disclosure proposes a new and improved information processing apparatus, tagging method, and program capable of tagging content with a higher degree of freedom.
  • according to the present disclosure, there is provided an information processing apparatus including: a position information acquisition unit that acquires position information of an operating tool associated with the elapsed time during content reproduction; and a tagging unit that tags the content by adding the position information to the content as tag information.
  • there is also provided a tagging method including acquiring position information of the operating tool associated with the elapsed time during content playback, and tagging the content by adding the position information to the content as tag information.
  • in the above configuration, the position information acquisition unit acquires position information of the operating tool associated with the elapsed time during content reproduction, and the tagging unit tags the content by assigning the position information to it as tag information. Since the user can tag the content simply by moving the operating tool, tagging that reflects the user's preference is realized with a simpler operation.
  • FIG. 1 is a functional block diagram illustrating a schematic configuration of an information processing apparatus according to an embodiment of the present disclosure. An explanatory diagram for explaining an example of the tagging process according to this embodiment. An explanatory diagram for explaining a modification of the tagging process in this embodiment in which the playback speed of the content is controlled.
  • FIG. 10 is an explanatory diagram for explaining the association between the position information of the operating tool in the X-axis direction and the content playback speed in the modification of the tagging process shown in FIG. 3.
  • An explanatory diagram for explaining a modification of the tagging process in this embodiment in which fast-forward or rewind of the content is controlled. An explanatory diagram for explaining the production …
  • in the present embodiment, the position information of the operating tool associated with the elapsed time during content playback is acquired by detecting the position of the operating tool while controlling the playback state of the content. The content is then tagged by adding the position information to it as tag information.
  • in the following, the above-described series of processes, in which tag information is acquired and assigned to content, is referred to as the tagging process.
  • FIG. 1 is a functional block diagram illustrating a schematic configuration of an information processing apparatus according to an embodiment of the present disclosure.
  • the information processing apparatus 10 includes an input unit 110, a display unit 120, a storage unit 130, and a control unit 140.
  • the input unit 110 is an input interface for allowing a user to input information and commands related to various processing operations to the information processing apparatus 10.
  • the input unit 110 has a function of detecting the position of the operating tool and inputting the position information to the information processing apparatus 10.
  • the input unit 110 includes a sensor device for detecting the position of the operating body. The user can input the position information of the operating tool to the information processing apparatus 10 by moving the operating tool within the detection range of the sensor device of the input unit 110.
  • the sensor device included in the input unit 110 may be a device, such as a touch pad, that detects the position of the operating body on a plane; in this case, position information on a two-dimensional plane is input as the position information of the operating body.
  • alternatively, the sensor device included in the input unit 110 may be a device, such as a stereo camera or an infrared camera, that detects the position of the operating body in a space; in this case, position information in a three-dimensional space is input as the position information of the operating body.
  • the display unit 120 is an output interface that visually displays various types of information processed in the information processing apparatus 10 and processed results on a display screen.
  • the display unit 120 displays the contents of various contents (for example, moving images and still images) on the display screen under the control of the control unit 140. Further, the display unit 120 may display the locus of the position information of the operating tool input from the input unit 110 on the display screen.
  • the storage unit 130 is an example of a storage medium for storing various types of information processed by the information processing apparatus 10 and processed results.
  • the storage unit 130 stores content data processed by the information processing apparatus 10.
  • the storage unit 130 stores content data to which tag information is added, which is generated as a result of the tagging process performed by the control unit 140.
  • FIG. 2 is an explanatory diagram for explaining an example of a tagging process according to the present embodiment.
  • the input unit 110 includes a sensor device that detects the position of the operating body on a plane such as a touch pad, and is integrated with the display screen 210 of the display unit 120. That is, the input unit 110 and the display unit 120 constitute a so-called touch panel.
  • FIG. 2 illustrates a case where the operation body is a user's finger as an example of the operation body.
  • one scene of the moving image (image data included in the moving image data) is displayed on the display screen 210 as an example of the content.
  • an indicator 220 indicating the elapsed time (playback position) of the moving image being played back is also displayed on the display screen 210.
  • the horizontal direction is referred to as the X-axis direction and the vertical direction is referred to as the Y-axis direction based on the image displayed on the display screen 210.
  • a point 240 representing the contact point is displayed on the display screen 210.
  • the two-dimensional position information of the point 240 on the display screen 210 is input to the information processing apparatus 10.
  • a locus 250 of position information may be displayed on the display screen 210 as shown in FIG.
  • the position information of the operation tool is acquired as the coordinate value (X, Y) on the display screen 210.
  • among the position information of the operating tool, the position information in a first direction is used as tag information, and the position information in a second direction different from the first direction is used for controlling the playback state of the content.
  • for example, the first direction may be the Y-axis direction, that is, the vertical direction of the display screen, and the second direction may be the X-axis direction, that is, the horizontal direction of the display screen.
  • the playback position of the content or the playback speed of the content may be controlled according to the position information in the X-axis direction among the position information of the operating tool.
  • in this way, while the playback state is controlled by the position information in one direction, for example, the X-axis direction, the position information in the other direction, for example, the Y-axis direction, is acquired. The position information in the Y-axis direction is therefore acquired as position information of the operating tool associated with the elapsed time during content reproduction, and is used as tag information.
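The role division just described, one axis for playback control and the other for tag input, can be sketched in code. The following Python sketch is illustrative only and not part of the patent disclosure; the function name `handle_touch` and the normalization scheme are assumptions.

```python
def handle_touch(x, y, screen_width, screen_height):
    """Split a 2-D touch point into its two roles.

    X-axis position (second direction): controls the playback state,
    expressed here as a seek ratio in [0, 1] across the screen width.
    Y-axis position (first direction): becomes the tag value; larger
    values (toward the top of the screen) mean a higher preference.
    """
    seek_ratio = x / screen_width
    # Screen coordinates usually grow downward, so invert Y so that
    # "up" corresponds to a larger preference value.
    tag_value = 1.0 - (y / screen_height)
    return seek_ratio, tag_value

seek, tag = handle_touch(x=480, y=100, screen_width=960, screen_height=400)
# seek == 0.5 (halfway through playback), tag == 0.75
```

A real implementation would feed `seek_ratio` to the playback state control unit and `tag_value` to the tagging unit; the split itself is the point of the sketch.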
  • in the following, the present embodiment will be described on the assumption that the input unit 110 and the display unit 120 constitute a touch panel, the operating body is a finger, and the content data is moving image data; however, the present embodiment is not limited to this example.
  • the input unit 110 may have any configuration as long as the position of the operating body can be detected.
  • the operating body may be a mouse pointer operated by a mouse. The position of the mouse pointer on the display screen of the display unit 120 may be detected.
  • the input unit 110 may include a sensor device that detects the position of the operating tool in space, and the position of the user's hand may be detected as the operating tool.
  • the content data need not be moving image data; for example, any content data may be applied, such as music data or slide-show data in which still images are displayed in succession at predetermined intervals.
  • the control unit 140 controls the information processing apparatus 10 in an integrated manner, and performs various types of information processing in the tagging process according to the present embodiment.
  • the function and configuration of the control unit 140 will be described in more detail.
  • the control unit 140 includes a position information acquisition unit 141, a playback state control unit 142, a display control unit 143, and a tagging unit 144.
  • the position information acquisition unit 141 acquires the position information of the operating tool that is detected by the input unit 110 and is associated with the elapsed time during content playback.
  • the position information acquired by the position information acquisition unit 141 is the position information of the operating tool on the display screen of the display unit 120, and may be acquired as, for example, two-dimensional coordinates on the display screen.
  • the position information acquisition unit 141 may acquire the coordinate value (X, Y) corresponding to the point 240 that is a contact point of the operating tool with respect to the display screen 210 as the position information.
  • the position information acquisition unit 141 transmits the acquired position information of the operating tool to the reproduction state control unit 142, the display control unit 143, and the tagging unit 144.
  • the playback state control unit 142 controls the playback state of content in the information processing apparatus 10.
  • content playback state control means control of various operations related to content playback, including, for example, (normal) playback, stop, pause, fast-forward, rewind, high-speed playback, slow playback, and repeat playback.
  • the content playback state control includes control for playing back content from an arbitrary playback position, control for extracting and playing back a part of the content, and the like.
  • of the position information of the operating tool acquired by the position information acquisition unit 141, the position information in the first direction is used as tag information, and the position information in the second direction different from the first direction is used for controlling the playback state of the content.
  • the playback state control unit 142 may control the playback state of the content according to the position information of the operating tool in the X-axis direction of the display screen 210.
  • the playback state control unit 142 can associate the position information of the operating tool in the X-axis direction with the elapsed time during content playback, and play back the content at the playback position corresponding to that position information. That is, the X-axis coordinate value on the display screen 210 corresponds to the elapsed time during content reproduction, and the playback position of the content is sought as the position of the operating body in the X-axis direction on the display screen 210 changes. When associating the X-axis coordinate value with the elapsed time, the two can be associated such that time in the content elapses from the left to the right of the display screen 210, that is, as the X-axis coordinate value increases. Such an association makes seeking the playback position better match the user's intuition in moving the operating tool.
  • as the playback position is sought, the indicator 220 indicating the content playback position may also change according to the position information of the operating tool in the X-axis direction.
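The left-to-right correspondence between the X-axis coordinate and the elapsed time amounts to a linear mapping. The sketch below assumes the full screen width spans the full playback time; the function names and parameter choices are hypothetical, not from the patent.

```python
def x_to_elapsed(x, screen_width, duration_s):
    """Map an X coordinate to an elapsed playback time in seconds.
    Time increases from left to right, matching the user's intuition."""
    x = max(0, min(x, screen_width))  # clamp to the visible screen
    return duration_s * x / screen_width

def elapsed_to_x(elapsed_s, screen_width, duration_s):
    """Inverse mapping, usable for drawing the indicator 220."""
    return screen_width * elapsed_s / duration_s

# A 10-minute (600 s) moving image on a 960-px-wide screen:
assert x_to_elapsed(240, 960, 600) == 150.0   # quarter of the screen -> 150 s
assert elapsed_to_x(150.0, 960, 600) == 240.0
```

The inverse function lets the indicator track the seek position: moving the operating tool updates the elapsed time, and the indicator is redrawn at the corresponding X coordinate.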
  • the playback state control unit 142 controls the playback position of the content according to the position information of the operating tool in the X-axis direction has been described, but the present embodiment is not limited to such an example.
  • the playback state control unit 142 may perform other playback control on the content in accordance with the position information of the operating tool in the X-axis direction.
  • after the tagging process, the playback state control unit 142 may edit the content based on the tag information assigned to it and control playback of the edited content.
  • content editing processing using such tag information will be described in detail in <3. Specific example of content editing using tag information> below.
  • the playback state control unit 142 transmits information related to its content playback control to the display control unit 143.
  • the display control unit 143 controls driving of the display unit 120 and visually displays various types of information processed in the information processing apparatus 10 on the display screen of the display unit 120 in various formats such as text, tables, graphs, and images.
  • the display control unit 143 displays the content content on the display screen of the display unit 120.
  • the display control unit 143 displays an image included in the moving image that is the content on the display screen in accordance with the content reproduction state control by the reproduction state control unit 142.
  • the display control unit 143 displays a point corresponding to the position information of the operating tool acquired by the position information acquisition unit 141 on the display screen of the display unit 120. For example, in the example illustrated in FIG. 2, the display control unit 143 displays the point 240 on the display screen 210 at the position corresponding to the position information of the operating tool. Further, as shown in FIG. 2, the display control unit 143 may display the locus 250 of the position information of the operating tool on the display screen 210.
  • the tagging unit 144 tags the content by adding the location information acquired by the location information acquisition unit 141 to the content as tag information.
  • the tagging unit 144 uses, as tag information, the position information of the operating tool in the first direction among the position information acquired by the position information acquisition unit 141.
  • the tagging unit 144 uses position information of the operating tool in the Y-axis direction as tag information. More specifically, the tagging unit 144 may digitize the position information of the operating tool in the Y-axis direction and use the numerical value (for example, the coordinate value of the Y-axis) as tag information.
  • the position information of the operating tool is, for example, the coordinate values (X, Y) on the display screen 210.
  • as described above, the X-axis coordinate value in the position information of the operating tool corresponds to the playback position of the content, that is, to the elapsed time during playback of the content. Accordingly, the position information acquired by the position information acquisition unit 141 has a value (the Y-axis coordinate value) associated with the elapsed time during reproduction of the content, and the tagging unit 144 can therefore tag the content by using the position information of the operating tool as tag information.
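Combining the two coordinates, each input sample can be converted into a tag record that pairs the elapsed time (derived from the X coordinate) with a digitized preference value (derived from the Y coordinate). The following sketch is a hypothetical illustration; the record layout and the name `make_tag` are assumptions.

```python
def make_tag(x, y, screen_width, screen_height, duration_s):
    """Convert one touch sample into an (elapsed time, tag value) record.

    The X coordinate selects the playback position; the Y coordinate,
    inverted so that 'up' is larger, becomes the numeric tag value.
    """
    elapsed = duration_s * x / screen_width
    value = round(1.0 - y / screen_height, 3)
    return {"time_s": elapsed, "value": value}

# Digitizing a trajectory of touch samples over a 600 s moving image:
samples = [(0, 300), (480, 100), (960, 200)]
tags = [make_tag(x, y, 960, 400, 600) for x, y in samples]
# tags[1] == {"time_s": 300.0, "value": 0.75}
```

A list of such records, stored alongside the content data, is one plausible concrete form for the "tag information" the tagging unit 144 assigns.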
  • the tagging process described with reference to FIG. 2 is one example of the tagging process according to the present embodiment, and other tagging processes may also be performed. Such other tagging processes will be described in detail in <2. Modification of Tagging Process> below.
  • when performing the tagging process, the user inputs position information by bringing a finger 230 into contact with the display screen 210.
  • the X axis corresponds to the elapsed time (reproduction position) during content reproduction.
  • the value of the Y axis may be an index indicating, for example, the user's "preference level" for the content. For example, while keeping the finger 230 in contact with the display screen 210, the user moves the finger 230 from the left end to the right end of the display screen 210 to seek the playback position of the content, moving the finger 230 upward (in the direction in which the Y-axis coordinate value increases) in a desired scene and downward (in the direction in which the Y-axis coordinate value decreases) in a scene found less attractive.
  • in this way, the position information acquired by the position information acquisition unit 141 corresponds to the elapsed time during reproduction of the content and represents the user's preference level for each scene of the content.
  • the tagging unit 144 can therefore add a tag representing the user's preference level for each scene of the content by adding the position information to the content as tag information.
  • inputting position information serving as tag information is also referred to as inputting tag information.
  • while position information is being input, the display control unit 143 displays on the display screen 210 the image of the scene corresponding to the position information of the finger 230 in the X-axis direction.
  • when input of position information is interrupted, the playback state control unit 142 may control the playback state such that playback of the content is paused at the scene corresponding to the X-axis coordinate value of the last input position, and playback of the content is continued when position information is input again by the user.
  • thus, the position information need not necessarily be input continuously and may be interrupted partway. This allows the user to input position information while pausing playback of the moving image as necessary and referring to its thumbnail, so that a preference level that better reflects the user's intention can be input for each scene of the moving image.
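The pause-on-interruption behavior can be modeled as a small state machine: lifting the finger pauses playback at the last sought position, and touching again resumes. The class and method names below are illustrative assumptions, not names from the patent.

```python
class PlaybackController:
    """Pauses playback while no position input arrives; resumes on re-input."""

    def __init__(self):
        self.paused = False
        self.position_s = 0.0

    def on_input(self, elapsed_s):
        # Position information arrived: seek there and (re)start playback.
        self.position_s = elapsed_s
        self.paused = False

    def on_input_interrupted(self):
        # Finger lifted: hold the current scene so the user can inspect
        # its thumbnail before continuing to tag.
        self.paused = True

pc = PlaybackController()
pc.on_input(42.0)
pc.on_input_interrupted()   # playback now paused at 42 s
pc.on_input(42.0)           # touching again resumes playback
```

The controller deliberately keeps `position_s` across the interruption, which is what lets tagging resume at the same scene.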
  • the position information acquisition unit 141 may acquire the position information of the operating tool for a portion corresponding to an arbitrary time range in the content, and the tagging unit 144 may add tag information to the content only for the portion corresponding to the time range in which the position information was acquired.
  • input of position information may be started from an arbitrary point on the X axis, and input of position information may be ended at an arbitrary point.
  • the display screen 210 may be divided into a tag information input area and a playback position seek area. In that case, only position information acquired in the tag information input area is used as tag information by the tagging unit 144, while position information acquired in the playback position seek area is not used as tag information but is used by the playback state control unit 142 to seek the playback position of the content.
  • for example, the playback position seek area may be the area in which the indicator 220 is displayed, and the tag information input area may be the region above the area in which the indicator 220 is displayed.
  • in this case, the user seeks the playback position of the content to a desired position by moving the operating tool on the indicator 220, and then inputs tag information by moving the operating tool within the tag information input area.
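The split into a seek area and a tag-input area reduces to a hit test on the Y coordinate of each touch. The sketch below assumes the indicator occupies a horizontal band at the bottom of the screen; the boundary values and the function name are hypothetical.

```python
def classify_input(y, indicator_top=360, screen_height=400):
    """Decide how a touch at vertical position y is used.

    Touches inside the indicator band (y >= indicator_top) seek the
    playback position and are NOT stored as tag information; touches
    above the band are used as tag information.
    """
    if indicator_top <= y <= screen_height:
        return "seek"
    return "tag"

assert classify_input(380) == "seek"   # on the indicator 220
assert classify_input(120) == "tag"    # in the tag information input area
```

Routing each sample by this classification is what keeps seek gestures from contaminating the recorded preference curve.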
  • when the entire width of the display screen is associated with the entire playback time, the resolution of the X axis differs according to the length of the content playback time. For example, between a moving image with a playback time of 10 minutes and one with a playback time of 100 minutes, the playback time associated with the same distance on the X axis differs by a factor of 10. Therefore, when the content playback time is relatively long, the progress (seek amount) of the elapsed time per movement distance of the operating tool in the X-axis direction increases, and it can be difficult to input fine position information for each scene.
  • thus, the playback state control unit 142 may associate the span from one end to the other end in the X-axis direction on the display screen 210 with a portion corresponding to an arbitrary time range in the content, and play back the content at the playback position corresponding to the position information of the operating body in the X-axis direction. That is, the time range of the content that is sought while the operating body moves from the left end to the right end of the display screen 210 may be set arbitrarily. For example, if the content is a moving image with a playback time of 100 minutes and the span from the left end to the right end of the display screen 210 is assigned to 10 minutes, the playback state control unit 142 plays back the content divided into 10 parts.
  • for each part, the position information may be input while moving the operating body from the left end to the right end of the display screen 210.
  • tag information may be overwritten. That is, the tagging unit 144 may tag content based on the latest position information of the operating tool. When tag information is overwritten, the position information need not be reacquired for the entire time range of the content; the position information acquisition unit 141 may reacquire position information only for a portion corresponding to an arbitrary time range in the content, and the tagging unit 144 may overwrite only the tag information of the portion corresponding to that time range.
  • accordingly, for example, position information can first be input at a coarse resolution, with the span from the left end to the right end of the display screen 210 associated with the entire time range (total playback time) of the content, and then, for a portion of interest, the corresponding time range can be matched to the span from the left end to the right end of the display screen 210 and the position information input again at a finer resolution, overwriting only the tag information of that portion. This enables efficient tagging processing.
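The coarse-then-fine workflow, in which only the tags inside a re-tagged time range are replaced, can be sketched as a range overwrite over a list of (time, value) pairs. The names and data layout below are illustrative assumptions.

```python
def overwrite_range(tags, new_tags, start_s, end_s):
    """Replace tags whose timestamps fall in [start_s, end_s] with new_tags.

    `tags` and `new_tags` are lists of (time_s, value) pairs; the newest
    input wins, so old tags inside the range are discarded.
    """
    kept = [t for t in tags if not (start_s <= t[0] <= end_s)]
    return sorted(kept + new_tags)

coarse = [(0, 0.2), (300, 0.5), (600, 0.4)]    # first pass: whole 600 s video
fine = [(290, 0.8), (300, 0.9), (310, 0.7)]    # re-tagged 290-310 s at finer resolution
merged = overwrite_range(coarse, fine, 290, 310)
# merged == [(0, 0.2), (290, 0.8), (300, 0.9), (310, 0.7), (600, 0.4)]
```

Tags outside the re-acquired range survive untouched, which is exactly the partial-overwrite behavior described above.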
  • as a further modification, the content playback state need not be controlled by the position information of the finger 230 in the X-axis direction.
  • instead, the content may be played back at a predetermined speed and displayed on the display screen 210, and the tagging process may be performed by associating the playback position of the content being played back and displayed on the display screen 210 with the input position information of the finger 230 in the Y-axis direction.
  • the user does not need to pay attention to the positional information of the finger 230 in the X-axis direction, and the finger 230 can be viewed in the Y-axis direction while watching content that is played back at a normal speed.
  • the position information can be input by moving to. Therefore, the user can input position information as if he / she attaches a tag to a scene he / she likes while enjoying the contents, and tagging processing more convenient for the user can be performed.
As described above, in the present embodiment, the position information acquisition unit 141 acquires the position information of the operating tool associated with the elapsed time during content playback, and the tagging unit 144 tags the content using the position information as tag information. Thus, tag information can be added to the content simply by the user inputting the position information of the operating tool, for example by moving a finger on the display screen of a touch panel, so that a tagging process with a higher degree of freedom is possible. Further, among the acquired position information of the operating tool, the position information in the first direction is used as tag information by the tagging unit 144, and the playback state of the content is controlled by the playback state control unit 142 in accordance with the position information of the operating tool in a second direction different from the first direction. Therefore, the user can input tag information while controlling the playback state of the content, for example while seeking the playback position of the content.
Further, tag information may be input for only a part of the content, and tag information may be overwritten. In addition, the resolution of the position information of the operating tool in the second direction, which is assigned to seeking the playback position of the content, may be changed. Therefore, the user can seek the playback position of the content to an arbitrary position and then input tag information for only an arbitrary portion, or change the resolution and input tag information multiple times, so that tag information can be input in more detail.
In the above description, the playback state control unit 142 associates the position information of the operating tool in the X-axis direction of the display screen 210 with the elapsed time during content playback, and performs playback control in which the content is played back at the playback position corresponding to the position information of the operating tool in the X-axis direction. However, the playback state control unit 142 may perform other playback control of the content in accordance with the position information of the operating tool in the X-axis direction. For example, the playback state control unit 142 may change the playback speed of the content based on the position information of the operating tool in the X-axis direction, or may fast-forward or rewind the content based on that position information.
FIG. 3 is an explanatory diagram for describing a tagging process in which the playback speed of the content is controlled, which is a modification of the tagging process in the present embodiment. Note that the display screen 210, the indicator 220, the finger 230, the point 240, and the locus 250 in FIG. 3, and in FIG. 5 described later, are the same as those shown in FIG. 2. In this modification, the position information of the operating body in the X-axis direction of the display screen 210 is associated with the content playback speed, and the playback state control unit 142 changes the playback speed of the content based on the position information of the operating body in the X-axis direction. For example, the position information of the operating tool in the X-axis direction and the content playback speed are associated such that the playback speed is the normal speed at the left end of the display screen 210 in the X-axis direction and increases toward the right end, that is, as the X-axis coordinate value increases. The playback state control unit 142 can thus play back the content at the corresponding playback speed based on the position information of the point 240 in the X-axis direction.
Also in this modification, the position information in the Y-axis direction may be an index indicating the user's degree of preference for the content. The user adjusts the playback speed of the content according to the position of the finger 230 in the X-axis direction, moves the finger 230 upward (in the direction in which the Y-axis coordinate value increases) in a favorite scene, and moves the finger 230 downward (in the direction in which the Y-axis coordinate value decreases) in a scene that is less appealing. By performing such a position information input operation on the entire time range of the content or on a portion corresponding to an arbitrary time range, the position information reflecting the user's preference for each scene corresponding to the elapsed time during playback is acquired by the position information acquisition unit 141, and the tagging unit 144 adds the position information to the content as tag information. Note that, while the playback speed of the content is controlled by the playback state control unit 142, the display control unit 143 displays the moving image on the display screen 210 at a speed corresponding to that playback speed.
FIGS. 4A to 4C are explanatory diagrams for explaining the association between the position information of the operating tool in the X-axis direction and the content playback speed in the modification of the tagging process shown in FIG. 3. In FIGS. 4A to 4C, the horizontal axis represents the X-axis coordinate value on the display screen 210, and the vertical axis represents the content playback speed; the curves shown therefore represent the relationship between the two. As shown by a straight line A in FIG. 4A, the relationship between the X-axis coordinate value and the content playback speed may be proportional, with the playback speed increasing at the same rate as the X-axis coordinate value. As shown by a curve B in FIG. 4B, the relationship may be a downwardly convex curve in which the playback speed increases gradually up to a certain value on the X-axis and then increases rapidly. Further, as shown by a curve C in FIG. 4C, the relationship may be an upwardly convex curve in which the playback speed increases rapidly up to a certain value on the X-axis and then increases gradually.
In this way, various correspondence relationships such as those shown in FIGS. 4A to 4C may be used for associating the position information of the operating tool in the X-axis direction with the content playback speed in this modification. Further, the association is not limited to the relationships shown in FIGS. 4A to 4C, and any other correspondence relationship may be used. For example, the user may be able to input an arbitrary relationship between the X-axis coordinate value and the content playback speed by moving the operating tool on the display screen 210.
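The three correspondences of FIGS. 4A to 4C could be realized, for example, by mapping functions of the following shapes (a sketch only; the speed range of 1x to 8x and the exponents are assumptions, not values given in the embodiment):

```python
# Illustrative mappings from the normalized X coordinate (0.0 = left edge,
# 1.0 = right edge) to the playback speed. Speed range is an assumption.

MIN_SPEED, MAX_SPEED = 1.0, 8.0

def speed_linear(x):            # FIG. 4A: proportional (straight line A)
    return MIN_SPEED + (MAX_SPEED - MIN_SPEED) * x

def speed_convex_down(x):       # FIG. 4B: gradual rise, then rapid (curve B)
    return MIN_SPEED + (MAX_SPEED - MIN_SPEED) * x ** 2

def speed_convex_up(x):         # FIG. 4C: rapid rise, then gradual (curve C)
    return MIN_SPEED + (MAX_SPEED - MIN_SPEED) * x ** 0.5

# All three agree at both edges of the screen but differ in between:
# at x = 0.5 the speeds are 4.5x (A), 2.75x (B), and about 5.95x (C).
```

A curve like B gives fine speed control near normal speed, while a curve like C reaches high speeds quickly; which feels better would depend on the user.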
As described above, in this modification, the position information of the operating tool in the X-axis direction is associated with the content playback speed, so the user can input the position information of the operating tool while controlling the playback speed of the content. Therefore, for example, while referring to the thumbnail of the moving image displayed on the display screen 210, the user can increase the playback speed and move the finger 230 downward (that is, input a small degree of preference) in a scene that is less appealing, and can slow down the playback speed and move the finger 230 up and down in a favorite scene to input a fine degree of preference. Thus, a tagging process that is more convenient for the user is realized.
FIG. 5 is an explanatory diagram for explaining a tagging process in which fast-forwarding or rewinding of the content is controlled, which is a modification of the tagging process in the present embodiment. In this modification, the position information of the operating tool in the X-axis direction of the display screen 210 is associated with fast-forward and rewind control of the content, and the content may be fast-forwarded or rewound based on the position information of the operating body. For example, the playback state control unit 142 uses the approximate midpoint of the display screen 210 in the X-axis direction as a reference point, performs control to fast-forward the content when the position information of the operating tool is acquired on the right side of the reference point, and performs control to rewind the content when the position information of the operating tool is acquired on the left side of the reference point. Further, the fast-forward or rewind speed may be controlled according to the position information of the operating body in the X-axis direction. For example, in the example shown in FIG. 5, the fast-forward speed may increase as the X-axis value of the position information of the operating body moves toward the right side of the display screen 210, and the rewind speed may increase toward the left side.
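A minimal sketch of this control, assuming a maximum rate of 16x and normal 1x playback at the reference point (neither value is specified in the embodiment):

```python
# Hypothetical sketch of the FIG. 5 behavior: the midpoint of the screen in
# the X-axis direction is the reference point; right of it the content is
# fast-forwarded, left of it it is rewound, and the speed grows with the
# distance from the reference point. MAX_RATE is an assumed value.

MAX_RATE = 16.0

def playback_rate(x, screen_width):
    """Signed rate: positive = forward playback, negative = rewind."""
    midpoint = screen_width / 2
    offset = (x - midpoint) / midpoint       # -1.0 (left) .. +1.0 (right)
    if offset >= 0:
        return 1.0 + offset * (MAX_RATE - 1.0)   # 1x .. 16x fast-forward
    return offset * MAX_RATE                     # up to -16x rewind

# On an 800 px wide screen:
# playback_rate(400, 800) ->  1.0  (normal speed at the reference point)
# playback_rate(800, 800) -> 16.0  (fastest fast-forward at the right edge)
# playback_rate(0, 800)   -> -16.0 (fastest rewind at the left edge)
```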
Also in this modification, the position information in the Y-axis direction may be an index indicating the user's degree of preference for the content. The user rewinds or fast-forwards the content according to the position of the finger 230 in the X-axis direction, moves the finger 230 upward (in the direction in which the Y-axis coordinate value increases) in a favorite scene, and moves the finger 230 downward (in the direction in which the Y-axis coordinate value decreases) in a scene that is less appealing. By performing such a position information input operation on the entire time range of the content or on a portion corresponding to an arbitrary time range, the position information reflecting the user's preference for each scene corresponding to the elapsed time during playback is acquired by the position information acquisition unit 141, and the tagging unit 144 adds the position information to the content as tag information. Note that, in accordance with the fast-forward or rewind control of the content by the playback state control unit 142, the display control unit 143 displays the corresponding fast-forwarded or rewound moving image on the display screen 210. Therefore, the user can input his or her degree of preference while referring to the image (moving image thumbnail) displayed on the display screen 210.
As described above, in this modification, the position information of the operating tool in the X-axis direction is associated with fast-forward and rewind control of the content, so the user can input position information while fast-forwarding or rewinding the content, and can also control the speed of the fast-forward or rewind. Therefore, for example, while referring to the image displayed on the display screen 210, the user can move the finger 230 downward (that is, input a small degree of preference) while fast-forwarding the content in a scene that is less appealing, and in a favorite scene can return the playback speed to the normal speed and move the finger 230 up and down to input a fine degree of preference. Further, the content can be rewound to a desired playback position and the position information re-input. Thus, a tagging process that is more convenient for the user is realized.
In the two modifications described above, the playback speed of the content is controlled, or the content is fast-forwarded or rewound. Accordingly, the user can input tag information while changing the playback speed of the content to a desired speed, or while moving to a desired playback position by fast-forwarding or rewinding, so that a tagging process that is more convenient for the user is realized. Note that the tagging process according to the present embodiment is not limited to those described above, and other tagging processes in which the position information of the operating tool in the X-axis direction is associated with other playback controls may be performed.
Further, in the present embodiment, the playback state control unit 142 can edit the content based on the tag information given to the content, and can control the playback of the edited content. Hereinafter, a specific example of content editing using tag information according to the present embodiment will be described in detail with reference to FIGS. 6, 7, 8, and 9.
First, the playback state control unit 142 can extract a part of the content based on the tag information. FIG. 6 is an explanatory diagram for explaining a process of creating content with a predetermined playback time. FIG. 7 is an explanatory diagram for explaining a smoothing process performed when a part of the content is extracted.
In FIG. 6, the horizontal axis (x-axis) indicates the elapsed time during playback of the content, and the vertical axis (y-axis) indicates the coordinate value in the Y-axis direction on the display screen 210 of the position information input in the tagging process. Therefore, the curve 310 shown in FIG. 6 can be said to be tag information in which the elapsed time during playback of the content and the position information of the operating tool (position information in the Y-axis direction) are associated with each other; in the following description, the curve 310 is also referred to as tag information 310. The horizontal and vertical axes of FIGS. 8 and 9, described later, have the same meaning as those of FIG. 6, and in the following description the curves 410 and 420 shown in FIGS. 8 and 9 are also referred to as tag information 410 and 420. In the following description of FIGS. 6, 7, 8, and 9, as an example of the position information of the operating tool, the value on the vertical axis is an index representing the user's degree of preference.
FIG. 6 shows a state in which content having a predetermined playback time is created by extracting, from the tag information 310, the ranges in which the value on the vertical axis is equal to or greater than a predetermined threshold. Since the value on the vertical axis represents the user's degree of preference for each scene of the content, such processing makes it possible to create digest version moving image data in which only the portions where the user's degree of preference is high, that is, the portions in which the user is interested, are extracted.
Specifically, the playback state control unit 142 associates the position information of the operating tool in the Y-axis direction in the tag information with a degree of preference (score) assigned to the Y-axis coordinate value, and can extract the portions of the content where the degree of preference is equal to or greater than a predetermined threshold.
In FIG. 6, as examples of thresholds for creating digest version moving image data, a threshold for a 5-minute digest, a threshold for a 10-minute digest, and a threshold for a 20-minute digest are schematically shown on the tag information 310. For the 5-minute digest, the portions corresponding to the time ranges in which the value on the vertical axis is equal to or greater than the 5-minute digest threshold, that is, the time ranges from T11 to T12 and from T17 to the end of the moving image, are extracted from the content, and these pieces of moving image data are joined together to create the 5-minute digest version moving image data. Similarly, for the 10-minute digest, the portions corresponding to the time ranges in which the value on the vertical axis is equal to or greater than the 10-minute digest threshold, that is, the time ranges in which the elapsed time during playback is from T2 to T3, from T6 to T7, from T10 to T13, and from T16 to the end of the moving image, are extracted from the content and joined together to create the 10-minute digest version moving image data. For the 20-minute digest, the portions corresponding to the time ranges in which the value on the vertical axis is equal to or greater than the 20-minute digest threshold, that is, the time ranges in which the elapsed time during playback is from T1 to T4, from T5 to T8, from T9 to T14, and from T15 to the end of the moving image, are extracted from the content and joined together to create the 20-minute digest version moving image data.
In this way, the playback state control unit 142 can adjust the threshold for the degree of preference and extract parts of the content in descending order of preference, so that the total playback time becomes a predetermined playback time.
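The threshold-based extraction described above can be sketched as follows (illustrative only; the tag information is modeled as sampled (elapsed time, preference) pairs, and all data values are invented):

```python
# Minimal sketch of the digest creation in FIG. 6: portions whose preference
# is at or above a threshold are extracted, and the threshold is raised
# until the total extracted playback time fits the target duration.

def extract_ranges(tag_info, threshold):
    """Return (start, end) time ranges where preference >= threshold."""
    ranges, start = [], None
    for t, score in tag_info:
        if score >= threshold and start is None:
            start = t
        elif score < threshold and start is not None:
            ranges.append((start, t))
            start = None
    if start is not None:                       # range runs to the end
        ranges.append((start, tag_info[-1][0]))
    return ranges

def digest_ranges(tag_info, target_seconds):
    """Raise the threshold until the digest fits the target duration."""
    for threshold in sorted({s for _, s in tag_info}):
        ranges = extract_ranges(tag_info, threshold)
        if sum(end - start for start, end in ranges) <= target_seconds:
            return threshold, ranges
    return None, []

# One preference sample per 60 s, on an assumed 0-10 scale.
tag_info = [(t * 60, s) for t, s in enumerate([2, 8, 9, 3, 1, 7, 8, 8, 2, 5])]
```

A real implementation would then cut the moving image data at these ranges and concatenate the pieces.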
Here, in the above processing, portions corresponding to a plurality of non-consecutive time ranges are extracted from the content, and the pieces of moving image data are joined together to create digest version moving image data of a predetermined playback time. Therefore, when the extracted portions are joined together, the moving image may become discontinuous at the joints. In the present embodiment, for example, a smoothing process based on the contents of the moving image can be performed to address this phenomenon.
A curve 320 shown in FIG. 7 is obtained by extracting a part of the tag information 310 shown in FIG. 6. Referring to FIG. 7, when creating a digest version of a moving image as described with reference to FIG. 6, the portion corresponding to the time range equal to or greater than the threshold is originally the extraction range TA. When performing the smoothing process, however, the playback state control unit 142 may extract a portion of an extraction range TB wider than the extraction range TA, and create the digest version of the moving image by joining it with the other extracted portions before and after it.
Here, the extraction range TB may be determined based on the image data and audio data included in the content data. For example, when the pixel information changes greatly in the image data included in the moving image data, the brightness, color, and the like in the screen change greatly, so there is a high possibility that the shooting direction of the camera has changed or that a scene change has occurred within the moving image. Likewise, in the audio data, when the audio input level (volume) changes significantly or the direction of the audio changes, there is a high possibility that a scene change has occurred in the moving image. Accordingly, the playback state control unit 142 may set the boundaries of the extraction range TB at points where the amount of change in the pixel information of the image data, the audio input level or audio direction of the audio data, and the like included in the content data is relatively large.
Further, when the content data is given other tags different from the tag information according to the present embodiment, the extraction range TB may be set based on such other tags. The other tags are, for example, metadata set by the content provider, including information about the event occurrence time, the event type, and the contents of each scene included in the moving image that is the content. The playback state control unit 142 may set the extraction range TB of the content data based on the metadata, using a timing indicating a scene change as a boundary. By setting the extraction range TB in this way, there is a high possibility that a scene change or the like occurs in the moving image immediately before and immediately after the extraction range TB. Therefore, by joining the portions corresponding to these extraction ranges, discontinuity at the joints is reduced in the created digest version of the moving image, and a moving image that is more natural for the user is generated. Note that which information is used to set the extraction range TB may be set as appropriate by the user.
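A sketch of how the boundaries of an extraction range might be widened to detected scene changes (the scene-change times are assumed to come from a separate detector based on pixel or audio changes; all names and values are illustrative):

```python
# Hypothetical sketch of the smoothing in FIG. 7: the boundaries of an
# extraction range T_A are widened outward to the nearest detected scene
# changes, giving the broader range T_B, so the joints of the digest fall
# where the image or audio already changes sharply.

def widen_to_scene_changes(start, end, scene_changes):
    """Widen (start, end) outward to the nearest scene-change times."""
    before = [t for t in scene_changes if t <= start]
    after = [t for t in scene_changes if t >= end]
    t_b_start = max(before) if before else start
    t_b_end = min(after) if after else end
    return t_b_start, t_b_end

# Scene changes detected at these times (seconds) by some detector:
scene_changes = [0.0, 41.5, 95.0, 130.2, 178.8]

# The original extraction range T_A = (100, 120) widens to T_B = (95.0, 130.2).
t_b = widen_to_scene_changes(100.0, 120.0, scene_changes)
```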
FIG. 8 is an explanatory diagram for describing a plurality of pieces of mutually different tag information. FIG. 9 is an explanatory diagram for explaining a process of creating content with a predetermined playback time based on tag information for a plurality of mutually different moving images.
In FIG. 8, the tag information 310 is the same as the tag information 310 shown in FIG. 6, and is created based on the position information of the operating tool input by the user. The tag information 410 is, for example, tag information created based on the position information of the operating tool input by another user for the same moving image.
In the present embodiment, a plurality of pieces of tag information created by a plurality of different users may be shared between users. For example, a user can upload the tag information he or she created for a certain moving image to a server existing on the cloud and make it available to other users. Moreover, the user can browse the tag information created by other users for moving images uploaded to the server. Note that the range of users who can share tag information may be set arbitrarily. For example, it may be the range of users belonging to the same SNS (Social Networking Service), or an arbitrary range of users set within the SNS (for example, the range of users belonging to a so-called "My Friend" list). In this way, by sharing tag information created by different users between users, it becomes possible to easily compare one's own degree of preference for a given moving image with the degrees of preference of others.
Further, tag information by other users may be uploaded to the server on the cloud when the tagging process is completed, or the tag information during the tagging process may be uploaded to the server while being updated in real time. If the tag information during tagging can be browsed by a plurality of users, the tagging process can be performed while referring to the tag information of other users, that is, while referring to social tag information.
Further, the tag information by other users uploaded to the server on the cloud may be stored for a predetermined period, and a plurality of pieces of different tag information for the same moving image may be accumulated in the server as needed. Then, when browsing the tag information by other users uploaded to the server, the user may be able to sort and display the tag information, for example, by registration period (daily, weekly, and so on) or in order of degree of preference.
In the present embodiment, the content editing process by the playback state control unit 142 may be performed using a plurality of pieces of mutually different tag information. For example, based on a plurality of pieces of mutually different tag information, the playback state control unit 142 can perform the process of extracting a part of the content as described above in [3-1. Creating content with a predetermined playback time]. Here, by using tag information created by other users as the plurality of pieces of different tag information, the playback state control unit 142 can create a digest version of a moving image that reflects social preferences. For the creation of such a digest version, social tag information given specific conditions, such as tag information created during a desired period or tag information created by a desired user, may be used from among the tag information created by other users. Further, when using a plurality of pieces of tag information, the playback state control unit 142 may use a simple sum of the degrees of preference in the respective tag information, or may use an average value, a median value, or the like.
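The integration of tag information from a plurality of users could be sketched as follows (illustrative only; it assumes every user's tag information is sampled at the same elapsed times, and all data values are invented):

```python
# Illustrative sketch of building social tag information: per-scene
# preference lists from several users are combined by a simple sum,
# an average, or a median, as the text suggests.

from statistics import median

def combine_tag_info(per_user_scores, method="mean"):
    """Combine per-scene preference lists from several users."""
    combined = []
    for scores in zip(*per_user_scores):   # one tuple per elapsed time
        if method == "sum":
            combined.append(sum(scores))
        elif method == "median":
            combined.append(median(scores))
        else:                              # default: average value
            combined.append(sum(scores) / len(scores))
    return combined

users = [
    [2, 8, 9, 3],   # user A's preference per scene
    [4, 6, 9, 1],   # user B
    [3, 7, 9, 5],   # user C
]
social = combine_tag_info(users)           # mean: [3.0, 7.0, 9.0, 3.0]
```

The combined curve can then be fed into the same threshold-based extraction as a single user's tag information.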
FIG. 9 shows a process of creating content of a predetermined playback time based on tag information for a plurality of mutually different moving images. In FIG. 9, tag information 410 given to "Movie A" and tag information 420 given to "Movie B" are illustrated. The tag information 410 and the tag information 420 are, for example, social tag information in which tag information created by a plurality of other users is integrated. In the example shown in FIG. 9, thresholds are set in the tag information 410 and the tag information 420 so that the total extraction range for "Movie A" and "Movie B" is 5 minutes, and the portions corresponding to the time ranges having a degree of preference equal to or higher than the respective thresholds are extracted from each moving image and joined together, so that a 5-minute digest version moving image may be created.
In the example shown in FIG. 9, the social tag information 410 and 420 is used as tag information; however, the present embodiment is not limited to this example, and the same process can of course be performed based on tag information created by the user himself or herself.
As described above, in the present embodiment, the playback state control unit 142 extracts a part of the content based on the tag information, and creates a digest version of the moving image of a predetermined time. In extracting a part of the content, a threshold may be provided for the score associated with the position information of the operating tool in the Y-axis direction, and the portions corresponding to the time ranges in which the score is equal to or greater than the threshold may be extracted. Since the user's degree of preference is reflected in the position information of the operating tool in the Y-axis direction, it is possible to create a digest version of a moving image in which parts of the content are extracted in descending order of preference. Further, the digest version creation process may be performed based on a plurality of pieces of tag information created by a plurality of other users, so that content can be edited and viewed based on a social degree of preference that reflects the preferences of other users. As described above, in the present embodiment, the content editing process using tag information provides the user with a variety of ways of viewing content, and a more convenient way of enjoying the content is realized.
In the above, the content editing process based on a plurality of pieces of tag information according to the present embodiment has been described; however, the content editing process according to the present embodiment is not limited to such an example. For example, the content editing process may be performed using the tag information according to the present embodiment together with another tag different from it. Here, the other tag may be, for example, metadata set by the content provider, including information about the event occurrence time, the event type, and the contents of each scene included in the moving image that is the content, such as a CM (commercial) section. In the present embodiment, in addition to the tag information according to the present embodiment, it is possible to set a degree of preference for the content using such other tags, and the content editing process may be performed using both the degree of preference set by the tag information according to the present embodiment and the degree of preference set by the different method. For example, by using the metadata described above, it is possible to realize a content editing process more in line with the user's preference, such as cutting the CM portions or extracting only the appearance scenes of a favorite performer.
FIG. 10 is a flowchart showing the processing procedure of the tagging method according to the present embodiment. In the following description of the processing procedure, the case of performing the tagging process shown in FIG. 2 will be described as an example. Note that the functions of the storage unit 130, the position information acquisition unit 141, the playback state control unit 142, and the tagging unit 144 are described in <1. Configuration of Information Processing Device> above, and detailed description thereof is omitted.
Referring to FIG. 10, first, the length (playback time) of the moving image to be displayed at one time on the display screen 210 in the tagging process is set (step S501). This corresponds to the process, described in <1. Configuration of Information Processing Device>, in which the playback state control unit 142 associates the span from one end to the other end of the display screen 210 in the X-axis direction with a portion corresponding to an arbitrary time range in the content. In this way, a part of the content is associated with the span from one end to the other end of the display screen 210 in the X-axis direction, thereby enabling a tagging process with a higher resolution.
Next, the tagging start position is set (step S503). This corresponds to the process, described in <1. Configuration of Information Processing Device>, in which the position information acquisition unit 141 acquires the position information of the operating tool for a portion corresponding to an arbitrary time range in the content. In this way, the user can seek the playback position of the content to a desired position by moving the operating body on the indicator 220 displayed in the playback position seek area of the display screen 210, and input tag information for only an arbitrary portion from that playback position.
Next, the tagging process is performed (step S505). That is, when the operating tool is moved in the tag information input area on the display screen 210, tag information is input and given to the content. Specifically, the position information acquisition unit 141 acquires the position information of the operating tool associated with the elapsed time during content playback, and the tagging unit 144 tags the content by giving the position information to the content as tag information.
When the tagging process in step S505 ends, the content data after the tagging process is stored in the storage unit 130 (step S507). Here, the end of the tagging process may be, for example, the elapse of a predetermined time after the operating tool is no longer detected (in the example shown in FIG. 2, the elapse of a predetermined time after the finger 230 moves away from the display screen 210), or a dedicated operation for ending the tagging process, such as a button press.
Next, a content editing process based on the tag information is performed on the content that has been tagged (step S509). The content editing process in step S509 may be any of the various editing processes described in <3. Specific Example of Content Editing Process Using Tag Information> above.
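The steps above can be condensed into the following rough pseudocode-style sketch (the data shapes and helper names are assumptions for illustration, not the embodiment's actual implementation):

```python
# Hypothetical sketch of the FIG. 10 procedure (steps S501-S507): set the
# time range shown on the screen, start at the sought position, record the
# Y coordinate of the operating body against the elapsed time, and store
# the tagged content. All data shapes here are invented.

def tagging_method(content, display_width_time, start_time, samples):
    # Step S501: set the length of the moving image displayed at one time,
    # i.e. the time range assigned to the full width of the screen.
    time_range = (start_time, start_time + display_width_time)

    # Step S503: the tagging start position has been sought to start_time.
    # Step S505: while the operating body moves, associate each sampled
    # Y coordinate with the elapsed time and record it as tag information.
    tag_info = [(elapsed, y) for elapsed, y in samples
                if time_range[0] <= elapsed <= time_range[1]]
    content["tags"] = tag_info

    # Step S507: store the tagged content (here, simply return it).
    # Step S509 (content editing, e.g. digest creation) would follow.
    return content

content = {"title": "movie", "tags": []}
samples = [(600, 0.2), (630, 0.9), (660, 0.7), (1300, 0.4)]
tagged = tagging_method(content, 600, 600, samples)
# Only the samples inside the assigned range (600-1200 s) are kept.
```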
The processing procedure of the tagging method according to the present embodiment has been described above with reference to FIG. 10. Note that, in the above processing procedure, the content after the tagging process is stored in the storage unit 130; however, the present embodiment is not limited to such an example. For example, the content after the tagging process may be stored in a server or the like on the cloud and shared among specific users.
FIG. 11 is a block diagram for describing the hardware configuration of the information processing apparatus 10 according to the embodiment of the present disclosure. The information processing apparatus 10 mainly includes a CPU 901, a ROM 903, and a RAM 905.
The information processing apparatus 10 further includes a host bus 907, a bridge 909, an external bus 911, an interface 913, an input device 915, an output device 917, a storage device 919, a communication device 921, a drive 923, and a connection port 925.
The CPU 901 functions as an arithmetic processing unit and a control unit, and controls all or a part of the operation in the information processing apparatus 10 according to various programs recorded in the ROM 903, the RAM 905, the storage device 919, or a removable recording medium 929. The ROM 903 stores programs used by the CPU 901, calculation parameters, and the like. The RAM 905 temporarily stores programs used by the CPU 901, parameters that change as appropriate during execution of the programs, and the like. These are connected to each other by the host bus 907, which is constituted by an internal bus such as a CPU bus. In the present embodiment, the CPU 901, the ROM 903, and the RAM 905 correspond to, for example, the control unit 140 illustrated in FIG. 1. The host bus 907 is connected to an external bus 911 such as a PCI (Peripheral Component Interconnect/Interface) bus via a bridge 909.
  • The input device 915 is an operation means operated by the user, such as a mouse, a keyboard, a touch panel, a button, a switch, or a lever. The input device 915 may also be, for example, remote control means (a so-called remote controller) using infrared rays or other radio waves, or an external connection device 931, such as a mobile phone or a PDA, supporting the operation of the information processing apparatus 10. Furthermore, the input device 915 includes, for example, an input control circuit that generates an input signal based on information input by the user using the above-described operation means and outputs the input signal to the CPU 901. By operating the input device 915, the user of the information processing apparatus 10 can input various data and instruct processing operations to the information processing apparatus 10. In the present embodiment, the input device 915 corresponds to, for example, the input unit 110 illustrated in FIG.
  • the output device 917 is a device that can notify the user of the acquired information visually or audibly. Examples of such devices include CRT display devices, liquid crystal display devices, plasma display devices, EL display devices, display devices such as lamps, audio output devices such as speakers and headphones, printer devices, and the like.
  • the output device 917 outputs results obtained by various processes performed by the information processing apparatus 10. Specifically, the display device visually displays results obtained by various processes performed by the information processing device 10 in various formats such as text, images, tables, and graphs. In the present embodiment, the display device corresponds to, for example, the display unit 120 illustrated in FIG.
  • the audio output device converts an audio signal composed of reproduced audio data, acoustic data, and the like into an analog signal and outputs it aurally.
  • the storage device 919 is a data storage device configured as an example of a storage unit of the information processing device 10.
  • the storage device 919 corresponds to, for example, the storage unit 130 illustrated in FIG.
  • the storage device 919 includes, for example, a magnetic storage device such as an HDD (Hard Disk Drive), a semiconductor storage device, an optical storage device, or a magneto-optical storage device.
  • the storage device 919 stores various information processed in the tagging process according to the present embodiment, such as a program executed by the CPU 901 and various data.
  • The storage device 919 also stores data such as various contents to be played back by the information processing apparatus 10, tag information obtained in the course of the tagging process according to the present embodiment, and content to which the tag information has been attached (that is, content after tagging).
  • the information processing apparatus 10 may further include the following components.
  • the communication device 921 is a communication interface configured by a communication device for connecting to a communication network (network) 927, for example.
  • the communication device 921 is, for example, a communication card for wired or wireless LAN (Local Area Network), Bluetooth (registered trademark), or WUSB (Wireless USB).
  • the communication device 921 may be a router for optical communication, a router for ADSL (Asymmetric Digital Subscriber Line), a modem for various communication, or the like.
  • the communication device 921 can transmit and receive signals and the like according to a predetermined protocol such as TCP / IP, for example, with the Internet or other communication devices.
  • the network 927 connected to the communication device 921 is configured by a wired or wireless network, and may be, for example, the Internet, a home LAN, infrared communication, radio wave communication, satellite communication, or the like.
  • Various types of content played back by the information processing apparatus 10, tag information obtained in the tagging process according to the present embodiment, and data such as content after tagging may be received by the communication device 921 via the network 927, or transmitted from the information processing apparatus 10 to another external device (for example, a server on the cloud).
  • the drive 923 is a recording medium reader / writer, and is built in or externally attached to the information processing apparatus 10.
  • the drive 923 reads information recorded on a removable recording medium 929 such as a mounted magnetic disk, optical disk, magneto-optical disk, or semiconductor memory, and outputs the information to the RAM 905.
  • the drive 923 can also write information to a removable recording medium 929 such as a mounted magnetic disk, optical disk, magneto-optical disk, or semiconductor memory.
  • the removable recording medium 929 is, for example, a DVD medium, an HD-DVD medium, a Blu-ray (registered trademark) medium, or the like.
  • The removable recording medium 929 may also be a CompactFlash (registered trademark) (CF) card, a flash memory, an SD memory card (Secure Digital memory card), or the like. Further, the removable recording medium 929 may be, for example, an IC card (Integrated Circuit card) on which a non-contact IC chip is mounted, an electronic device, or the like. In the present embodiment, various contents reproduced by the information processing apparatus 10, tag information obtained in the course of the tagging process according to the present embodiment, and data such as content after tagging may be read by the drive 923 from the removable recording medium 929 or written to the removable recording medium 929.
  • the connection port 925 is a port for directly connecting a device to the information processing apparatus 10.
  • Examples of the connection port 925 include a USB (Universal Serial Bus) port, an IEEE 1394 port, and a SCSI (Small Computer System Interface) port.
  • As another example of the connection port 925 there are an RS-232C port, an optical audio terminal, an HDMI (registered trademark) (High-Definition Multimedia Interface) port, and the like.
  • Various contents reproduced by the information processing apparatus 10, tag information obtained in the course of the tagging process according to the present embodiment, and data such as content after tagging may be acquired from the external connection device 931 connected to the connection port 925, or may be output to the external connection device 931.
  • each component described above may be configured using a general-purpose member, or may be configured by hardware specialized for the function of each component. Therefore, it is possible to change the hardware configuration to be used as appropriate according to the technical level at the time of carrying out this embodiment.
  • a computer program for realizing each function of the information processing apparatus 10 according to the present embodiment as described above can be produced and installed in a personal computer or the like.
  • a computer-readable recording medium storing such a computer program can be provided.
  • the recording medium is, for example, a magnetic disk, an optical disk, a magneto-optical disk, a flash memory, or the like.
  • the above computer program may be distributed via a network, for example, without using a recording medium.
  • As described above, in the present embodiment, the position information acquisition unit 141 acquires the position information of the operating tool associated with the elapsed time during content playback, and the tagging unit 144 performs the content tagging process by assigning the position information to the content as tag information. Therefore, tag information can be added to the content simply by the user inputting the position information of the operating tool, for example, by moving a finger on the display screen of a touch panel, so that a tagging process with a higher degree of freedom is possible.
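The mechanism above can be pictured, very roughly, as pairing each sampled position of the operating tool with the elapsed playback time and keeping those pairs as tag information. The following Python sketch is purely illustrative; all class and field names are assumptions, not part of the disclosed apparatus.

```python
# Illustrative sketch only: tag information as (elapsed time, position)
# samples of the operating tool. Names are hypothetical.
from dataclasses import dataclass, field
from typing import List, Tuple


@dataclass
class TaggedContent:
    duration_ms: int
    # Each entry: (elapsed_ms, x, y) -- one sampled position of the
    # operating tool (e.g. a finger on a touch panel).
    tags: List[Tuple[int, float, float]] = field(default_factory=list)

    def add_tag(self, elapsed_ms: int, x: float, y: float) -> None:
        """Store one position sample, keyed to the playback time."""
        if not 0 <= elapsed_ms <= self.duration_ms:
            raise ValueError("elapsed time outside content duration")
        self.tags.append((elapsed_ms, x, y))


content = TaggedContent(duration_ms=60_000)
content.add_tag(1_000, 0.2, 0.8)    # finger high on the screen early on
content.add_tag(30_000, 0.5, 0.1)   # finger low on the screen mid-way
```

In this picture, the list of samples travels with the content and can later drive editing or playback control.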
  • Of the acquired position information of the operating body, the position information in a first direction is used as tag information by the tagging unit 144, and the playback state of the content is controlled by the playback state control unit 142 in accordance with the position information of the operating body in a second direction different from the first direction. Therefore, the user can input the tag information while controlling the playback state of the content, for example, while seeking the playback position of the content.
  • Tag information may be input for only a part of the content, and may be overwritten.
  • In addition, the resolution of the position information of the operating body in the second direction, which is assigned to seeking the playback position of the content, may be changed. Therefore, the user can seek the playback position of the content to an arbitrary position and then input tag information for only an arbitrary part, or change the resolution and input tag information in multiple passes.
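The resolution change described here can be pictured as remapping the same horizontal screen span from the whole content to a narrower time window. The sketch below is illustrative only, with assumed names and a simple linear mapping.

```python
def x_to_time(x: float, window_start_ms: int, window_end_ms: int) -> int:
    """Map a normalized horizontal position x in [0, 1] to a playback
    position inside the current time window (linear interpolation)."""
    return window_start_ms + int(x * (window_end_ms - window_start_ms))


# Pass 1: the whole 60 s content is mapped across the screen width,
# so a touch at the middle seeks coarsely to 30 s.
coarse = x_to_time(0.5, 0, 60_000)

# Pass 2: the same screen width is remapped to a +/-5 s window around
# the coarse position, raising the seek resolution for fine tagging.
fine = x_to_time(0.25, coarse - 5_000, coarse + 5_000)
```

The same finger travel thus covers 60 s in the first pass but only 10 s in the second, which is one way the embodiment's finer seeking could work.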
  • The position information of the operating tool in the second direction of the display screen 210 may also correspond to other reproduction controls.
  • For example, the playback state control unit 142 may control the playback speed of the content and the fast-forward or rewind operation of the content based on the position information of the operating tool in the second direction. Accordingly, the user can input tag information while changing the playback speed of the content to a desired speed and moving to a desired playback position by fast-forwarding or rewinding, so that a tagging process with higher convenience for the user is realized.
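One conceivable mapping from the operating tool's horizontal position to a signed playback speed (fast-forward on one side of a reference point, rewind on the other) is sketched below. The reference point, maximum rate, and linear law are all assumptions; the embodiment does not prescribe a formula.

```python
def playback_rate(x: float, reference: float = 0.5,
                  max_rate: float = 8.0) -> float:
    """Map the operating tool's horizontal position x in [0, 1] to a
    signed playback rate: right of the reference point fast-forwards
    (positive), left of it rewinds (negative), faster with distance."""
    offset = (x - reference) / (1.0 - reference)   # -1 .. 1 when reference is 0.5
    return max(-max_rate, min(max_rate, offset * max_rate))
```

A touch at the reference point plays at rate 0 (paused scrubbing), the right edge fast-forwards at 8x, and the left edge rewinds at 8x.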
  • The playback state control unit 142 can also edit the content based on the tag information given to the content and control the playback of the edited content. For example, the playback state control unit 142 extracts parts of the content based on the tag information and creates a digest version of the moving image of a predetermined length. In the process of extracting parts of the content, a threshold may be provided for the score associated with the position information of the operating tool in the first direction, and the portions corresponding to the time ranges in which the score is equal to or greater than the threshold may be extracted.
  • Since the user's preference level is reflected in the position information of the operating tool in the first direction, it is possible to create a digest version of a moving image in which parts of the content are extracted in descending order of preference level.
  • The digest moving image creation process may also be performed based on a plurality of pieces of tag information created by a plurality of other users. Therefore, it is possible to edit and view content based on a social preference level that reflects the preferences of other users.
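The threshold-based extraction described above could be sketched as follows. The per-frame score array, the one-second frame granularity, and the function names are assumptions for illustration; with multiple users, `scores` could hold their averaged per-frame values.

```python
from typing import List, Sequence, Tuple


def extract_digest(scores: Sequence[float], threshold: float,
                   frame_ms: int = 1_000) -> List[Tuple[int, int]]:
    """Return (start_ms, end_ms) ranges whose per-frame score is at or
    above `threshold`. `scores[i]` is the preference score of frame i
    (e.g. derived from the vertical finger position)."""
    ranges: List[Tuple[int, int]] = []
    start = None
    for i, s in enumerate(scores):
        if s >= threshold and start is None:
            start = i                      # a qualifying run begins
        elif s < threshold and start is not None:
            ranges.append((start * frame_ms, i * frame_ms))
            start = None                   # the run ended
    if start is not None:                  # run reaches the end of content
        ranges.append((start * frame_ms, len(scores) * frame_ms))
    return ranges


# Per-second preference scores for a 7-second clip:
scores = [0.1, 0.8, 0.9, 0.2, 0.7, 0.7, 0.1]
digest = extract_digest(scores, threshold=0.6)
```

Here the digest keeps two high-preference ranges, 1–3 s and 4–6 s, which would then be concatenated into the digest moving image.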
  • the content editing process using the tag information is performed, so that a variety of content viewing methods are provided to the user, and a user-friendly way of enjoying the content is realized.
  • In the above description, the position information is two-dimensional position information on the display screen 210 of the touch panel, but the present embodiment is not limited to such an example.
  • the tag information may be any position information of the operating tool associated with the elapsed time during content reproduction, and the type thereof is not limited.
  • the input unit 110 may include a sensor device that detects the position of the operating tool in a space such as a stereo camera or an infrared camera, and position information in a three-dimensional space may be input as the position information of the operating tool.
  • When the position information of the operating body is position information in a three-dimensional space, the operating body may be, for example, the user's hand; the playback state of the content may be controlled according to the position of the hand in the left-right direction with respect to the sensor device of the input unit 110, and tag information may be input according to the vertical position of the hand.
  • the user can perform the tagging process by moving his / her hand up / down / left / right with respect to the sensor device of the input unit 110 while referring to the content displayed on the display screen of the display unit 120.
  • Further, the position information acquisition unit 141 may detect the locus of the position information of the operating tool as characters, numbers, symbols, and the like, and acquire the detection result as tag information.
  • For example, the position information acquisition unit 141 may detect the locus of the position information of the operating tool as a symbol having a binary meaning, such as "○" or "×". While referring to the content displayed on the display screen of the display unit 120, the user can move the operating body so that the locus of its position information draws a "○" shape in a favorite scene, thereby inputting tag information on which the presence or absence of a binary preference is superimposed.
  • The tagging unit 144 may then add such tag information on which the presence or absence of the binary preference is superimposed.
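A crude way to classify such a binary-preference gesture (one shape drawn over a favorite scene, a different shape over an unfavorable one) is to test whether the stroke closes on itself. This heuristic is purely illustrative and is not specified in the embodiment.

```python
import math
from typing import List, Tuple


def classify_stroke(points: List[Tuple[float, float]]) -> str:
    """Crude binary-gesture heuristic: a stroke that ends close to its
    start (relative to its total length) is treated as a circle-like,
    positive gesture; an open stroke is treated as negative."""
    length = sum(math.dist(points[i], points[i + 1])
                 for i in range(len(points) - 1))
    closure = math.dist(points[0], points[-1])
    return "positive" if length > 0 and closure / length < 0.2 else "negative"


loop = [(0, 0), (1, 0), (1, 1), (0, 1), (0, 0.05)]   # nearly closed loop
slash = [(0, 0), (1, 1)]                             # open diagonal stroke
```

A production recognizer would of course use a proper gesture-recognition method; the point is only that the locus itself can carry the binary meaning.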
  • the embodiment in which the content data to which the tag information is attached is the moving image data has been described, but the present embodiment is not limited to such an example.
  • the content data to be subjected to the tagging process may be music data or slide show data in which still images are continuously displayed for a predetermined time.
  • For example, the information processing apparatus 10 may include an audio output device such as a speaker or headphones, and the user may input tag information while listening to the audio of the music data output from the audio output device.
  • In this way, tagging processing similar to that for moving image data may be performed.
  • (1) An information processing apparatus including: a position information acquisition unit that acquires position information of an operating tool associated with an elapsed time during content reproduction; and a tagging unit that tags the content by adding the position information to the content as tag information.
  • (3) The information processing apparatus according to (2), further comprising a display control unit configured to display the content on a display screen, wherein the position information acquisition unit acquires the position information of the operating body on the display screen, the first direction is a vertical direction with respect to the display screen, and the second direction is a horizontal direction with respect to the display screen.
  • (4) The information processing apparatus according to (2) or (3), wherein the reproduction state control unit associates the position information of the operating body in the second direction with the elapsed time during content reproduction, and the content is reproduced at a reproduction position corresponding to the position information of the operating body in the second direction.
  • (5) The information processing apparatus, wherein the reproduction state control unit associates the range from one end to the other end in the second direction on the display screen with an arbitrary time range in the content, the content is reproduced at a reproduction position corresponding to the position information of the operating tool in the second direction on the display screen, and the position information acquisition unit acquires the position information of the operating tool on the display screen.
  • (6) The information processing apparatus according to (2) or (3), wherein the reproduction state control unit changes a reproduction speed of the content based on the position information of the operating body in the second direction.
  • (7) The information processing apparatus, wherein the reproduction state control unit associates the range from one end to the other end in the second direction on the display screen with the reproduction speed of the content, and the content is reproduced at a reproduction speed corresponding to the position information of the operating tool in the second direction on the display screen.
  • (8) The information processing apparatus, wherein the reproduction state control unit performs control to fast-forward the content when the position information of the operating tool is acquired on one side of a reference point, and performs control to rewind the content when the position information of the operating tool is acquired on the other side of the reference point.
  • The information processing apparatus according to any one of (1) to (9), wherein the position information acquisition unit acquires the position information of the operating tool for a portion corresponding to an arbitrary time range in the content, and the tagging unit assigns the tag information to the content for the portion corresponding to the time range in which the position information was acquired.
  • (12) The information processing apparatus according to any one of (1) to (11), further comprising a display control unit that displays the content on a display screen, wherein the display control unit displays a locus of the position information of the operating body on the display screen.
  • (13) The information processing apparatus according to any one of the above, further comprising a playback state control unit for controlling the playback state of the content, wherein the playback state control unit extracts a part of the content based on the tag information.
  • (14) The information processing apparatus according to (13), wherein the tagging unit uses the position information of the operating body in a first direction as the tag information, and the reproduction state control unit associates the position information of the operating body in the first direction with a score for the tag information, and extracts a portion of the content in which the score is equal to or greater than a predetermined threshold.
  • (15) The information processing apparatus, wherein the reproduction state control unit determines the threshold based on a reproduction time of the extracted content.
  • The information processing apparatus, wherein the position information acquisition unit acquires three-dimensional position information of the operating body in space.
  • (19) A tagging method including: acquiring position information of an operating tool associated with an elapsed time during content reproduction; and tagging the content by adding the position information to the content as tag information. (20) A program for causing a computer to realize a function of acquiring position information of an operating tool associated with an elapsed time during content reproduction, and a function of tagging the content by adding the position information to the content as tag information.
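Item (15) above leaves open how the threshold is determined from the reproduction time of the extracted content. One hypothetical approach is to pick the threshold from the score distribution so the digest fits a target length:

```python
from typing import Sequence


def threshold_for_duration(scores: Sequence[float],
                           target_frames: int) -> float:
    """Pick the score threshold so that `target_frames` frames qualify
    (assuming distinct scores), making the extracted digest fit the
    target reproduction time. A hypothetical selection method."""
    if target_frames >= len(scores):
        return min(scores)
    # The k-th highest score keeps exactly k frames when scores differ.
    return sorted(scores, reverse=True)[target_frames - 1]


# Per-second preference scores; we want a 3-second digest:
scores = [0.1, 0.8, 0.9, 0.2, 0.75, 0.6, 0.15]
t = threshold_for_duration(scores, target_frames=3)
kept = sum(1 for s in scores if s >= t)
```

With the sample scores the chosen threshold is 0.75, which keeps exactly the three highest-scoring seconds; tie-breaking would need extra rules in practice.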

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Databases & Information Systems (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The problem addressed by the present invention is to enable more flexible tagging of content. The solution according to the invention is an information processing apparatus comprising: a position information acquisition unit that acquires position information of an operating member associated with the elapsed time of content being reproduced; and a tagging unit that adds the position information to the content as tag information, thereby tagging the content.
PCT/JP2014/050829 2013-04-04 2014-01-17 Appareil de traitement d'informations, procédé d'étiquetage et programme Ceased WO2014162757A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2013-078461 2013-04-04
JP2013078461 2013-04-04

Publications (1)

Publication Number Publication Date
WO2014162757A1 true WO2014162757A1 (fr) 2014-10-09

Family

ID=51658064

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2014/050829 Ceased WO2014162757A1 (fr) 2013-04-04 2014-01-17 Appareil de traitement d'informations, procédé d'étiquetage et programme

Country Status (1)

Country Link
WO (1) WO2014162757A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006279320A (ja) * 2005-03-28 2006-10-12 Canon Inc 番組蓄積再生装置、番組蓄積再生方法、その記録媒体及びプログラム
JP2007142571A (ja) * 2005-11-15 2007-06-07 Toshiba Corp コンテンツ再生装置とその再生速度制御方法
JP2010288015A (ja) * 2009-06-10 2010-12-24 Sony Corp 情報処理装置、情報処理方法及び情報処理プログラム
JP2012088688A (ja) * 2010-09-22 2012-05-10 Nikon Corp 画像表示装置
JP2012155695A (ja) * 2011-01-07 2012-08-16 Kddi Corp 動画コンテンツにおける注目シーンにキーワードタグを付与するプログラム、端末、サーバ及び方法

Similar Documents

Publication Publication Date Title
US20200286185A1 (en) Parallel echo version of media content for comment creation and delivery
US7739584B2 (en) Electronic messaging synchronized to media presentation
JP6044079B2 (ja) 情報処理装置、情報処理方法及びプログラム
JP5857450B2 (ja) 情報処理装置、情報処理方法、及びプログラム
US10622021B2 (en) Method and system for video editing
US9083933B2 (en) Information processing apparatus, moving picture abstract method, and computer readable medium
US10325628B2 (en) Audio-visual project generator
US20180132006A1 (en) Highlight-based movie navigation, editing and sharing
US9558784B1 (en) Intelligent video navigation techniques
TW201132122A (en) System and method in a television for providing user-selection of objects in a television program
JP2012248070A (ja) 情報処理装置、メタデータ設定方法、及びプログラム
US9564177B1 (en) Intelligent video navigation techniques
JP5870742B2 (ja) 情報処理装置、システムおよび情報処理方法
JP2012186621A (ja) 情報処理装置、情報処理方法およびプログラム
WO2014069114A1 (fr) Dispositif de traitement d'informations, procédé de commande d'état de reproduction et programme
KR20160098949A (ko) 동영상 생성 방법, 장치 및 컴퓨터 프로그램
CN107925741A (zh) 图像处理方法、图像处理设备和程序
JP2013251667A (ja) 情報処理装置、および情報処理方法、並びにプログラム
US20200104030A1 (en) User interface elements for content selection in 360 video narrative presentations
JP2013171599A (ja) 表示制御装置および表示制御方法
CN105051820B (zh) 信息处理设备和信息处理方法
JP2016513390A (ja) 2d−3d複合次元コンテンツファイルを用いた複合次元コンテンツサービス提供システム、そのサービス提供方法及びその複合次元コンテンツファイル
JP2012175281A (ja) 録画装置及びテレビ受像装置
JP7365076B2 (ja) 映像配信装置、映像配信システム、映像配信方法、及びプログラム
KR20150048961A (ko) 핫 씬 서비스 시스템, 핫 씬 서비스 방법 및 이를 위한 장치

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14778250

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 14778250

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP