WO2015115253A1 - Receiving device, receiving method, transmitting device, and transmitting method - Google Patents
- Publication number
- WO2015115253A1 (PCT/JP2015/051443)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- component
- service
- audio
- control signal
- video
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/20—Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
- H04N21/23—Processing of content or additional data; Elementary server operations; Server middleware
- H04N21/238—Interfacing the downstream path of the transmission network, e.g. adapting the transmission rate of a video stream to network bandwidth; Processing of multiplex streams
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/438—Interfacing the downstream path of the transmission network originating from a server, e.g. retrieving encoded video stream packets from an IP network
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/80—Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
- H04N21/83—Generation or processing of protective or descriptive data associated with content; Content structuring
- H04N21/845—Structuring of content, e.g. decomposing content into time segments
- H04N21/8456—Structuring of content, e.g. decomposing content into time segments by decomposing the content in the time domain, e.g. in time segments
Definitions
- The present technology relates to a receiving device, a receiving method, a transmitting device, and a transmitting method, and more particularly to a receiving device, a receiving method, a transmitting device, and a transmitting method that can flexibly cope with various operation modes.
- MPEG2-TS: Moving Picture Experts Group Phase 2 - Transport Stream
- IP: Internet Protocol
- By introducing the IP transmission method, it is expected that content in various formats can be transmitted in various distribution forms; however, a technical system that can respond to such various operation forms has not yet been established.
- This technology has been made in view of such a situation, and is intended to be able to flexibly cope with various operation forms in digital broadcasting using an IP transmission method.
- A receiving device according to a first aspect of the present technology includes a receiving unit that receives a broadcast wave of digital broadcasting using an IP (Internet Protocol) transmission method, and a control unit that, based on information indicating the distribution form of the components constituting various services included in a control signal transmitted by the broadcast wave, acquires the components constituting a selected service according to the distribution form and controls the operation of each unit that performs predetermined processing on the acquired components.
- the distribution form of the component can be broadcast distribution or communication distribution.
- The control signal is transmitted for each service, and the broadcast distribution can include a first broadcast distribution in which the component is distributed in the same service as the control signal of the selected service, and a second broadcast distribution in which the component is distributed in a service different from that of the control signal of the selected service.
- the control signal includes a plurality of signaling information, and the component can be identified by a common ID in each signaling information.
- The component is in a file format and can be transmitted by a FLUTE (File Delivery over Unidirectional Transport) session.
- each segment in the FLUTE session can store data of a component of a specific category.
- Each segment in the FLUTE session stores data of a plurality of categories of components, and the control signal may include identification information for identifying the categories of the components.
- the control signal includes, as signaling information, a table in which parameters relating to at least one of various services and components constituting the service are described.
- The component ID and the category of the component can be described as parameters relating to the component.
- The control signal is transmitted in a layer higher than the IP layer among the protocol layers of the IP transmission method, and a common IP address can be assigned to the components and the control signal constituting each service.
- the receiving device may be an independent device or an internal block constituting one device.
- the reception method according to the first aspect of the present technology is a reception method corresponding to the reception device according to the first aspect of the present technology.
- In the first aspect of the present technology, a broadcast wave of digital broadcasting using the IP transmission method is received, and based on information, included in a control signal transmitted by the broadcast wave, indicating the distribution form of the components constituting various services, the components constituting a selected service are acquired according to the distribution form, and the operation of each unit that performs predetermined processing on the acquired components is controlled.
- A transmission device according to a second aspect of the present technology includes a first acquisition unit that acquires one or more components constituting various services, a second acquisition unit that acquires a control signal including information indicating the distribution form of the components, and a transmission unit that transmits a broadcast wave of the IP transmission method including the components constituting the service and the control signal.
- the distribution form of the component can be broadcast distribution or communication distribution.
- The control signal is transmitted for each service, and the broadcast distribution can include a first broadcast distribution in which the component is distributed in the same service as the control signal of the selected service, and a second broadcast distribution in which the component is distributed in a service different from that of the control signal of the selected service.
- the control signal includes a plurality of signaling information, and the component can be identified by a common ID in each signaling information.
- the component is in a file format and can be transmitted by a FLUTE session.
- each segment in the FLUTE session can store data of a component of a specific category.
- Each segment in the FLUTE session stores data of a plurality of categories of components, and the control signal may include identification information for identifying the categories of the components.
- the control signal includes, as signaling information, a table in which parameters relating to at least one of various services and components constituting the service are described.
- The component ID and the category of the component can be described as parameters relating to the component.
- The control signal is transmitted in a layer higher than the IP layer among the protocol layers of the IP transmission method, and a common IP address can be assigned to the components and the control signal constituting each service.
- the transmission device may be an independent device or an internal block constituting one device.
- the transmission method according to the second aspect of the present technology is a transmission method corresponding to the transmission device according to the second aspect of the present technology.
- In the second aspect of the present technology, one or a plurality of components constituting various services are acquired, a control signal including information indicating the distribution form of the components is acquired, and a broadcast wave of the IP transmission method including the components constituting the service and the control signal is transmitted.
- According to the first and second aspects of the present technology, it is possible to flexibly cope with various operation forms in digital broadcasting using the IP transmission method.
- FIG. 1 is a diagram showing a protocol stack for digital broadcasting of an IP transmission method.
- the lowest layer is a physical layer, and the frequency band of the broadcast wave allocated for the service (channel) corresponds to this.
- the upper layer adjacent to the physical layer is an IP layer with a BBP stream (Base Band Packet Stream) in between.
- the BBP stream is a stream including packets storing various data in the IP transmission method.
- the IP layer is equivalent to IP (Internet Protocol) in the TCP / IP protocol stack, and an IP packet is specified by an IP address.
- the upper layer adjacent to the IP layer is a UDP layer, and further higher layers are RTP and FLUTE / ALS.
- UDP: User Datagram Protocol
- RTP: Real-time Transport Protocol
- FLUTE: File Delivery over Unidirectional Transport
- The upper layer adjacent to FLUTE/ALS is fMP4 (Fragmented MP4), and the layers above RTP and fMP4 carry video data (Video), audio data (Audio), subtitle data (Closed Caption), and the like. That is, an RTP session is used when video data and audio data are transmitted in a stream format, and a FLUTE session is used when they are transmitted in a file format.
- In the FLUTE session, files of NRT content, the ESG (Electronic Service Guide), and SCS are also transmitted.
- NRT content is content transmitted by NRT (Non-RealTime) broadcasting, and is reproduced after being temporarily stored in the storage of the receiver.
- NRT content is an example of content, and a file of another content may be transmitted by the FLUTE session.
- SCS: Service Channel Signaling
- MPD: Media Presentation Description
- SPT: Service Parameter Table
- SDP: Session Description Protocol
- USD: User Service Description
- LLS: Low Layer Signaling
- As LLS, service configuration information such as the SCT (Service Configuration Table), SAT (Service Association Table), EAT (Emergency Alerting Table), and RRT (Region Rating Table) is transmitted.
- FIG. 2 is a diagram showing a relationship between a broadcast wave signal of digital broadcasting using the IP transmission method and an ID system of the IP transmission method.
- A network ID (hereinafter also referred to as "network_id" or "networkId") is assigned to each broadcast wave (broadcast network (Network)) having a predetermined frequency band (6 MHz).
- Each broadcast wave includes one or more BBP streams identified by a BBP stream ID (hereinafter also referred to as “BBP_stream_id” or “BBPStreamId”).
- a BBP stream is composed of a plurality of BBP packets including a BBP header and a payload.
- Each BBP stream includes one or more services (Services) identified by service IDs (hereinafter also referred to as “service_id” and “ServiceId”).
- Each service (Service) is composed of one or a plurality of components (Components).
- Each component is information constituting a program such as video data, audio data, and caption data.
- The ID system of the IP transmission method employs the combination of a network ID (network_id), a transport stream ID (transport_stream_id), and a service ID (service_id) used in the MPEG2-TS system (hereinafter referred to as a "triplet").
- the triplet represents the BBP stream configuration and the service configuration in the broadcast network.
- However, in the IP transmission method, a BBP stream ID is used instead of the transport stream ID.
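As a rough illustration (not part of the description itself), the triplet-style ID system above could be modeled as follows; the class and field names are hypothetical:

```python
# Hypothetical sketch of the ID system described above: the MPEG2-TS
# triplet, with the transport stream ID replaced by a BBP stream ID.
from dataclasses import dataclass

@dataclass(frozen=True)
class Triplet:
    network_id: int     # identifies the broadcast network (RF channel)
    bbp_stream_id: int  # replaces the MPEG2-TS transport_stream_id
    service_id: int     # identifies the service within the BBP stream

    def __str__(self) -> str:
        return f"{self.network_id:04x}.{self.bbp_stream_id:04x}.{self.service_id:04x}"

# A receiver could key its service table by this triplet.
service_table = {Triplet(0x0001, 0x0001, 0x0101): "Service A"}
print(service_table[Triplet(0x0001, 0x0001, 0x0101)])  # Service A
```

Because the triplet uniquely locates a service, it can serve directly as a lookup key during channel selection.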
- FIG. 3 is a diagram showing a configuration of a broadcast wave of IP transmission type digital broadcasting.
- a plurality of BBP streams are transmitted on a broadcast wave (“Network” in the figure) having a predetermined frequency band (6 MHz).
- Each BBP stream includes NTP (Network Time Protocol), a plurality of service channels (Service Channel), an electronic service guide (ESG Service), and LLS.
- NTP, service channel, and electronic service guide are transmitted according to the UDP / IP protocol, but LLS is transmitted on the BBP stream.
- NTP is time information and is common to a plurality of service channels.
- Each service channel includes components such as video data and audio data, and SCS such as SPT and SDP.
- a common IP address is assigned to each service channel, and components, control signals, and the like can be packaged for each service channel or a plurality of service channels using this IP address.
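Because one IP address is assigned per service channel, a receiver can in principle collect every component and control signal of a service by filtering on the destination IP address alone. A minimal sketch, with purely illustrative multicast addresses:

```python
# Sketch of per-service-channel packaging by IP address, as described
# above. The addresses and payload labels are made up for illustration.
channel_by_ip = {
    "239.0.0.1": "Service Channel 1",
    "239.0.0.2": "Service Channel 2",
}

# (destination IP, payload kind) pairs standing in for received packets.
packets = [("239.0.0.1", "video"), ("239.0.0.2", "audio"), ("239.0.0.1", "SCS")]

# Selecting Service Channel 1 reduces to a destination-IP filter.
selected = [payload for dst_ip, payload in packets
            if channel_by_ip.get(dst_ip) == "Service Channel 1"]
print(selected)  # ['video', 'SCS']
```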
- In FIG. 3, the network (Network), the BBP stream (BBP Stream), and the components correspond to those in FIG. 2, and the service channel (Service Channel) corresponds to the service in FIG. 2.
- FIG. 4 is a diagram illustrating the configuration of the LLS in the IP transmission scheme.
- a BBP packet is composed of a BBP header and a payload.
- the payload portion is an IP packet.
- When LLS is transmitted using a BBP stream, the LLS is placed immediately after the BBP header.
- As the LLS, for example, an SCT or SAT described in XML (Extensible Markup Language) format is arranged; an SGDU header is added, with an XML fragment of the data serving as the LLS body.
- That is, the SCT and SAT are transmitted in an SGDU (Service Guide Delivery Unit) container.
- The BBP header contains 2-bit type information, by which it can be distinguished whether the BBP packet carries an IP packet or LLS.
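The 2-bit type dispatch could look like the sketch below. Note that the description does not give the bit position or the code assignments, so both are assumptions here:

```python
# Minimal sketch of BBP payload dispatch by the 2-bit type field.
# Assumption: the type field occupies the top two bits of the first
# header byte, with 0b00 = IP packet and 0b01 = LLS (both hypothetical).
def bbp_payload_kind(first_header_byte: int) -> str:
    ttype = (first_header_byte >> 6) & 0b11  # hypothetical bit position
    return {0b00: "ip_packet", 0b01: "lls"}.get(ttype, "reserved")

print(bbp_payload_kind(0x00))  # ip_packet
print(bbp_payload_kind(0x40))  # lls
```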
- FIG. 5 is a diagram illustrating the configuration of the SCS in the IP transmission scheme.
- Since SCS is transmitted using a FLUTE session, it is placed after the BBP, IP, UDP, and LCT headers.
- SPT and SDP are arranged as SCS; an SGDU header is added, with an SDP fragment of the data serving as the SCS body.
- The SCS body is not limited to the SDP fragment.
- For example, an SPT XML fragment described in XML format can be arranged and transmitted in the SGDU container.
- FIG. 6 is a diagram illustrating the structure of the SGDU described in FIGS. 4 and 5.
- The SGDU (Service Guide Delivery Unit) is a standard adopted by the OMA (Open Mobile Alliance).
- The SGDU is composed of header information (Unit_Header) and a payload (Unit_Payload); extension information (extension_data) is arranged as necessary.
- fragmentTransportID indicates fragment identification.
- SCT and SDP are identified by fragmentTransportID.
- FragmentVersion indicates the version number of the fragment.
- In the payload, the actual data of at least one of an XML fragment and an SDP fragment is placed. That is, data of one or a plurality of fragments, according to the number specified by the header information n_o_service_guide_fragments, is arranged in the payload.
- a combination of a plurality of fragments arranged in the payload is arbitrary such that both of the XML fragment and the SDP fragment are arranged.
- the position of an arbitrary fragment among a plurality of arranged fragments can be indicated by the offset of the header information.
- fragmentType indicating the type of the fragment is arranged together with actual data.
- In the case of an SDP fragment, a fragment ID for identifying the fragment is arranged together with the actual data.
- extension_type indicating the type of extension information is arranged together with the extension data (extension_data). Also, the location of the extension information can be indicated by specifying extension_offset in the header information.
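The SGDU layout above (per-fragment fragmentTransportID and version, offset-addressable fragment data) can be modeled in memory as follows. The field names follow the text, but this concrete object shape, and the scan in place of offset arithmetic, are illustrative only:

```python
# Illustrative in-memory model of the SGDU described above; the binary
# encoding itself is not reproduced here.
from dataclasses import dataclass, field

@dataclass
class SgduFragment:
    fragment_transport_id: int  # identifies the fragment (e.g. SCT vs SDP)
    fragment_version: int       # version number of the fragment
    data: bytes                 # XML fragment or SDP fragment body

@dataclass
class Sgdu:
    fragments: list = field(default_factory=list)

    def find(self, transport_id: int):
        # The header's offsets let a receiver jump straight to a fragment;
        # this sketch simply scans the in-memory list instead.
        return next((f for f in self.fragments
                     if f.fragment_transport_id == transport_id), None)

unit = Sgdu([SgduFragment(1, 0, b"<Sct/>"), SgduFragment(2, 0, b"v=0")])
print(unit.find(2).data)  # b'v=0'
```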
- FIG. 7 is a diagram illustrating a structure of signaling information.
- the SCT employs a triplet used in the MPEG2-TS system, and the triplet indicates the BBP stream configuration and service configuration in the broadcast network.
- the SCT also includes information such as IP address as attribute / setting information for each service, bootstrap information for accessing ESG and SCS, and the like.
- SAT indicates on-air service for each BBP stream.
- With the SAT, it can be determined whether a particular service is on the air (broadcasting).
- EAT contains information about emergency notifications.
- the RRT includes rating information.
- the SDP includes service attributes for each service, component configuration information, component attributes, component filter information, component location information, and the like.
- USD includes FDD (File Delivery Description), and a segment file transmitted by, for example, a FLUTE session is specified by SDP and USD.
- ESG is an electronic service guide including information such as program title and start time.
- the application (Application) is composed of a file in HTML (Hyper Text Markup Language) format, and is distributed from a server on the Internet, for example.
- the application is executed in conjunction with broadcast content such as a television program provided as a specific service.
- ESG and applications can be associated with USD.
- FIG. 8 is a diagram showing the structure of MPD (Media Presentation Description). The MPD is described in a markup language such as XML.
- the MPD is composed of a Period element, an AdaptationSet element, and a Representation element in a hierarchical structure.
- In the MPD structure of FIG. 8, only the main elements and attributes constituting the MPD are shown; in practice, other elements and attributes are also described.
- The Period element is described in units of the composition of content such as programs. In the AdaptationSet element, for example, information such as language can be specified. In the Representation element, for example, information regarding the encoding rate and the screen size can be specified.
- The Representation element is an upper element of the id attribute and the BaseURL element.
- In the id attribute, a representation ID is specified. With this representation ID, component correspondence can be established with the other signaling information (SPT, SDP, USD) of the SCS.
- In the BaseURL element, component location information is specified.
- FIG. 8 shows a case where one AdaptationSet element and one Representation element are arranged, but a plurality of AdaptationSet elements and Representation elements can be arranged.
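The Period > AdaptationSet > Representation hierarchy above can be sketched as a small XML fragment and walked with a standard parser; the element values (`rep-video-1`, `Video1`) are made up for illustration:

```python
# Minimal MPD fragment matching the hierarchy described above, with the
# id attribute and BaseURL element on the Representation element.
import xml.etree.ElementTree as ET

mpd_xml = """<MPD>
  <Period>
    <AdaptationSet lang="en">
      <Representation id="rep-video-1">
        <BaseURL>Video1</BaseURL>
      </Representation>
    </AdaptationSet>
  </Period>
</MPD>"""

root = ET.fromstring(mpd_xml)
for rep in root.iter("Representation"):
    # The representation ID is what ties this entry to the SPT/SDP/USD side.
    print(rep.get("id"), rep.findtext("BaseURL"))  # rep-video-1 Video1
```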
- FIG. 9 is a diagram illustrating the syntax of an SPT (Service Parameter Table).
- SPT Service Parameter Table
- the SPT is described in a markup language such as XML.
- In the figure, "@" is prefixed to attributes, and indented elements and attributes are specified with respect to their upper elements.
- The Spt element is an upper element of the serviceId attribute, spIndicator attribute, ProtocolVersionDescriptor element, NRTServiceDescriptor element, CapabilityDescriptor element, IconDescriptor element, ISO639LanguageDescriptor element, ReceiverTargetingDescriptor element, AssociatedServiceDescriptor element, ContentAdvisoryDescriptor element, and Component element.
- the service ID is specified in the serviceId attribute.
- In the spIndicator attribute, whether each service identified by the service ID is encrypted is specified: "on" indicates that the service is encrypted, and "off" indicates that it is not.
- In the ProtocolVersionDescriptor element, information indicating what kind of service the data service is, is specified.
- In the NRTServiceDescriptor element, information related to the NRT service is specified.
- In the CapabilityDescriptor element, information on the functions (capabilities) required of a receiver that receives the NRT service is specified.
- In the IconDescriptor element, information indicating the acquisition destination of the icon used in the NRT service is specified.
- In the ISO639LanguageDescriptor element, the language code of the NRT service is specified.
- In the ReceiverTargetingDescriptor element, target information of the NRT service is specified.
- In the AssociatedServiceDescriptor element, information related to associated subordinate services is specified.
- In the ContentAdvisoryDescriptor element, information related to the rating region is specified.
- The Component element is an upper element of the componentId attribute, representationId attribute, subRepresentationLevel attribute, componentCategory attribute, locationType attribute, componentEncription attribute, TargetedDeviceDescriptor element, ContentAdvisoryDescriptor element, VideoParameters element, AudioParameters element, and CaptionParameters element.
- the component ID is specified in the componentId attribute.
- the representation ID is specified in the representationId attribute. With this representation ID, it is possible to establish component correspondence with other signaling information (MPD, SDP, USD) of the SCS.
- the subrepresentation level is specified in the subRepresentationLevel attribute.
- the sub-representation level is information for identifying components of a plurality of categories (for example, video and audio) stored in each segment in the FLUTE session.
- the component category information is specified in the componentCategory attribute.
- As this category information, for example, "video", "audio", "caption", or "nrt" is designated: "video" indicates a video component, "audio" an audio component, "caption" a subtitle component, and "nrt" NRT content data.
- the locationType attribute specifies component location type information. As this type information, for example, “bb”, “bca”, “bco” are designated. “bb” is an abbreviation for Broadband and indicates that a component is distributed using communication. “bca” is an abbreviation for “Broadcast-actual”, and indicates that a component is distributed using broadcasting and distributed within the same service as the service in which the SPT (SCS) is transmitted. “bco” is an abbreviation for “Broadcast other”, and indicates that a component is distributed using broadcast and distributed in a service different from the service in which the SPT (SCS) is transmitted.
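The three locationType values above map directly to distribution forms; a trivial sketch (the function name and the wording of the labels are illustrative, the mapping itself follows the text):

```python
# Interpreting the locationType values of the SPT Component element.
def distribution_form(location_type: str) -> str:
    forms = {
        "bb": "communication (broadband)",
        "bca": "broadcast, same service as the SPT (in-band)",
        "bco": "broadcast, different service (out-of-band)",
    }
    return forms[location_type]

print(distribution_form("bca"))  # broadcast, same service as the SPT (in-band)
```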
- The case where "bca" is specified as the locationType attribute and the component is distributed within the same service is also referred to as "inband".
- The case where "bco" is specified as the locationType attribute and the component is distributed within another service is also referred to as "outband".
- In the componentEncription attribute, whether each component identified by the component ID is encrypted is specified: "on" indicates that the component is encrypted, and "off" indicates that it is not.
- In the TargetedDeviceDescriptor element, information related to the target device is specified. In the ContentAdvisoryDescriptor element, rating information for each component is specified.
- Video parameters are specified in the VideoParameters element.
- The VideoParameters element is an upper element of the AVCVideoDescriptor element and the HEVCVideoDescriptor element. That is, when AVC (Advanced Video Coding) is used as the video data encoding method, the AVCVideoDescriptor element is specified, and when HEVC (High Efficiency Video Coding) is used, the HEVCVideoDescriptor element is specified.
- AVC and HEVC are examples of video data encoding schemes, and when other encoding schemes are used, corresponding VideoDescriptor elements are designated.
- Audio parameters are specified in the AudioParameters element.
- The AudioParameters element is an upper element of the MPEG4AACAudioDescriptor element and the AC3AudioDescriptor element. That is, when MPEG4 AAC (Advanced Audio Coding) is used as the audio data encoding method, the MPEG4AACAudioDescriptor element is specified, and when AC3 (Audio Coding 3) is used, the AC3AudioDescriptor element is specified.
- the MPEG4AAC and AC3 are examples of audio data encoding schemes, and when other encoding schemes are used, corresponding AudioDescriptor elements are designated.
- the CaptionParameters element specifies subtitle parameters.
- The ProtocolVersionDescriptor element, NRTServiceDescriptor element, CapabilityDescriptor element, IconDescriptor element, ISO639LanguageDescriptor element, and ReceiverTargetingDescriptor element are defined for the NRT service.
- FIG. 9 also shows the number of occurrences (Cardinality) of the SPT elements and attributes.
- When "1" is specified, exactly one element or attribute is specified; when "0..1" is specified, the element or attribute is optional. When "1..n" is specified, one or more elements or attributes are specified; when "0..n" is specified, zero or more are specified. The meaning of the number of occurrences is the same in the other syntaxes described later.
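Putting the SPT pieces above together, a serialized instance might look like the fragment below. The attribute names follow FIG. 9, but this concrete XML shape and its values are an illustration, not the normative encoding:

```python
# Hypothetical serialized SPT with two Component entries, parsed with the
# standard library XML parser.
import xml.etree.ElementTree as ET

spt_xml = """<Spt serviceId="0x1001" spIndicator="off">
  <Component componentId="1" componentCategory="video" locationType="bca"/>
  <Component componentId="2" componentCategory="audio" locationType="bb"/>
</Spt>"""

spt = ET.fromstring(spt_xml)
# A receiver would read each component's ID, category, and location type
# to decide how (broadcast vs. communication) to acquire it.
components = [(c.get("componentId"), c.get("componentCategory"), c.get("locationType"))
              for c in spt.findall("Component")]
print(components)
```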
- FIG. 10 is a diagram illustrating the syntax of the Associated Service Descriptor.
- the Associated Service Descriptor is described in, for example, a markup language such as XML.
- In the figure, "@" is prefixed to attributes, and indented elements and attributes are specified with respect to their upper elements.
- the AssociatedServiceDescriptor element is an upper element of the networkId attribute, the BBPStreamId attribute, and the serviceId attribute.
- a network ID is specified in the networkId attribute.
- a BBP stream ID is specified in the BBPStreamId attribute.
- a service ID is specified in the serviceId attribute. That is, the related dependent service is specified by the triplet.
- FIG. 11 is a diagram showing the structure of SDP (Session Description Protocol). The SDP is described in a text format, for example.
- the SDP is composed of two parts, a session description part (Session Description) and a media description part (Media Description).
- In the session description part, information related to the session is described.
- the media description unit can describe a plurality of media information such as audio data and video data transmitted in the RTP session or FLUTE session.
- The session description part includes protocol version (v), origin (o), session name (s), session information (i), URI (u), email address (e), phone number (p), connection data (c), (session) bandwidth (b), timing (t), repeat times (r), time zones (z), encryption keys (k), and (session) attributes (a).
- In protocol version (v), the protocol version is specified. As this value, "0" or a value determined by service operation is designated; in RFC 2327, "0" is always specified.
- In origin (o), information about the creator of the SDP description document is specified. For example, a user name (username), session ID (sess-id), session version (sess-version), network type (nettype), address type (addrtype), and unicast address (unicast-address) are specified.
- In session name (s), the name of the session is specified.
- In session information (i), information about the session is specified.
- In URI (u), a URI (Uniform Resource Identifier) is specified.
- In email address (e), the email address of the contact person in charge of session management is specified.
- In phone number (p), the telephone number of the contact person in charge of session management is specified.
- In connection data (c), information on the network address used in the session is specified. For example, information such as a network type (nettype), an address type (addrtype), and a connection address (connection-address) is specified.
- In (session) bandwidth (b), the bandwidth used for the media in the session is specified.
- Timing (t) specifies the effective start time and end time of the session. In repeat times (r), a repetition cycle or the like is specified when a session is periodically repeated.
- The timing description part is composed of timing (t) and repeat times (r).
- In time zones (z), the offset is specified when it is necessary to switch between daylight saving time and winter time for a repetition specified in repeat times (r).
- In encryption keys (k), the encryption key used in the session or information about it is specified.
- In (session) attributes (a), various information related to the session is specified.
- In the media description part, media announcements (m), media information (i), connection data (c), (media) bandwidth (b), encryption keys (k), and (media) attributes (a) can be described.
- In media announcements (m), information such as the media type (media), port number (port), protocol (proto), and format (fmt) is specified.
- In media information (i), information about the media is specified.
- In connection data (c), information on the network address used for the media is specified.
- In (media) bandwidth (b), the bandwidth used by the media stream is specified.
- In encryption keys (k), the encryption key used for the media or information about it is specified.
- In (media) attributes (a), attributes relating to the media are specified.
- FIG. 12 is a diagram illustrating SDP attribute types (Attributes).
- The attribute types include ptime, fmtp, sendrecv, recvonly, sendonly, inactive, rtpmap, and representation-id.
- Ptime indicates the length of the media included in one packet, and the amount of data included in the packet is specified as the value.
- the fmtp indicates a format used in the media and specific parameters necessary for the format, and the format and parameters are specified as the value.
- Sendrecv indicates that the medium is bidirectional for transmission and reception. recvonly indicates that the media is only received. sendonly indicates that the medium is only for transmission. inactive indicates that the media is not sent in both directions. For example, it is used when media transmission / reception is suspended in the middle of a session or when a port number, codec, etc. are reserved before the session starts.
- Rtpmap indicates the mapping between the payload and the encoding type, and the payload type and the encoding type are specified as its value.
- the representation-id indicates the representation ID, and the representation ID is specified as the value. With this representation ID, it is possible to establish component correspondence with other signaling information (MPD, SPT, USD) of the SCS.
- FIG. 13 is a diagram illustrating a description example of SDP.
- The session description part contains "v" indicating the protocol version, "o" indicating the creator information, "c" indicating the connection data, and "t" indicating the session valid time.
- As the creator information, an address of "host.example.com" is specified together with the address type.
- As the network address information used in the session, the network type "IN" (Internet), the IP address type "IP4" (IPv4), and the IP address "192.0.2.4" are specified. Furthermore, "0 0" is specified as the session valid time.
- In the media description part, the time scale of the RTP time stamp is 90000.
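The field and attribute handling described above can be sketched as a small parser. This is an illustrative Python sketch, not part of the described system; the sample SDP text and its media line are hypothetical stand-ins for the description example above.

```python
# Minimal sketch: parse SDP-style "type=value" lines into session
# fields and "a=" attributes. The sample text mirrors the description
# example above; the media line and its values are illustrative.
def parse_sdp(text):
    session = {}
    attributes = {}
    for line in text.strip().splitlines():
        key, _, value = line.partition("=")
        if key == "a":  # attribute line, e.g. "a=rtpmap:96 H264/90000"
            name, _, attr_value = value.partition(":")
            attributes[name] = attr_value
        else:
            session.setdefault(key, []).append(value)
    return session, attributes

sample = """\
v=0
o=- 0 0 IN IP4 host.example.com
c=IN IP4 192.0.2.4
t=0 0
m=video 30000 FLUTE/UDP 96
a=rtpmap:96 H264/90000
a=representation-id:23
"""

session, attrs = parse_sdp(sample)
```

With this, the connection data of the example resolves to `IN IP4 192.0.2.4`, and the representation-id attribute yields `23`, which could then be matched against other SCS signaling.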
- FIG. 14 is a diagram showing the structure of USD (User Service Description). The USD is described in a markup language such as XML.
- The USBD element is the root element of the USD, and a USD element can be described for each service. However, since the SCS that transmits the USD is provided for each service, only one USD element per service is described in the USBD element.
- The USD element is an upper element of the appService element, broadcastAppService element, broadcastotherService element, broadbandAppService element, and DeliveryMethod element.
- The appService element is an upper element of the appServiceDescriptionURI element. The MPD reference destination can be specified by these appService and appServiceDescriptionURI elements.
- The broadcastAppService element is an upper element of the basePattern element, in which the baseURL of the component is specified.
- "Video1" and "Audio1" are described in the basePattern element as components that are broadcast in-band.
- The broadcastotherService element is an upper element of the basePattern element, in which the baseURL of the component is specified.
- "Audio2" is described in the basePattern element as a component that is broadcast out-of-band.
- The broadbandAppService element is an upper element of the basePattern element, in which the baseURL of the component is specified.
- "Audio3" is described in the basePattern element as a component that is distributed by communication.
- By matching the baseURL described in the MPD against the baseURLs described in the broadcastAppService element, the broadcastotherService element, and the broadbandAppService element, the distribution form of the component specified by the baseURL described in the MPD can be identified. Specifically, if the baseURL of the MPD matches that of the broadcastAppService element, the component is broadcast in-band; if it matches that of the broadcastotherService element, the component is broadcast out-of-band; and if it matches that of the broadbandAppService element, the component is distributed by communication.
- The broadcastAppService element, the broadcastotherService element, and the broadbandAppService element correspond to the type information specified in the locationType attribute of the Component element of the SPT (FIG. 9) described above, so the distribution form of a component can be identified not only from the SPT but also from the USD.
- The DeliveryMethod element is an upper element of the SessionDescriptonURI element and the FDD element.
- The SDP reference destination can be specified by these DeliveryMethod and SessionDescriptonURI elements.
- In the FDD (File Delivery Description) element, the TOI (Transport Object Identifier), which is identification information of each of the multiple objects sent in a FLUTE session, is described as index information for each TSI (Transport Session Identifier), which is identification information of the FLUTE session.
- the file to be transmitted is managed as one object by TOI.
- A set of a plurality of objects is managed as one session by the TSI. That is, in a FLUTE session, a specific file can be designated by the two pieces of identification information, TSI and TOI. Details of the FDD will be described later with reference to FIG. 15.
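The addressing scheme above, where a TSI identifies the session and a TOI identifies each object within it, can be sketched as follows. The class name, method names, and values are illustrative, not part of the described system.

```python
# Sketch: within a FLUTE session identified by a TSI, each transmitted
# file (object) is identified by a TOI, so the pair (TSI, TOI)
# designates one specific file. Names and values are illustrative.
class FluteSession:
    def __init__(self, tsi):
        self.tsi = tsi
        self.objects = {}  # TOI -> file payload

    def receive(self, toi, payload):
        # Store a received object under its TOI.
        self.objects[toi] = payload

    def lookup(self, toi):
        # Return the file designated by this TOI, if received.
        return self.objects.get(toi)

session = FluteSession(tsi=1)
session.receive(toi=10, payload=b"Segment10")
session.receive(toi=11, payload=b"Segment11")
```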
- FIG. 15 is a diagram illustrating the syntax of FDD.
- The StaticFDD element is an upper element of the tsi attribute, objectDeliveryMode attribute, oufOfOrderSending attribute, contentEncoding attribute, byteRange attribute, representationId attribute, CodePoint element, File element, and FileTemplate element.
- The TSI is specified in the tsi attribute.
- The distribution mode of the object is specified in the objectDeliveryMode attribute.
- Information related to out-of-order sending is specified in the oufOfOrderSending attribute.
- Information related to content encoding is specified in the contentEncoding attribute.
- Information related to the byte range of the file is specified in the byteRange attribute.
- The representation ID is specified in the representationId attribute. For example, this representation ID is used when a FLUTE session is used only for a specific component (e.g., video). Also, with this representation ID, component correspondence can be established with the other signaling information of the SCS (MPD, SPT, SDP).
- Information related to the LCT header is specified in the CodePoint element.
- the File element is an upper element of the contentLocation attribute, TOI attribute, contentEncoding attribute, contentMD5 attribute, and representationId attribute.
- the URL of the file is specified in the contentLocation attribute.
- The TOI is specified in the TOI attribute.
- Information related to content encoding is specified in the contentEncoding attribute.
- Information on MD5 (Message Digest 5) is specified in the contentMD5 attribute.
- the representation ID is specified in the representationId attribute.
- This representation ID is used, for example, when video and audio files are included in one FLUTE session and need to be distinguished in file units. Also, with this representation ID, component correspondence can be established with the other signaling information of the SCS (MPD, SPT, SDP).
- the FileTemplate element is a higher element of the startTOI attribute and endTOI attribute.
- the startTOI attribute specifies the TOI start value when the TOI changes in time series.
- The endTOI attribute specifies the end value of the TOI when the TOI changes in time series. That is, by specifying the startTOI attribute and the endTOI attribute, the TOI values are incremented sequentially from the start value to the end value. This eliminates the need to obtain a new FDD each time the TOI changes.
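The startTOI/endTOI behavior described above amounts to enumerating TOI values on the receiver side, which can be sketched as follows (values are illustrative):

```python
# Sketch of the FileTemplate behavior: given startTOI and endTOI, the
# receiver increments the TOI value itself, so a new FDD is not needed
# each time the TOI changes. Values are illustrative.
def toi_range(start_toi, end_toi):
    """Yield the sequence of TOIs from start to end, inclusive."""
    toi = start_toi
    while toi <= end_toi:
        yield toi
        toi += 1

tois = list(toi_range(10, 13))
```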
- the data structure of the SCS signaling information (MPD, SPT, SDP, USD) described above is merely an example, and other structures can be adopted.
- FIG. 16 is a diagram illustrating an operation example (hereinafter referred to as “operation example 1”) in the case of using a service-unit FLUTE session as the first embodiment.
- video and audio components are distributed using both broadcast and communication.
- A service 1 (Service1) as a main service (main) and a service 2 (Service2) as an associated subordinate service (sub) are distributed using broadcasting, and an audio 3 (Audio3) is distributed using communication.
- The components of video 1 (Video1) and audio 1 (Audio1) constituting service 1 are in-band, and the component of audio 2 (Audio2) constituting service 2 is out-of-band.
- Service 1 (Service1) is, for example, a television program, and is composed of video 1 and audio 1. These components are transmitted in the same FLUTE session.
- MPD, SPT, SDP, and USD are transmitted as the SCS for service 1, and are acquired using the bootstrap information of the SCT transmitted as LLS.
- For the segments of video 1, "10" is specified as the startTOI attribute of the USD FDD element. Therefore, by sequentially incrementing the TOI value from "10" to "11", "12", ..., n, the TOI that changes in time series can be specified for each segment.
- For the segments of audio 1, "1010" is specified as the startTOI attribute. Therefore, by sequentially incrementing the TOI value from "1010" to "1011", "1012", ..., m, the TOI that changes in time series can be specified for each segment.
- the audio 3 component is streamed from a distribution server provided on the Internet.
- The component of audio 3 is divided into segments such as Segment 2010, Segment 2011, Segment 2012, and so on.
- In the SPT of service 1, category information and location type information are specified for each component as a service component, so the distribution form of each component can be identified.
- "bca" is designated as the location type information for the components of video 1 and audio 1, indicating that these components are broadcast in-band.
- "bco" is designated for the component of audio 2, indicating that the component is broadcast out-of-band.
- "bb" is designated for the component of audio 3, indicating that the component is distributed by communication.
- the MPD of service 1 describes URLs of components of video 1, audio 1, audio 2, and audio 3 as service components.
- the SDP of service 1 describes component configuration information, location information (for example, port number and TSI), and the like.
- Since the SPT of service 1 identifies that the components of video 1 and audio 1 are broadcast in-band, the FDD element of the USD of service 1 is referred to, and the startTOI can be acquired.
- Then, starting from the startTOI of the FDD element of the USD of service 1, the segments specified by the TOI that changes in time series are sequentially specified, so that the components of video 1 and audio 1 can be acquired.
- The SPT of service 1 identifies that the component of audio 2 is broadcast out-of-band, and the Associated Service Descriptor described in the SPT makes it possible to recognize that the component is transmitted by service 2.
- Also, since the SPT of service 1 identifies that the component of audio 3 is distributed by communication, the distribution server (Server) can be accessed according to the URL of audio 3 described in the MPD of service 1.
- the component of the audio 3 stream-distributed from the distribution server can be acquired.
- Service 1 and service 2 are distinguished by service_type specified for each service in the SCT.
- the location type information is not limited to SPT, and may be specified with reference to USD.
- FIG. 18 is a diagram illustrating a description example of the MPD in FIG. Note that the MPD is described in XML, so the actual description differs from this example; the example expresses the hierarchical structure and attributes of the XML elements.
- the MPD describes a representation ID and URL (BaseURL) of each component as service components.
- For video 1, "Video1" is specified as the BaseURL, and the character string obtained by combining the URL of the distribution server ("http://10.1.200.10/") with "Video1" ("http://10.1.200.10/Video1") becomes the segment URL.
- For audio 1, "Audio1" is specified as the BaseURL, and the segment URL is "http://10.1.200.10/Audio1".
- For audio 2, "Audio2" is specified as the BaseURL, and the segment URL is "http://10.1.200.10/Audio2".
- For audio 3, "Audio3" is designated as the BaseURL, and the segment URL is "http://10.1.200.10/Audio3".
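The segment-URL construction described above, combining the distribution server URL with the component's BaseURL, can be sketched as follows; the helper name is illustrative.

```python
# Sketch: the segment URL is the distribution-server URL joined with
# the component's BaseURL, as in the MPD description example above.
def segment_url(server_url, base_url):
    # Avoid a doubled slash when the server URL has a trailing one.
    return server_url.rstrip("/") + "/" + base_url

url = segment_url("http://10.1.200.10/", "Video1")
```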
- FIG. 19 is a diagram illustrating a description example of the SDP in FIG.
- FIG. 21 is a diagram showing an example of the structure of USD in FIG.
- MPD (FIG. 18) is referred to by appService element and appServiceDescriptionURI element.
- the representation ID and BaseURL (“Video1”) of the component of video 1 and the representation ID and BaseURL (“Audio1”) of the component of audio 1 are described.
- descriptions of the representation ID and BaseURL of the audio 2 and audio 3 components are omitted.
- the matching may be performed using the representation ID of each component of the MPD and the representation ID of the FDD element of USD instead of location information such as BaseURL and contentLocation attribute.
- In each piece of signaling information (MPD, SPT, SDP, USD (FDD)), a representation ID for commonly identifying a specific component, whether in-band or out-of-band, is set, and each piece of signaling information can be referred to using this representation ID.
- Since the segment URL (BaseURL) of the MPD and the content_location of the USD (FDD) correspond to each other, their location information can be mutually referred to.
- For example, the startTOI of the segments of video 1 can be acquired by matching the segment URL (BaseURL) of the MPD against the content_location of the USD (FDD). Then, from the FLUTE session specified by the port number (port_num) and TSI of the SDP, the segments specified by the TOI that changes in time series, with the startTOI as the start value, are sequentially specified, so that the component of video 1 can be acquired.
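The cross-reference described above can be sketched as follows. The dictionaries stand in for parsed USD (FDD) entries; the keys and values are illustrative.

```python
# Sketch: match an MPD segment URL (BaseURL) against the
# contentLocation entries of the USD FDD to obtain the startTOI, then
# enumerate the time-series TOIs for the component's segments.
# The data below is an illustrative stand-in for parsed signaling.
fdd_files = [
    {"contentLocation": "Video1", "startTOI": 10},
    {"contentLocation": "Audio1", "startTOI": 1010},
]

def start_toi_for(base_url, files):
    for entry in files:
        if entry["contentLocation"] == base_url:
            return entry["startTOI"]
    return None

start = start_toi_for("Video1", fdd_files)
# Segments are then fetched with TOI = start, start + 1, start + 2, ...
first_three = [start + i for i in range(3)]
```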
- In this operation example, the sub-representation level is not necessary, so its description is omitted in the MPD and SPT.
- FIG. 23 is a diagram illustrating a configuration example of each segment in the FLUTE session of the operation example 1 of FIG.
- Each segment of the FLUTE session is configured by an ISO base media file format defined by ISO / IEC 14496-12.
- the video 1 segment is composed of an initialization segment (Initialization Segment) and a media segment (Media Segment).
- the initialization segment includes initialization information such as a data compression method.
- the media segment stores the video 1 component.
- the initialization segment consists of ftyp and moov.
- ftyp (file type box) indicates the file type, and moov (movie box) contains metadata such as control information for each track.
- the media segment consists of styp, sidx, ssix, moof, and mdat.
- styp indicates the file format specification version of the file in segment units.
- sidx shows the index information in a segment.
- ssix indicates index information for each subsegment (level) in the segment.
- moof (movie fragment box) indicates fragment control information.
- moof includes traf. traf (track fragment box) indicates control information for each track.
- control information for each track of video 1 is arranged in traf.
- mdat (media data box) indicates the media data body of the fragment.
- video 1 data (file) is arranged in mdat.
- The audio 1 segment is basically configured in the same manner as the video 1 segment. However, since the media segment stores the audio 1 component, control information for each track of audio 1 is arranged in the traf of the moof, and audio 1 data (file) is arranged in the mdat.
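The box layout described above follows the common ISO base media file format convention of a 4-byte big-endian size followed by a 4-byte type. A minimal sketch that walks the top-level boxes of a segment, using toy data rather than a real segment:

```python
import struct

# Sketch: enumerate the top-level boxes of an ISO base media file
# format segment (4-byte big-endian size, then 4-byte type), as used
# by the initialization segment (ftyp, moov) and the media segment
# (styp, sidx, ssix, moof, mdat). The byte string built below is a
# toy stand-in, not real segment data.
def top_level_boxes(data):
    boxes = []
    offset = 0
    while offset + 8 <= len(data):
        size, box_type = struct.unpack_from(">I4s", data, offset)
        boxes.append(box_type.decode("ascii"))
        offset += size  # jump to the next box
    return boxes

def make_box(box_type, payload=b""):
    # Build a box: total size (header + payload) followed by the type.
    return struct.pack(">I4s", 8 + len(payload), box_type) + payload

segment = make_box(b"styp") + make_box(b"moof") + make_box(b"mdat", b"\x00" * 4)
boxes = top_level_boxes(segment)
```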
- FIG. 23 also shows a description example of the MPD corresponding to the configuration of each segment of video 1 and audio 1 described above.
- In the Representation elements of video 1 and audio 1, the Initialization Segment URL and a plurality of Each Media Segment URLs are described.
- Initialization Segment URL describes the URL for identifying the initialization segment.
- the URL for identifying each media segment is described in the “Each Media Segments URL”.
- FIG. 24 is a diagram illustrating an operation example (hereinafter referred to as “operation example 2”) when a component-unit FLUTE session is used as the second embodiment.
- video and audio components are distributed using both broadcasting and communication.
- a service 1 as a main service and a service 2 as a related subordinate service are distributed using broadcasting, and an audio 3 is distributed using communication.
- the components of video 1 and audio 1 constituting service 1 are in-band, and the components of audio 2 constituting service 2 are out-of-band.
- the service 1 is, for example, a TV program, and is composed of a video 1 and an audio 1, but these components are transmitted in different FLUTE sessions.
- MPD, SPT, SDP, and USD are transmitted as the SCS for service 1, and are acquired using the bootstrap information of the SCT transmitted as LLS.
- In the FLUTE session of video 1, the component of video 1 is transmitted. The component is divided into segments, and the file of each segment is specified by the TOI. For example, for the segments of video 1, "10" is specified as the startTOI attribute of the USD FDD element, and by sequentially incrementing the TOI value from "10" to "11", "12", ..., n, the TOI that changes in time series can be specified for each segment.
- Similarly, in the FLUTE session of audio 1, the component of audio 1 is transmitted. The component is divided into segments, and the file of each segment is specified by the TOI. "10" is specified as the startTOI attribute of the USD FDD element, and by sequentially incrementing the TOI value from "10" to "11", "12", ..., m, the TOI that changes in time series can be specified for each segment.
- For service 2, SDP and USD are transmitted as the SCS.
- For the segments of audio 2, "10" is specified as the startTOI attribute, and by sequentially incrementing the TOI value from "10" to "11", "12", ..., p, the TOI that changes in time series can be specified for each segment.
- the audio 3 component is streamed from a distribution server provided on the Internet.
- The component of audio 3 is divided into segments such as Segment 2010, Segment 2011, Segment 2012, and so on.
- In the SPT of service 1, category information and location type information are specified for each component as a service component, so the distribution form of each component can be identified.
- "bca" is specified as the location type information for the components of video 1 and audio 1, indicating that these components are broadcast in-band.
- "bco" is designated for the component of audio 2, indicating that the component is broadcast out-of-band.
- "bb" is designated for the component of audio 3, indicating that the component is distributed by communication.
- In operation example 2, the FLUTE session differs for each component; that is, the component of video 1 and the component of audio 1 are transmitted in different FLUTE sessions. Therefore, when service 1 is selected, the components of video 1 and audio 1 can be acquired by sequentially acquiring all the segments transmitted in each FLUTE session.
- In this case, each FLUTE session is specified by the SDP (port number and TSI) of service 1, but there is no need to use the TOI to identify and acquire all the segments.
- In the MPD of service 1, the URLs of the components of video 1, audio 1, audio 2, and audio 3 are described as service components.
- matching between MPD and USD can be performed using the representation ID and location information.
- The component of audio 2 can also be acquired by sequentially acquiring all the segments transmitted in its FLUTE session.
- That FLUTE session is specified by the SDP (port number and TSI) of service 2, as indicated by the solid lines from "Audio2" in the SPT of service 1 to "Audio2" in the SDP of service 2 in the figure.
- Also, since the SPT of service 1 identifies that the component of audio 3 is distributed by communication, the distribution server (Server) can be accessed according to the URL of audio 3 described in the MPD of service 1.
- the component of the audio 3 stream-distributed from the distribution server can be acquired.
- FIG. 26 is a diagram illustrating a description example of the SDP in FIG.
- FIG. 27 is a diagram showing an example of the structure of USD in FIG.
- Since the components of video 1 and audio 1 constituting service 1 are transmitted in different FLUTE sessions, that is, in the two FLUTE sessions of the video 1 FLUTE session and the audio 1 FLUTE session, the USD describes two FDDs, one for each FLUTE session.
- the MPD (FIG. 18) is referred to by the appService element and the appServiceDescriptionURI element.
- In the FDDs, the representation ID ("23") and BaseURL ("Video1") of the component of video 1 and the representation ID ("45") and BaseURL ("Audio1") of the component of audio 1 are described.
- matching may be performed using the representation ID of each component of MPD and the representation ID of each FDD instead of location information such as BaseURL and contentLocation attribute.
- In each piece of signaling information (MPD, SPT, SDP, USD (FDD)), a representation ID for commonly identifying a specific component, whether in-band or out-of-band, is set, and each piece of signaling information can be referred to using this representation ID.
- Since the segment URL (BaseURL) of the MPD and the content_location of the USD (FDD) correspond to each other, their location information can be mutually referred to.
- For example, the startTOI of the segments of video 1 can be acquired by matching the segment URL (BaseURL) of the MPD against the content_location of the USD (FDD). Then, from the FLUTE session specified by the port number (port_num) and TSI of the SDP, the segments specified by the TOI that changes in time series, with the startTOI as the start value, are sequentially specified, so that the component of video 1 can be acquired.
- In this operation example, the sub-representation level is not necessary, so its description is omitted in the MPD and SPT.
- The configuration of each segment in the FLUTE sessions of operation example 2 in FIG. 24 is the same as the segment configuration in operation example 1 of FIG. 16 described above with reference to FIG. 23, and its description is therefore omitted.
- FIG. 29 is a diagram illustrating an operation example (hereinafter referred to as "operation example 3") in which a plurality of components are included in one segment, as the third embodiment.
- video and audio components are distributed using both broadcasting and communication.
- a service 1 as a main service and a service 2 as a related subordinate service are distributed using broadcasting, and an audio 3 is distributed using communication.
- the service 1 is, for example, a television program and is composed of a video 1 and an audio 1, and the video 1 component and the audio 1 component are merged into one segment.
- video 1 and audio 1 included in one segment are also referred to as “video / audio 1”.
- MPD, SPT, SDP, and USD are transmitted as the SCS for service 1, and are acquired using the bootstrap information of the SCT transmitted as LLS.
- the video 1 and audio 1 files are merged and divided and transmitted for each segment.
- the file for each segment is specified by the TOI.
- For the segments in which video 1 and audio 1 are merged, "10" is specified as the startTOI attribute of the USD FDD element. Therefore, by sequentially incrementing the TOI value from "10" to "11", "12", ..., n, the TOI that changes in time series can be specified for each segment.
- the audio 3 component is streamed from a distribution server provided on the Internet.
- The component of audio 3 is divided into segments such as Segment 2010, Segment 2011, Segment 2012, and so on.
- In the SPT of service 1, category information and location type information are specified for each component as a component of the service, so the distribution form of each component can be identified.
- Furthermore, a sub-representation level is designated in the SPT of service 1, so that each component can be identified when a plurality of components are included in one segment.
- "bca" is specified as the location type information for the component of video/audio 1, indicating that the component is broadcast in-band.
- "bco" is designated for the component of audio 2, indicating that the component is broadcast out-of-band.
- "bb" is designated for the component of audio 3, indicating that the component is distributed by communication.
- the SDP of service 1 describes component configuration information and location information (for example, port number and TSI).
- Since the SPT of service 1 identifies that the component of video/audio 1 is broadcast in-band, the FDD element of the USD of service 1 is referred to, and the startTOI "10" of the segments of video/audio 1 can be acquired.
- Furthermore, by using the sub-representation levels of the SPT and MPD of service 1, the components of video 1 and audio 1 can be identified from the video/audio 1 segments.
- The SPT of service 1 identifies that the component of audio 2 is broadcast out-of-band, and the Associated Service Descriptor described in the SPT makes it possible to recognize that the component is transmitted by service 2.
- Also, since the SPT of service 1 identifies that the component of audio 3 is distributed by communication, the distribution server (Server) can be accessed according to the URL of audio 3 described in the MPD of service 1.
- the component of the audio 3 stream-distributed from the distribution server can be acquired.
- FIG. 31 is a diagram illustrating a description example of the MPD in FIG.
- In the MPD of service 1, the representation ID and URL (BaseURL) of each component are described as service components.
- Video/audio 1, for which "VA1" is specified as the BaseURL, includes the components of video 1 and audio 1 in one segment. Therefore, the sub-representation level is specified, making it possible to identify the components of video 1 and audio 1.
- Which component each sub-representation level refers to is specified by the contentType attribute of the ContentComponent element.
- FIG. 32 is a diagram illustrating a description example of the SDP of FIG.
- "representation-id:23" indicates that the representation ID of the component of video/audio 1 transmitted in the FLUTE session is "23".
- FIG. 33 shows an example of the structure of USD in FIG.
- the MPD (FIG. 31) is referred to by the appService element and the appServiceDescriptionURI element.
- In the FDD, the representation ID and BaseURL ("VA1") of video/audio 1, as well as the sub-representation levels of video 1 and audio 1, are described.
- Thereby, the startTOI "10" of the segments of video/audio 1 can be acquired.
- The matching may be performed using the representation ID of the MPD and the representation ID of the USD FDD element instead of location information such as the BaseURL and contentLocation attribute.
- In each piece of signaling information, a representation ID for commonly identifying a specific component, whether in-band or out-of-band, is set, and each piece of signaling information can be referred to using this representation ID.
- Since the segment URL (BaseURL) of the MPD and the content_location of the USD (FDD) correspond to each other, their location information can be mutually referred to.
- For example, the startTOI of the segments of video/audio 1 can be acquired by matching the segment URL (BaseURL) of the MPD against the content_location of the USD (FDD). Then, the segments specified by the TOI that changes in time series, with the startTOI as the start value, are sequentially specified, so that the components of video 1 and audio 1 can be acquired.
- Furthermore, by using the sub-representation levels of the SPT and MPD, the components of video 1 and audio 1 can be identified from the video/audio 1 segments.
- FIG. 35 is a diagram illustrating a configuration example of each segment in the FLUTE session of the operation example 3 in FIG.
- Each segment of the FLUTE session is configured by an ISO base media file format defined by ISO / IEC 14496-12.
- In operation example 3, video 1 (Video1) and audio 1 (Audio1) are included in one segment and transmitted in the same FLUTE session. Therefore, FIG. 35 shows only the segment of video/audio 1.
- the video / audio 1 segment consists of an initialization segment (Initialization Segment) and a media segment (Media Segment).
- the initialization segment includes initialization information such as a data compression method.
- the media segment stores a video 1 or audio 1 component.
- the initialization segment consists of ftyp and moov.
- moov includes leva.
- leva (level assignment box) indicates level/track mapping information.
- In the leva, matching information between the track_id of the traf of the moof of the media segment and the sub-representation level (level) of each component of the MPD is described.
- the media segment consists of styp, sidx, ssix, moof, and mdat.
- styp, sidx, and ssix are header information.
- moof indicates fragment control information.
- moof includes traf. traf indicates control information for each track.
- mdat indicates the media data body of the fragment.
- FIG. 35 also shows a description example of the MPD corresponding to the configuration of each segment of video/audio 1 described above.
- In the Representation element of video/audio 1, in addition to the Initialization Segment URL and a plurality of Each Media Segment URLs, a SubRepresentation element with a level attribute and a contentcomponent attribute is described for each component included in one segment. In the ContentComponent element, an id attribute and a contentType attribute are described.
- The ID specified in the contentcomponent attribute of the SubRepresentation element and the ID specified in the id attribute of the ContentComponent element are associated, and the type of the component corresponding to each SubRepresentation element is specified by the type information (e.g., video or audio) specified in the contentType attribute.
- Thereby, the video 1 and audio 1 stored in the media segment can be identified and their data can be acquired.
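The matching described above, associating SubRepresentation elements with ContentComponent elements via their IDs, can be sketched as follows; the element contents are illustrative stand-ins for parsed MPD data.

```python
# Sketch: determine each SubRepresentation's component type by
# matching its contentComponent attribute against the id attribute of
# a ContentComponent element. The dictionaries are illustrative
# stand-ins for parsed MPD elements.
content_components = [
    {"id": "1", "contentType": "video"},
    {"id": "2", "contentType": "audio"},
]
sub_representations = [
    {"level": "0", "contentComponent": "1"},
    {"level": "1", "contentComponent": "2"},
]

def component_type(sub_rep, components):
    for component in components:
        if component["id"] == sub_rep["contentComponent"]:
            return component["contentType"]
    return None

types = [component_type(s, content_components) for s in sub_representations]
```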
- FIG. 36 is a diagram showing another configuration example of each segment in the FLUTE session of the operation example 3 of FIG.
- Also in this configuration, video 1 (Video1) and audio 1 (Audio1) are included in one segment and transmitted in the same FLUTE session. Therefore, FIG. 36 shows only the segment of video/audio 1.
- the initialization segment consists of ftyp and moov.
- moov includes leva.
- In the leva, matching information between the track_id of the traf of the moof of the media segment and the sub-representation level (level) of each component of the MPD is described.
- control information of video 1 and the control information of audio 1 are arranged in the traf of moof.
- In the mdat, video 1 data and audio 1 data are arranged. That is, in the segment configuration of FIG. 36, the data of video 1 and audio 1 are merged into one file.
- FIG. 36 shows a description example of the MPD corresponding to the segment structure of the video / audio 1 described above, but since the content is the same as the MPD of FIG. 35, the description thereof is omitted.
- In this way, when video 1 and audio 1 are included in one segment and transmitted in the same FLUTE session, the data of video 1 and audio 1 may be merged into one file.
- the video 1 and audio 1 stored in the media segment can be identified and their data can be obtained.
- FIG. 37 is a diagram illustrating a configuration of an embodiment of a broadcast communication system to which the present technology is applied.
- the broadcast communication system 1 includes a transmission device 10, a reception device 20, and a distribution server 60. Further, the receiving device 20 and the distribution server 60 are connected to each other via the Internet 90.
- the transmission device 10 transmits broadcast content such as a TV program by a broadcast wave of digital broadcasting using an IP transmission method.
- Broadcast content is composed of components such as video, audio, and subtitles.
- the transmission device 10 transmits a control signal (signaling information in FIG. 7) together with the component using a broadcast wave of digital broadcasting.
- the receiving device 20 receives a broadcast signal transmitted from the transmitting device 10.
- the receiving device 20 acquires components such as video, audio, and subtitles based on a control signal obtained from the broadcast signal, and outputs video and audio of broadcast content such as a TV program.
- the reception device 20 may be configured as a single unit including a display and a speaker, or may be incorporated in a television receiver, a video recorder, or the like.
- the distribution server 60 performs streaming distribution of communication contents such as broadcast programs that have already been broadcast and movies that have been released.
- the communication content is composed of components such as video, audio, and subtitles.
- The receiving device 20 acquires components such as video, audio, and subtitles stream-distributed from the distribution server 60 via the Internet 90, and outputs the video and audio of communication content such as broadcast programs that have already been broadcast.
- the broadcast communication system 1 is configured as described above. Next, with reference to FIGS. 38 to 39, the detailed configuration of each apparatus constituting the broadcast communication system 1 of FIG. 37 will be described.
- FIG. 38 is a diagram illustrating a configuration of an embodiment of a transmission device to which the present technology is applied.
- the transmission device 10 includes a video data acquisition unit 111, a video encoder 112, an audio data acquisition unit 113, an audio encoder 114, a caption data acquisition unit 115, a caption encoder 116, a file data acquisition unit 117, a file processing unit 118, a control signal acquisition unit 119, a control signal processing unit 120, a Mux 121, and a transmission unit 122.
- the video data acquisition unit 111 acquires video data as a component from a built-in storage, an external server, a camera, and the like, and supplies the video data to the video encoder 112.
- the video encoder 112 encodes the video data supplied from the video data acquisition unit 111 in accordance with an encoding method such as MPEG, and supplies the encoded data to the Mux 121.
- the audio data acquisition unit 113 acquires audio data as a component from a built-in storage, an external server, a microphone, or the like, and supplies the audio data to the audio encoder 114.
- the audio encoder 114 encodes the audio data supplied from the audio data acquisition unit 113 in accordance with an encoding method such as MPEG, and supplies the encoded audio data to the Mux 121.
- the subtitle data acquisition unit 115 acquires subtitle data as a component from a built-in storage or an external server and supplies the subtitle data to the subtitle encoder 116.
- the caption encoder 116 encodes the caption data supplied from the caption data acquisition unit 115 in accordance with a predetermined encoding method, and supplies the encoded data to the Mux 121.
- the file data acquisition unit 117 acquires file data such as video, audio, subtitles, NRT content, and applications from a built-in storage or an external server, and supplies the file data to the file processing unit 118.
- the file processing unit 118 performs predetermined file processing on the file data supplied from the file data acquisition unit 117 and supplies the file data to the Mux 121. For example, the file processing unit 118 performs file processing for transmitting the file data acquired by the file data acquisition unit 117 through the FLUTE session.
- the control signal acquisition unit 119 acquires a control signal (signaling information in FIG. 7) from a built-in storage or an external server and supplies the control signal to the control signal processing unit 120.
- the control signal processing unit 120 performs predetermined signal processing on the control signal supplied from the control signal acquisition unit 119 and supplies the signal to the Mux 121. For example, the control signal processing unit 120 performs signal processing for transmission through the FLUTE session on the SCS acquired by the control signal acquisition unit 119.
- the Mux 121 multiplexes the video data from the video encoder 112, the audio data from the audio encoder 114, the caption data from the caption encoder 116, the file data from the file processing unit 118, and the control signal from the control signal processing unit 120. Then, an IP transmission format BBP stream is generated and supplied to the transmission unit 122.
- the transmission unit 122 transmits the BBP stream supplied from the Mux 121 as a broadcast signal via the antenna 123.
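The acquire, encode, and multiplex flow described above can be sketched as follows. This is a minimal illustration only; the function names and data shapes are assumptions, not part of the specification.

```python
# Hypothetical sketch of the transmission pipeline of FIG. 38: each
# component is acquired, encoded, multiplexed into one BBP stream
# by the Mux, and handed to the transmission unit.

def encode(kind, raw):
    """Stand-in for the encoders 112/114/116 (e.g., MPEG)."""
    return {"kind": kind, "payload": f"encoded({raw})"}

def build_bbp_stream(video, audio, subtitles, file_data, control_signal):
    """Stand-in for the Mux 121: multiplex all inputs into one stream."""
    return [
        encode("video", video),
        encode("audio", audio),
        encode("subtitle", subtitles),
        {"kind": "file", "payload": file_data},          # FLUTE file data
        {"kind": "control", "payload": control_signal},  # signaling information
    ]

stream = build_bbp_stream("v0", "a0", "s0", "nrt.bin", "LLS/SCS")
assert [p["kind"] for p in stream] == ["video", "audio", "subtitle", "file", "control"]
```

In this sketch the ordering inside the list merely stands in for multiplexing; an actual BBP stream interleaves packets rather than concatenating whole components.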
- FIG. 39 is a diagram illustrating a configuration of an embodiment of a reception device to which the present technology is applied.
- the receiving apparatus 20 includes a tuner 212, a Demux 213, a clock generator 214, a selection/synthesis unit 215, a selection/synthesis unit 216, a selection/synthesis unit 217, a FLUTE processing unit 218, a storage 219, a control unit 220, an NVRAM 221, a communication I/F 222, a Demux 223, a video decoder 224, a video output unit 225, an audio decoder 226, an audio output unit 227, and a subtitle decoder 228.
- the tuner 212 extracts and demodulates, from the broadcast signal received by the antenna 211 and in accordance with control from the control unit 220, the broadcast signal of the service whose selection has been instructed, and supplies the resulting IP transmission format BBP stream to the Demux 213.
- the Demux 213 separates the IP transmission format BBP stream supplied from the tuner 212 into video data, audio data, caption data, file data, a control signal (signaling information in FIG. 7), and the like, and outputs them to the subsequent blocks.
- the Demux 213 includes a BBP filter 251, an IP filter 252, a UDP filter 253, an LCT filter 254, and an SGDU filter bank 255.
- the BBP filter 251 performs a filtering process based on the BBP header and supplies the LLS to the SGDU filter bank 255.
- the IP filter 252 performs a filtering process based on the IP header. Further, the UDP filter 253 performs a filtering process based on the UDP header.
- the LCT filter 254 performs a filtering process based on the LCT header.
- NTP is supplied to the clock generation unit 214
- SCS is supplied to the SGDU filter bank 255.
- video data, audio data, and caption data as components are supplied to the selection / synthesis unit 215, the selection / synthesis unit 216, and the selection / synthesis unit 217, respectively.
- Various file data are supplied to the FLUTE processing unit 218.
- the SGDU filter bank 255 performs filtering processing based on the SGDU header, and supplies LLS (for example, SCT, SAT, etc.) and SCS (for example, SPT, SDP, etc.) to the control unit 220 as appropriate.
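The header-by-header routing performed by the Demux 213 can be sketched as below. The dictionary keys and return strings are illustrative assumptions, not the actual packet format.

```python
# Hypothetical sketch of the Demux 213 filter chain: each filter
# inspects one protocol header (BBP, IP, UDP, LCT, SGDU) and routes
# the data to the appropriate downstream block.

def demux(packet):
    """Route a packet by its header fields (field names are illustrative)."""
    if packet.get("bbp") == "LLS":          # BBP filter 251
        return "SGDU filter bank"           # LLS goes to the SGDU filters
    if packet.get("lct") == "SCS":          # LCT filter 254
        return "SGDU filter bank"
    if packet.get("udp") == "NTP":          # UDP filter 253
        return "clock generator"
    category = packet.get("component")      # video / audio / subtitle
    if category:
        return f"selection/synthesis ({category})"
    return "FLUTE processing"               # remaining file data

assert demux({"bbp": "LLS"}) == "SGDU filter bank"
assert demux({"component": "video"}) == "selection/synthesis (video)"
assert demux({}) == "FLUTE processing"
```

The point of the sketch is only the routing decision: signaling (LLS/SCS) reaches the SGDU filter bank, NTP reaches the clock generator, and components reach their per-category selection/synthesis units.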
- the clock generator 214 generates a clock signal based on the NTP supplied from the Demux 213 according to the control from the control unit 220, and supplies the clock signal to the video decoder 224, the audio decoder 226, and the caption decoder 228.
- the FLUTE processing unit 218 restores video, audio, subtitles, NRT content, and application files from the file data supplied from the Demux 213 in accordance with control from the control unit 220.
- the FLUTE processing unit 218 records the restored video and audio file data in a storage 219 that is a large-capacity recording device such as an HDD (Hard Disk Drive).
- the FLUTE processing unit 218 supplies the video, audio, and subtitle file data restored as components to the selection / synthesis unit 215, the selection / synthesis unit 216, and the selection / synthesis unit 217, respectively.
- the file data may be data stored in the storage 219.
- the FLUTE processing unit 218 supplies the SCS supplied from the Demux 213 to the control unit 220.
- the SCS may be directly supplied from the Demux 213 to the control unit 220 without going through the FLUTE processing unit 218.
- the control unit 220 controls the operation of each unit constituting the receiving device 20 based on a control signal (signaling information in FIG. 7) supplied from the Demux 213 or the FLUTE processing unit 218.
- the NVRAM 221 is a nonvolatile memory, and records various data according to control from the control unit 220. For example, the control unit 220 records channel selection information obtained from the control signal (signaling information in FIG. 7) in the NVRAM 221. Then, the control unit 220 controls the channel selection process by the tuner 212 based on the channel selection information recorded in the NVRAM 221.
- the communication I / F 222 receives stream data of communication content that is stream-distributed via the Internet 90 from the distribution server 60 in accordance with control from the control unit 220, and supplies it to the Demux 223.
- the Demux 223 separates the stream data supplied from the communication I / F 222 into video data, audio data, and caption data as components, and sends them to the selection / synthesis unit 215, the selection / synthesis unit 216, and the selection / synthesis unit 217, respectively. Supply.
- the selection / synthesis unit 215 is supplied with video data (video component) from the Demux 213, the FLUTE processing unit 218, and the Demux 223.
- the selection / combination unit 215 performs selection / combination processing (for example, processing of each layer in the video component layer in FIG. 42 described later) on the video data (video components) in accordance with control from the control unit 220.
- the video data obtained as a result of the processing is supplied to the video decoder 224.
- the video decoder 224 decodes the video data supplied from the selection/synthesis unit 215 using a decoding method corresponding to the video encoder 112 (FIG. 38), based on the clock signal supplied from the clock generator 214 and in accordance with control from the control unit 220, and supplies the decoded data to the video output unit 225.
- the video output unit 225 outputs the video data supplied from the video decoder 224 to a subsequent display (not shown) according to the control from the control unit 220. Thereby, for example, a video of a television program is displayed on the display.
- the audio data (audio component) is supplied to the selection / synthesis unit 216 from the Demux 213, the FLUTE processing unit 218, and the Demux 223.
- the selection / synthesis unit 216 performs selection / synthesis processing (for example, processing of each layer in the audio component layer of FIG. 42 described later) on the audio data (audio component) in accordance with control from the control unit 220.
- the audio data obtained as a result of the processing is supplied to the audio decoder 226.
- the audio decoder 226 decodes the audio data supplied from the selection/synthesis unit 216 using a decoding method corresponding to the audio encoder 114 (FIG. 38), based on the clock signal supplied from the clock generator 214 and in accordance with control from the control unit 220, and supplies the decoded data to the audio output unit 227.
- the audio output unit 227 supplies the audio data supplied from the audio decoder 226 to a subsequent speaker (not shown) in accordance with the control from the control unit 220. Thereby, for example, audio synchronized with the video of the TV program is output from the speaker.
- Subtitle data (subtitle component) is supplied to the selection / synthesis unit 217 from the Demux 213, the FLUTE processing unit 218, and the Demux 223.
- the selection/synthesis unit 217 performs selection/synthesis processing on the caption data (subtitle component) in accordance with control from the control unit 220 (for example, processing similar to the processing of each layer in the video component layer of FIG. 42), and supplies the resulting caption data to the subtitle decoder 228.
- the subtitle decoder 228 decodes the subtitle data supplied from the selection/synthesis unit 217 using a decoding method corresponding to the subtitle encoder 116 (FIG. 38), based on the clock signal supplied from the clock generator 214 and in accordance with control from the control unit 220, and supplies the decoded data to the video output unit 225. When subtitle data is supplied from the subtitle decoder 228, the video output unit 225, in accordance with control from the control unit 220, synthesizes the subtitle data with the video data from the video decoder 224 and supplies the result to a subsequent display (not shown). Thereby, the subtitle synchronized with the video is displayed on the display together with the video of the television program.
- Although a configuration in which the selection/synthesis units 215 to 217 are provided in the preceding stage of each decoder is shown here, a configuration in which the selection/synthesis units 215 to 217 are provided in the subsequent stage of each decoder may be adopted, depending on the content of the selection/synthesis processing.
- the storage 219 is described as being built in, but an external storage may be used.
- the receiving device 20 may employ a configuration having a display and a speaker.
- step S111 when transmitting video data as a component, the video data acquisition unit 111 acquires video data to be transmitted and supplies it to the video encoder 112.
- step S112 the video encoder 112 encodes the video data supplied from the video data acquisition unit 111 and supplies the encoded video data to the Mux 121.
- step S113 the audio data acquisition unit 113 acquires audio data to be transmitted and supplies the audio data to the audio encoder 114 when transmitting audio data as a component.
- step S114 the audio encoder 114 encodes the audio data supplied from the audio data acquisition unit 113 and supplies the encoded audio data to the Mux 121.
- step S115 the caption data acquisition unit 115 acquires the caption data to be transmitted and supplies it to the caption encoder 116 when transmitting caption data as a component.
- step S116 the caption encoder 116 encodes the caption data supplied from the caption data acquisition unit 115 and supplies the encoded caption data to the Mux 121.
- step S117 the file data acquisition unit 117 acquires file data such as video, audio, subtitles, NRT content, application, and the like when transmitting data in the file format, and supplies the file data to the file processing unit 118.
- step S118 the file processing unit 118 performs predetermined file processing on the file data supplied from the file data acquisition unit 117 and supplies the file data to the Mux 121.
- step S119 the control signal acquisition unit 119 acquires a control signal (signaling information in FIG. 7) and supplies the control signal to the control signal processing unit 120.
- step S120 the control signal processing unit 120 performs predetermined signal processing on the control signal supplied from the control signal acquisition unit 119 and supplies the signal to the Mux 121.
- step S121 the Mux 121 multiplexes the video data from the video encoder 112, the audio data from the audio encoder 114, the subtitle data from the subtitle encoder 116, the file data from the file processing unit 118, and the control signal from the control signal processing unit 120 to generate an IP transmission format BBP stream, and supplies it to the transmission unit 122.
- step S122 the transmission unit 122 transmits the BBP stream supplied from the Mux 121 as a broadcast signal via the antenna 123. Then, when the process of step S122 ends, the transmission process ends.
- the transmission process has been described above.
- components such as video (Video) and audio (Audio) received by the receiving device 20 are organized into three layers: a selective layer (Selective Layer), a composite layer (Composite Layer), and an adaptive layer (Adaptive Layer). In this hierarchical structure, the composite layer is disposed as an upper layer of the adaptive layer, and the selective layer is disposed as an upper layer of the composite layer.
- one circular symbol with a distinct pattern represents a component distributed using broadcast (Broadcast Component), and the other represents a component distributed using communication.
- These components are distributed as so-called adaptive streaming, and a plurality of components having different bit rates are prepared.
- In the adaptive layer, a straight line that swings left and right on the dotted arc in the figure functions as a switch, so that one component is selected from a plurality of components.
- In the composite layer, which is an upper layer of the adaptive layer, a plurality of components adaptively selected in the adaptive layer are combined into one component. That is, the composite layer is a hierarchy for combining, in each component category, a plurality of components in the component group to be combined so that they function as one component (composite component).
- In the selective layer, which is the upper layer of the composite layer and the highest layer, a straight line that swings left and right on the dotted arc in the figure functions as a switch, so that one component is selected from a plurality of components. That is, the selective layer is a hierarchy for statically selecting, according to a predetermined selection method, one or a plurality of components from the component group to be fixedly selected in each component category.
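The three-layer model can be sketched as follows: the adaptive layer switches among bitrate variants, the composite layer merges the chosen variants into one component, and the selective layer fixes which component of the category is used. The data shapes below are assumptions made purely for illustration.

```python
# Hypothetical sketch of the three-layer component model of FIG. 42.
# A component is modeled as a list of streams; each stream offers
# several bitrate variants for adaptive switching.

def adaptive(variants, bandwidth):
    """Adaptive layer: pick the highest-bitrate variant that fits."""
    fitting = [v for v in variants if v["bitrate"] <= bandwidth]
    return max(fitting, key=lambda v: v["bitrate"])

def composite(streams, bandwidth):
    """Composite layer: combine the adaptively chosen variants."""
    return [adaptive(variants, bandwidth) for variants in streams]

def selective(components, choice):
    """Selective layer: statically pick one component of the category."""
    return components[choice]

video_options = [
    [[{"bitrate": 2}, {"bitrate": 8}]],                    # single-stream component
    [[{"bitrate": 1}], [{"bitrate": 1}, {"bitrate": 4}]],  # composite of two streams
]
chosen = selective(video_options, choice=1)
assert composite(chosen, bandwidth=5) == [{"bitrate": 1}, {"bitrate": 4}]
```

Note the layering order matches the text: the selective choice is made once, while the adaptive choice inside it may change dynamically as the available bandwidth changes.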
- the control unit 220 controls the tuner 212, based on the bootstrap information of the SCT held in the NVRAM 221 as channel selection information, so that the broadcast signal of the service to be selected is received.
- the control unit 220 collects and analyzes the SCS (MPD, SPT, SDP, USD) of the services to be selected separated by the Demux 213, and extracts the information elements.
- step S212 the control unit 220 specifies the first component category among all the component categories specified in the description of the SPT component.
- video is designated as the first component category.
- step S213 the control unit 220 selects a component by listing the component options of the selective layer with respect to the description of each component of the MPD within the range of the component category specified in the process of step S212.
- the names of component options may be displayed on the screen to allow the user to select them, or the receiving apparatus 20 may automatically select them according to compatible functions and initial settings.
- If the selected component is a composite component according to the description of the composite layer of the MPD, a specific component is selected from among the components constituting the composite component.
- step S214 the control unit 220 determines whether or not the component selected in the process of step S213 is the adaptive layer switching target based on the description of the adaptive layer of the MPD.
- If it is determined in step S214 that the component is an adaptive layer switching target, the process proceeds to step S215.
- step S215 the control unit 220 controls the selection / synthesis units 215 to 217 to select a predetermined default component from among the adaptive component selection targets.
- If it is determined in step S214 that the component is not an adaptive layer switching target, the process in step S215 is skipped, and the process proceeds to step S216.
- step S216 the control unit 220 refers to the SPT or USD, using a representation ID that commonly identifies a specific component, for the component narrowed down to one in the component category being designated, and checks whether the component is distributed by broadcast or by communication.
- If it is determined in step S216 that the component is distributed by broadcast, the process proceeds to step S217.
- step S217 the control unit 220 refers to the SPT or USD to determine whether the component is broadcast-distributed within the same service or within a different service. In this determination, for example, whether the IP address is the same as that of the acquired SCS may be confirmed in order to decide whether the component is within the same service, that is, whether it is in-band or out-of-band; if the IP address differs from that of the acquired SCS, the component belongs to a different service.
- If it is determined in step S217 that the component is broadcast-distributed within the same service, that is, broadcast-distributed in-band, the process proceeds to step S218.
- step S218 for the component narrowed down to one, the control unit 220 acquires the IP address, the port number, the TSI obtained by referring to the SDP, and the TOI obtained by referring to the USD from the segment URL specified by the MPD. The control unit 220 then controls the FLUTE processing unit 218 so that the component within the same service can be acquired by designating the segment specified by the TOI from the FLUTE session specified by the IP address, the port number, and the TSI.
- If it is determined in step S217 that the component is broadcast-distributed within a different service, that is, broadcast-distributed out-of-band, the process proceeds to step S219.
- step S219 the control unit 220 controls the tuner 212, based on the bootstrap information of the SCT held in the NVRAM 221 as channel selection information, so that the broadcast signal of the other service specified by the Associated Service Descriptor of the SPT is received. In addition, the control unit 220 acquires and analyzes the SDP and USD of the other service separated by the Demux 213, and extracts their information elements.
- Further, for the component narrowed down to one, the control unit 220 acquires the IP address, the port number, the TSI obtained by referring to the SDP, and the TOI obtained by referring to the USD from the segment URL specified by the MPD of the original service. The control unit 220 then controls the FLUTE processing unit 218 so that the component within the different service can be acquired by designating the segment specified by the TOI from the FLUTE session specified by the IP address, the port number, and the TSI.
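The broadcast acquisition steps above amount to a lookup: a FLUTE session is addressed by the triple (IP address, port number, TSI), and the segment within it by the TOI resolved via the MPD's segment URL. The session table below is a hypothetical stand-in for the state held by the FLUTE processing unit 218.

```python
# Hypothetical sketch of component acquisition in steps S218/S219:
# locate the FLUTE session by (IP address, port, TSI), then the
# segment inside it by TOI.

def flute_key(ip, port, tsi):
    """A FLUTE session is uniquely identified by this triple."""
    return (ip, port, tsi)

def fetch_segment(sessions, ip, port, tsi, toi):
    """Return the segment designated by TOI, or None if absent."""
    session = sessions.get(flute_key(ip, port, tsi))
    if session is None:
        return None                      # session not carried on this service
    return session.get(toi)

sessions = {("239.0.0.1", 3000, 1): {10: b"video-seg-10"}}
assert fetch_segment(sessions, "239.0.0.1", 3000, 1, 10) == b"video-seg-10"
assert fetch_segment(sessions, "239.0.0.2", 3000, 1, 10) is None
```

Whether the session table is populated from the same service (in-band) or from another tuned service (out-of-band) only changes where `sessions` comes from, not the lookup itself.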
- If it is determined in step S216 that the component is distributed by communication, the process proceeds to step S220.
- step S220 the control unit 220 acquires the segment URL specified by the MPD for the component narrowed down to one. Then, the control unit 220 controls the communication I / F 222, accesses the distribution server 60 according to the segment URL, and acquires components via the Internet 90.
- step S221 the control unit 220 determines whether or not the component is a composite component based on the description of the composite layer of the MPD.
- If it is determined in step S221 that the component is a composite component, the process proceeds to step S222.
- step S222 the control unit 220 determines whether the acquisition of all components constituting the composite component has been completed. If the acquisition is not yet completed in step S222, that is, if there are still components to be acquired, the process proceeds to step S223.
- step S223 the control unit 220 selects the next component to be acquired based on the description of the MPD composite layer.
- When the process of step S223 ends, the process returns to step S214, and the subsequent processes are repeated. Thereby, the components constituting the composite component are acquired one after another.
- step S224 the control unit 220 controls the operation of each unit and performs the decoding / presentation processing of the composite component.
- For example, the control unit 220 controls the selection/synthesis unit 215, the video decoder 224, and the video output unit 225 to synchronize and synthesize the components constituting the composite component, after which decoding is performed and presentation of the video corresponding to the composite component is started.
- Further, for example, the control unit 220 controls the selection/synthesis unit 216, the audio decoder 226, and the audio output unit 227 to synchronize and synthesize the components constituting the composite component, after which decoding is performed and output of the audio corresponding to the composite component is started.
- step S225 the control unit 220 controls the operation of each unit, and performs the decoding / presentation processing of the single component.
- For example, when the component category being designated is video and a broadcast-distributed component has been acquired, the control unit 220 controls the selection/synthesis unit 215, the video decoder 224, and the video output unit 225 to decode the component and start presentation of the video. Further, for example, when the component category being designated is audio and a broadcast-distributed component has been acquired, the control unit 220 controls the selection/synthesis unit 216, the audio decoder 226, and the audio output unit 227 to decode the component and start output of the audio.
- When the process of step S224 or S225 ends, the process proceeds to step S226.
- step S226 the control unit 220 determines whether all the component categories specified in the description of the components of the SPT have been designated, that is, whether the current category is the last component category.
- If it is determined in step S226 that the current category is not the last component category, the process proceeds to step S227.
- step S227 the control unit 220 designates the next component category among all the component categories designated by the description of the SPT component.
- audio is designated as the next component category.
- When the process of step S227 ends, the process returns to step S213, and the subsequent processes are repeated. Thereby, the component of the next component category (for example, audio) is acquired and decoded, and its presentation (output) is started. If it is determined in step S226 that the current category is the last component category, the reception process ends. The reception process has been described above.
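The overall reception flow of steps S212 to S227 can be sketched as a loop over component categories. The callback names below are assumptions used only to mark where each step would plug in.

```python
# Hypothetical sketch of the reception process: for each component
# category named in the SPT, a component is selected, its route
# (broadcast or communication) is resolved, the component is
# acquired, and decoding/presentation is started.

def run_reception(categories, select, resolve_route, acquire, present):
    for category in categories:            # S212 / S227: iterate categories
        component = select(category)       # S213: selective-layer choice
        route = resolve_route(component)   # S216/S217: broadcast or comm.
        data = acquire(component, route)   # S218/S219/S220: fetch data
        present(category, data)            # S224/S225: decode and present

log = []
run_reception(
    ["video", "audio"],
    select=lambda cat: f"{cat}-comp",
    resolve_route=lambda comp: "broadcast",
    acquire=lambda comp, route: f"{comp}@{route}",
    present=lambda cat, data: log.append((cat, data)),
)
assert log == [("video", "video-comp@broadcast"), ("audio", "audio-comp@broadcast")]
```

Composite components would add an inner loop over their constituent components (steps S221 to S223), which is omitted here for brevity.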
- the series of processes described above can be executed by hardware or software.
- a program constituting the software is installed in the computer.
- Here, the computer includes, for example, a computer incorporated in dedicated hardware, and a general-purpose personal computer capable of executing various functions by installing various programs.
- FIG. 43 is a block diagram showing an example of the hardware configuration of a computer that executes the above-described series of processing by a program.
- In the computer 900, a CPU (Central Processing Unit) 901, a ROM (Read Only Memory) 902, and a RAM (Random Access Memory) 903 are connected to one another via a bus 904.
- An input / output interface 905 is further connected to the bus 904.
- An input unit 906, an output unit 907, a recording unit 908, a communication unit 909, and a drive 910 are connected to the input / output interface 905.
- the input unit 906 includes a keyboard, a mouse, a microphone, and the like.
- the output unit 907 includes a display, a speaker, and the like.
- the recording unit 908 includes a hard disk, a nonvolatile memory, and the like.
- the communication unit 909 includes a network interface or the like.
- the drive 910 drives a removable medium 911 such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory.
- In the computer 900 configured as described above, the CPU 901 loads the program stored in the recording unit 908 into the RAM 903 via the input/output interface 905 and the bus 904 and executes it, whereby the above-described series of processing is performed.
- the program executed by the computer 900 can be provided by being recorded on a removable medium 911 as a package medium, for example.
- the program can be provided via a wired or wireless transmission medium such as a local area network, the Internet, or digital satellite broadcasting.
- the program can be installed in the recording unit 908 via the input / output interface 905 by installing the removable medium 911 in the drive 910. Further, the program can be received by the communication unit 909 via a wired or wireless transmission medium and installed in the recording unit 908. In addition, the program can be installed in the ROM 902 or the recording unit 908 in advance.
- the program executed by the computer 900 may be a program in which processing is performed in time series in the order described in this specification, or a program in which processing is performed in parallel or at necessary timing, such as when a call is made.
- the processing steps describing the program for causing the computer 900 to perform various processes do not necessarily have to be processed in time series in the order described in the flowcharts, and include processes executed in parallel or individually (for example, parallel processing or object-based processing).
- the program may be processed by one computer, or may be processed in a distributed manner by a plurality of computers. Furthermore, the program may be transferred to a remote computer and executed.
- in this specification, a system means a set of a plurality of components (devices, modules (parts), etc.), and it does not matter whether all the components are in the same housing. Accordingly, a plurality of devices housed in separate housings and connected via a network, and a single device in which a plurality of modules are housed in one housing, are both systems.
- the embodiments of the present technology are not limited to the above-described embodiments, and various modifications can be made without departing from the gist of the present technology.
- the present technology can take a configuration of cloud computing in which one function is shared by a plurality of devices via a network and jointly processed.
- each step described in the above flowchart can be executed by one device or can be shared by a plurality of devices. Further, when a plurality of processes are included in one step, the plurality of processes included in the one step can be executed by being shared by a plurality of apparatuses in addition to being executed by one apparatus.
- Note that the present technology can also take the following configurations.
- The receiving device according to (1), wherein the distribution form of the component is broadcast distribution or communication distribution.
- The receiving device according to (2), wherein the control signal is transmitted for each service, and the broadcast distribution includes a first broadcast distribution in which the component is distributed within the same service as the control signal of the selected service, and a second broadcast distribution in which the component is distributed within a service different from that of the control signal of the selected service.
- The receiving device according to any one of (1) to (3), wherein the control signal is composed of a plurality of pieces of signaling information, and the component is identified by a common ID in each piece of signaling information.
- each segment in the FLUTE session stores data of a component of a specific category.
- The receiving device according to (5), wherein each segment in the FLUTE session stores data of components of a plurality of categories, and the control signal includes identification information for identifying the category of the component.
- The receiving device according to any one of (1) to (7), wherein the control signal includes, as signaling information, a table in which parameters relating to at least one of the services and the components constituting the services are described, and the table describes a component ID and the category of the component as parameters relating to the component.
- The receiving device according to any one of (1) to (8), wherein the control signal is transmitted in a layer higher than the IP layer among the protocol layers of the IP transmission method, and a common IP address is assigned to the components and the control signal that constitute each service.
- A receiving method for a receiving device, including the steps of: receiving a broadcast wave of digital broadcasting using an IP transmission method; acquiring, based on information that is included in a control signal transmitted by the broadcast wave and that indicates the distribution form of the components constituting various services, the components constituting a selected service in accordance with their distribution form; and controlling the operation of each unit that performs predetermined processing relating to the acquired components.
- a first acquisition unit that acquires one or more components constituting various services;
- a second acquisition unit that acquires a control signal including information indicating a distribution form of the component;
- a transmission apparatus comprising: a transmission unit that transmits a broadcast wave that uses an IP transmission method and includes the component that constitutes the service and the control signal.
- (12) The transmission device according to (11), wherein the distribution form of the component is broadcast distribution or communication distribution.
- (13) The transmission device according to (12), wherein the control signal is transmitted for each service, and the broadcast distribution includes a first broadcast distribution in which the component is distributed within the same service as the control signal of the selected service, and a second broadcast distribution in which the component is distributed within a service different from that of the control signal of the selected service.
- (14) The transmission device according to any one of (11) to (13), wherein the control signal is composed of a plurality of pieces of signaling information, and the component is identified by a common ID in each piece of signaling information.
- (15) The transmission device according to any one of (11) to (14), wherein the component has a file format and is transmitted by a FLUTE session.
- each segment in the FLUTE session stores data of a component of a specific category.
- each segment in the FLUTE session stores data of components of multiple categories,
- the control signal includes a table in which parameters relating to at least one of various services and components constituting the service are described as signaling information,
- the control signal is transmitted in a layer higher than the IP layer among the protocol layers in the IP transmission method,
- the transmission apparatus according to any one of (11) to (18), wherein a common IP address is assigned to the component and the control signal that constitute each service.
- the transmitting device acquires one or more components that constitute various services, and acquires a control signal including information indicating a distribution form of the components,
- a transmission method including a step of transmitting a broadcast wave using an IP transmission method and including the component constituting the service and the control signal.
- 1 broadcast communication cooperation system, 10 transmitting device, 20 receiving device, 60 distribution server, 90 internet, 111 video data acquisition unit, 113 audio data acquisition unit, 115 subtitle data acquisition unit, 117 file data acquisition unit, 119 control signal acquisition unit, 122 transmission unit, 212 tuner, 220 control unit, 222 communication I/F, 900 computer, 901 CPU
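The receiver-side control flow described in the claims above, in which a control signal indicates each component's distribution form and the receiver acquires the component over the broadcast wave or via communication accordingly, could be sketched as follows. This is a minimal illustrative sketch: the class names, the `ComponentInfo` structure, and the string acquisition paths are assumptions for exposition, not taken from the publication.

```python
from dataclasses import dataclass
from enum import Enum


class DistributionForm(Enum):
    """Distribution forms named in the claims: over the broadcast
    wave (broadcast distribution) or via the network (communication
    distribution)."""
    BROADCAST = "broadcast"
    COMMUNICATION = "communication"


@dataclass
class ComponentInfo:
    component_id: str        # common ID shared across signaling information (claim 14)
    category: str            # e.g. "video", "audio", "subtitle"
    form: DistributionForm   # distribution form indicated by the control signal


def acquire_components(control_signal: list[ComponentInfo]) -> dict[str, str]:
    """Route each component of the selected service to the acquisition
    path that matches its distribution form."""
    acquired = {}
    for info in control_signal:
        if info.form is DistributionForm.BROADCAST:
            # Acquire from the broadcast wave via the tuner.
            acquired[info.component_id] = f"tuner:{info.category}"
        else:
            # Acquire from the distribution server via the communication I/F.
            acquired[info.component_id] = f"communication:{info.category}"
    return acquired
```

A receiver selecting a service would pass the parsed control signal to `acquire_components` and hand each resulting path to the unit (video decoder, audio decoder, subtitle renderer) that performs the predetermined processing on that component.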
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
Abstract
The present invention relates to a reception device, a reception method, a transmission device, and a transmission method designed to adapt flexibly to different operation modes in digital broadcasting into which an IP transmission system is introduced. A reception device is provided with: a reception unit for receiving a broadcast wave of digital broadcasting that uses an IP transmission system; and a control unit that, on the basis of information that is included in a control signal transmitted by the broadcast wave and that indicates the form in which the components constituting various services are distributed, acquires the components constituting a selected service according to that distribution form, and controls the operation of each unit that performs a prescribed process relating to the acquired components. The present invention can be applied, for example, to a television receiver.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2014018744 | 2014-02-03 | ||
| JP2014-018744 | 2014-02-03 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2015115253A1 true WO2015115253A1 (fr) | 2015-08-06 |
Family
ID=53756830
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2015/051443 Ceased WO2015115253A1 (fr) | 2014-02-03 | 2015-01-21 | Dispositif de réception, procédé de réception, dispositif de transmission et procédé de transmission |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2015115253A1 (fr) |
Cited By (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2017517181A (ja) * | 2014-04-09 | 2017-06-22 | エルジー エレクトロニクス インコーポレイティド | 放送伝送装置、放送受信装置、放送伝送装置の動作方法及び放送受信装置の動作方法 |
| JP2017518662A (ja) * | 2014-06-25 | 2017-07-06 | エルジー エレクトロニクス インコーポレイティド | 放送信号送信装置、放送信号受信装置、放送信号送信方法、及び放送信号受信方法 |
- 2015-01-21: WO PCT/JP2015/051443 patent/WO2015115253A1/fr not_active Ceased
Non-Patent Citations (1)
| Title |
|---|
| "Enhanced MBMS Operation", 3GPP TR 26.848 V0.7.0 (2014-01), 24 January 2014 (2014-01-24), pages 8 - 13, XP055217164, Retrieved from the Internet <URL:http://www.3gpp.org/ftp/TSG_SA/WG4_CODEC/TSGS4_77/Docs/S4-140218.zip(TR26848v0.7.0-EMO-cl.doc> [retrieved on 20150414] * |
Cited By (7)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2017517181A (ja) * | 2014-04-09 | 2017-06-22 | エルジー エレクトロニクス インコーポレイティド | 放送伝送装置、放送受信装置、放送伝送装置の動作方法及び放送受信装置の動作方法 |
| US10694259B2 (en) | 2014-04-09 | 2020-06-23 | Lg Electronics Inc. | Broadcast transmission device, broadcast reception device, operating method of broadcast transmission device, and operating method of broadcast reception device |
| US11166083B2 (en) | 2014-04-09 | 2021-11-02 | Lg Electronics Inc. | Broadcast transmission device, broadcast reception device, operating method of broadcast transmission device, and operating method of broadcast reception device |
| JP2017518662A (ja) * | 2014-06-25 | 2017-07-06 | エルジー エレクトロニクス インコーポレイティド | 放送信号送信装置、放送信号受信装置、放送信号送信方法、及び放送信号受信方法 |
| US10158678B2 (en) | 2014-06-25 | 2018-12-18 | Lg Electronics Inc. | Broadcast signal transmission device, broadcast signal receiving device, broadcast signal transmission method and broadcast signal receiving method |
| US10880339B2 (en) | 2014-06-25 | 2020-12-29 | Lg Electronics Inc. | Broadcast signal transmission device, broadcast signal receiving device, broadcast signal transmission method and broadcast signal receiving method |
| US11323490B2 (en) | 2014-06-25 | 2022-05-03 | Lg Electronics Inc. | Broadcast signal transmission device, broadcast signal receiving device, broadcast signal transmission method and broadcast signal receiving method |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US10129308B2 (en) | Session description information for over-the-air broadcast media data | |
| US10623827B2 (en) | Receiving device, receiving method, transmitting device, and transmitting method | |
| US11374993B2 (en) | Reception device, reception method, transmission device, and transmission method | |
| JP7663850B2 (ja) | 受信方法、及び、送信方法 | |
| JP6743704B2 (ja) | 送信装置、送信方法、受信装置および受信方法 | |
| JP6643246B2 (ja) | 受信装置、受信方法、送信装置、及び、送信方法 | |
| US20190342862A1 (en) | Reception apparatus, reception method, transmission apparatus, and transmission method | |
| US20200336526A1 (en) | Reception device, reception method, transmission device, and transmission method for distributing signaling information | |
| WO2015060148A1 (fr) | Appareil de réception, procédé de réception, appareil de transmission et procédé de transmission | |
| US11622088B2 (en) | Reception apparatus, transmission apparatus, and data processing method | |
| US20190342110A1 (en) | Reception apparatus, reception method, transmission apparatus, and transmission method | |
| WO2015115253A1 (fr) | Dispositif de réception, procédé de réception, dispositif de transmission et procédé de transmission | |
| WO2015178221A1 (fr) | Dispositif de réception, procédé de réception, dispositif d'émission et procédé d'émission | |
| JP2020025325A (ja) | 受信装置、送信装置、及び、データ処理方法 | |
| WO2015107930A1 (fr) | Dispositif de réception, procédé de réception, dispositif de transmission et procédé de transmission |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 15743898; Country of ref document: EP; Kind code of ref document: A1 |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | NENP | Non-entry into the national phase | Ref country code: JP |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 15743898; Country of ref document: EP; Kind code of ref document: A1 |