WO2010041905A2 - Receiving system and data processing method - Google Patents
Receiving system and data processing method
- Publication number
- WO2010041905A2 (PCT/KR2009/005800)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- information
- field indicating
- received
- field
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/10—Processing, recording or transmission of stereoscopic or multi-view image signals
- H04N13/106—Processing image signals
- H04N13/172—Processing image signals image signals comprising non-image signal components, e.g. headers or format information
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/20—Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
- H04N21/23—Processing of content or additional data; Elementary server operations; Server middleware
- H04N21/234—Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs
- H04N21/2343—Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs involving reformatting operations of video signals for distribution or compliance with end-user requests or end-user device requirements
- H04N21/234327—Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs involving reformatting operations of video signals for distribution or compliance with end-user requests or end-user device requirements by decomposing into layers, e.g. base layer and one or more enhancement layers
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G5/00—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
- G09G5/003—Details of a display terminal, the details relating to the control arrangement of the display terminal and to the interfaces thereto
- G09G5/006—Details of the interface to the display terminal
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/10—Processing, recording or transmission of stereoscopic or multi-view image signals
- H04N13/106—Processing image signals
- H04N13/161—Encoding, multiplexing or demultiplexing different image signal components
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/10—Processing, recording or transmission of stereoscopic or multi-view image signals
- H04N13/106—Processing image signals
- H04N13/172—Processing image signals image signals comprising non-image signal components, e.g. headers or format information
- H04N13/178—Metadata, e.g. disparity information
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/10—Processing, recording or transmission of stereoscopic or multi-view image signals
- H04N13/189—Recording image signals; Reproducing recorded image signals
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/10—Processing, recording or transmission of stereoscopic or multi-view image signals
- H04N13/194—Transmission of image signals
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/434—Disassembling of a multiplex stream, e.g. demultiplexing audio and video streams, extraction of additional data from a video stream; Remultiplexing of multiplex streams; Extraction or processing of SI; Disassembling of packetised elementary stream
- H04N21/4345—Extraction or processing of SI, e.g. extracting service information from an MPEG stream
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/438—Interfacing the downstream path of the transmission network originating from a server, e.g. retrieving encoded video stream packets from an IP network
- H04N21/4382—Demodulation or channel decoding, e.g. QPSK demodulation
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/45—Management operations performed by the client for facilitating the reception of or the interaction with the content or administrating data related to the end-user or to the client device itself, e.g. learning user preferences for recommending movies, resolving scheduling conflicts
- H04N21/462—Content or additional data management, e.g. creating a master electronic program guide from data received from the Internet and a Head-end, controlling the complexity of a video stream by scaling the resolution or bit-rate based on the client capabilities
- H04N21/4621—Controlling the complexity of the content stream or additional data, e.g. lowering the resolution or bit-rate of the video stream for a mobile client with a small screen
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/80—Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
- H04N21/81—Monomedia components thereof
- H04N21/8146—Monomedia components thereof involving graphical data, e.g. 3D object, 2D graphics
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2340/00—Aspects of display data processing
- G09G2340/02—Handling of images in compressed format, e.g. JPEG, MPEG
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2370/00—Aspects of data communication
- G09G2370/04—Exchange of auxiliary data, i.e. other than image data, between monitor and graphics controller
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2370/00—Aspects of data communication
- G09G2370/12—Use of DVI or HDMI protocol in interfaces along the display data pipeline
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G3/00—Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
- G09G3/001—Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes using specific devices not provided for in groups G09G3/02 - G09G3/36, e.g. using an intermediate record carrier such as a film slide; Projection systems; Display of non-alphanumerical information, solely or in combination with alphanumerical information, e.g. digital display on projected diapositive as background
- G09G3/003—Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes using specific devices not provided for in groups G09G3/02 - G09G3/36, e.g. using an intermediate record carrier such as a film slide; Projection systems; Display of non-alphanumerical information, solely or in combination with alphanumerical information, e.g. digital display on projected diapositive as background to produce spatial visual effects
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N2213/00—Details of stereoscopic systems
- H04N2213/003—Aspects relating to the "2D+depth" image format
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N2213/00—Details of stereoscopic systems
- H04N2213/007—Aspects relating to detection of stereoscopic image format, e.g. for adaptation to the display format
Definitions
- the present invention relates to an apparatus and method for processing an image signal, and more particularly, to a receiving system and a data processing method for receiving and processing a three-dimensional image signal.
- A three-dimensional (3D) image (or stereoscopic image) is based on the stereo vision principle of the two eyes.
- The parallax between the two eyes, that is, the binocular parallax caused by the roughly 65 mm distance between them, is an important factor in producing a sense of depth.
- Such 3D image display methods are largely classified into stereoscopic, volumetric, and holographic methods.
- An object of the present invention is to provide a receiving system and a data processing method for enabling the receiving system to recognize and process the reception of 3D images.
- Another object of the present invention is to provide a receiving system and a data processing method that deliver additional information about a received 3D image, carried in system information, to a display device connected via a digital interface.
- a receiving system includes an image receiving unit and a display unit.
- the image receiver may be a decoding device (or HDMI source), and the display unit may be a display device (or HDMI sink).
- The image receiving unit receives a 3D image and system information that includes additional information about the 3D image, generates 3D signaling information based on that additional information, and transmits the 3D signaling information together with the 3D image over a digital interface.
- the display unit receives the 3D image and the 3D signaling information through the digital interface, and formats and displays the 3D image based on the received 3D signaling information.
- the 3D image additional information is received in the form of a descriptor in the program map table (PMT) of the system information.
- The 3D image additional information includes a field indicating whether a 3D image is received, a field indicating the transmission format of the 3D image, a field indicating whether the leftmost top pixel in the received frame belongs to the left image or the right image, a field indicating whether at least one of the left and right images was inversely scanned before coding, a field indicating which of the left and right images was inversely scanned, and a field indicating whether at least one of the left and right images was sampled using a filter.
- The video receiver inserts the 3D signaling information generated from the 3D video additional information into an AVI InfoFrame packet and transmits it.
- the AVI InfoFrame packet is composed of a header and a content area, and one or more fields are allocated to at least one byte of the content area to record the 3D signaling information.
- At least one byte of the content area of the AVI InfoFrame packet includes a field indicating whether a 3D image is received, a field indicating the transmission format of the 3D image, a field indicating whether the leftmost top pixel in the received frame belongs to the left image or the right image, a field indicating which of the left and right images was inversely scanned, and a field indicating whether at least one of the left and right images was sampled using a filter.
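The exact bit layout of these fields within the InfoFrame content bytes is defined later (see FIG. 10 and FIG. 11); as a minimal sketch, assuming hypothetical bit positions within a single content byte:

```python
def pack_3d_signaling_byte(is_3d, composition_type, lr_first,
                           image0_flipped, quincunx_filtered):
    """Pack 3D signaling flags into one InfoFrame content byte.

    Bit positions are hypothetical, chosen only for illustration:
      bit 7     : 3D image received
      bits 6..3 : transmission (composition) format code
      bit 2     : leftmost top pixel belongs to the right image
      bit 1     : image0 is the flipped (mirrored) image
      bit 0     : quincunx filter was used for subsampling
    """
    if not 0 <= composition_type <= 0xF:
        raise ValueError("composition_type must fit in 4 bits")
    byte = (int(is_3d) << 7) | (composition_type << 3)
    byte |= (int(lr_first) << 2) | (int(image0_flipped) << 1)
    byte |= int(quincunx_filtered)
    return byte

# Side-by-side (0x01) 3D image, leftmost top pixel in the left image:
packed = pack_3d_signaling_byte(True, 0x01, False, False, False)
```

The display device would apply the inverse unpacking to recover the same flags.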
- A data processing method of a receiving system includes receiving a 3D image and system information that includes additional information about the 3D image, generating 3D signaling information based on that additional information and transmitting it through a digital interface; and receiving the 3D image and the 3D signaling information through the digital interface, and formatting and displaying the 3D image based on the received 3D signaling information.
- The present invention obtains 3D video additional information from the PMT received by the decoding device, generates 3D signaling information from it, inserts that information into an AVI InfoFrame packet, and outputs the packet to the display device over a digital interface cable. This allows the display device to accurately reconstruct and display the 3D image.
- FIG. 1 illustrates examples of a single video stream format among 3D video transmission formats according to the present invention.
- FIG. 2 is a diagram illustrating examples of a multi video stream format among 3D video transmission formats according to the present invention.
- FIG. 3 is a diagram illustrating an exemplary embodiment of a PMT syntax structure in which identification information for recognizing reception of a 3D image is included in a descriptor form according to the present invention.
- FIG. 4 illustrates an embodiment of a syntax structure of a stereoscopic video format descriptor according to the present invention.
- FIG. 5 is a schematic diagram of a 3D imaging system according to an embodiment of the present invention.
- FIG. 6 is a block diagram illustrating an example in which a decoding device and a display device in a receiving system are connected through an HDMI cable according to the present invention.
- FIG. 7 is a table showing examples of various packet types used in the HDMI standard according to the present invention.
- FIG. 8 illustrates an embodiment of a header structure of an AVI InfoFrame packet according to the present invention.
- FIG. 9 illustrates an embodiment of a content structure of a general AVI InfoFrame packet.
- FIG. 10 illustrates an embodiment of a content structure of an AVI InfoFrame packet according to the present invention.
- FIG. 11 shows tables giving the meanings of the values assigned to each field of the fifteenth byte of the AVI InfoFrame packet contents according to an embodiment of the present invention.
- FIG. 12 is a flowchart illustrating an embodiment of a process of generating and transmitting 3D signaling information according to the present invention.
- FIG. 13 is a flowchart illustrating an embodiment of a process of processing video data and displaying a 3D image with reference to 3D signaling information according to the present invention.
- The present invention enables the receiving system to recognize and process received 3D images.
- To this end, the transmission system of the present invention includes additional information about a 3D image in the system information and transmits it.
- The receiving system of the present invention decodes the 3D image using the 3D image additional information received in the system information.
- The receiving system also generates 3D signaling information from the 3D image additional information received in the system information and provides it to a display device connected via a digital interface.
- The display device then processes and displays the 3D image based on the 3D signaling information provided through the digital interface.
- the 3D image may include a stereo (or stereoscopic) image considering two viewpoints and a multiview image considering three or more viewpoints.
- the stereo image refers to a pair of left and right images obtained by photographing the same subject with a left camera and a right camera spaced apart from each other by a certain distance.
- the multi-view image refers to three or more images obtained by photographing the same subject with three or more cameras having a constant distance or angle.
- the transmission format of the stereo video includes a single video stream format and a multi video stream format.
- The single video stream format includes side by side (FIG. 1(a)), top/bottom (FIG. 1(b)), interlaced (FIG. 1(c)), frame sequential (FIG. 1(d)), checker board (FIG. 1(e)), anaglyph (FIG. 1(f)), and the like.
- In the side by side format of FIG. 1(a), the left and right images are each 1/2 subsampled in the horizontal direction, and the sampled left image is placed on the left side and the sampled right image on the right side to create one stereo image.
- In the top/bottom format of FIG. 1(b), the left and right images are each 1/2 subsampled in the vertical direction, and the sampled left image is placed at the top and the sampled right image at the bottom to create one stereo image.
- In the interlaced format of FIG. 1(c), the left and right images are each 1/2 subsampled in the vertical direction, and the lines of the sampled left image and the lines of the sampled right image are placed alternately, line by line, to create one stereo image.
- Alternatively, the left and right images are each 1/2 subsampled in the horizontal direction, and the pixels of the sampled left image and the pixels of the sampled right image are placed alternately, pixel by pixel, to create one stereo image.
- In the checker board format of FIG. 1(e), the left and right images are each 1/2 subsampled in both the vertical and horizontal directions, and the pixels of the sampled left image and the pixels of the sampled right image are placed alternately, pixel by pixel, to create one stereo image.
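The side by side construction described above can be sketched as follows. This is a toy illustration using nested lists of pixel values; a real system operates on decoded video planes and may apply a sampling filter rather than simply dropping columns:

```python
def subsample_horizontal(image):
    """Keep every other column (1/2 horizontal subsampling, no filtering)."""
    return [row[::2] for row in image]

def make_side_by_side(left, right):
    """Place the subsampled left image on the left half and the
    subsampled right image on the right half of one stereo frame."""
    left_half = subsample_horizontal(left)
    right_half = subsample_horizontal(right)
    return [l + r for l, r in zip(left_half, right_half)]

left = [[1, 2, 3, 4], [5, 6, 7, 8]]          # toy 2x4 "left image"
right = [[9, 10, 11, 12], [13, 14, 15, 16]]  # toy 2x4 "right image"
frame = make_side_by_side(left, right)       # one 2x4 stereo frame
```

The top/bottom format is the same idea with rows instead of columns.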
- The full left/right format of FIG. 2(a) is a case in which the full left and right images are each transmitted sequentially.
- The full left/half right format of FIG. 2(b) is a case in which the left image is transmitted as-is and the right image is transmitted after 1/2 subsampling in the vertical or horizontal direction.
- The 2D video/depth format of FIG. 2(c) is a case in which one of the left and right images is transmitted together with depth information for generating the other image.
- A receiving system capable of processing 3D images should be able to recognize when a 3D image is received.
- Since 3D video has various transmission formats, the receiving system also needs to know the transmission format of a received 3D video in order to reconstruct the original video.
- the 3D video additional information is included in system information and transmitted.
- The receiving system of the present invention decodes and displays the 3D image using the 3D image additional information included in the received system information.
- The system information is sometimes called service information.
- the system information may include channel information, program information, event information, and the like.
- The system information may be, for example, PSI/PSIP (Program Specific Information / Program and System Information Protocol), but the present invention is not limited thereto; any protocol that transmits system information in a table format is applicable regardless of its name.
- The PSI is an MPEG-2 system standard defined for classifying channels and programs.
- the PSIP is an Advanced Television Systems Committee (ATSC) standard for classifying channels and programs.
- the PSI may include, as an embodiment, a Program Association Table (PAT), a Conditional Access Table (CAT), a Program Map Table (PMT), and a Network Information Table (NIT).
- the PAT is special information transmitted by a packet having a PID of '0', and transmits PID information of a corresponding PMT and PID information of a NIT for each program.
- the CAT transmits information about the pay broadcasting system used on the transmitting side.
- The PMT transmits the program identification number, the PIDs of the transport stream packets carrying the individual bit streams (such as video and audio) constituting the program, and the PID carrying the PCR.
- The NIT transmits information about the actual transmission network. For example, the PAT (carried with PID 0) is parsed to find the program number and the PID of the PMT; parsing the PMT obtained from the PAT then reveals the correlation between the components constituting the program.
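The PAT-to-PMT lookup described above can be sketched as follows, with the tables modeled as plain dictionaries rather than parsed MPEG-2 sections (all PID values here are illustrative, not taken from the patent):

```python
# Toy model: the PAT maps program numbers to PMT PIDs, and each PMT
# lists its elementary streams as (stream_type, elementary_PID) pairs.
PAT_PID = 0x0000  # the PAT is always carried in packets with PID 0

pat = {1: 0x0100}  # program_number -> PMT PID (illustrative values)
pmt_sections = {
    0x0100: {
        "pcr_pid": 0x0101,
        "streams": [
            {"stream_type": 0x02, "elementary_pid": 0x0101},  # MPEG-2 video
            {"stream_type": 0x04, "elementary_pid": 0x0102},  # MPEG audio
        ],
    }
}

def find_video_pid(program_number):
    """Follow PAT -> PMT to find the PID of the MPEG-2 video ES."""
    pmt_pid = pat[program_number]
    pmt = pmt_sections[pmt_pid]
    for stream in pmt["streams"]:
        if stream["stream_type"] == 0x02:  # MPEG-2 encoded video
            return stream["elementary_pid"]
    return None

video_pid = find_video_pid(1)
```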
- The PSIP may include a virtual channel table (VCT), a system time table (STT), a rating region table (RRT), an extended text table (ETT), a direct channel change table (DCCT), a direct channel change selection code table (DCCSCT), an event information table (EIT), and a master guide table (MGT).
- the VCT transmits information about a virtual channel, for example, channel information for channel selection and packet identifier (PID) for receiving audio and / or video.
- The STT transmits current date and time information.
- The RRT transmits information about regions and rating review institutions for program grades.
- The ETT transmits additional descriptions of channels and broadcast programs.
- The EIT transmits information (e.g., title, start time, etc.) about the events of a virtual channel.
- The DCCT/DCCSCT transmits information related to automatic channel change.
- The MGT transmits the version and PID information of the respective tables in the PSIP.
- the 3D image additional information is received in the form of one or more descriptors or fields in the system information.
- the 3D image additional information is received in the form of a descriptor in the PMT.
- According to another embodiment, the 3D image additional information is received in the VCT of the system information.
- the 3D image additional information may include at least one information related to a 3D image.
- The 3D image additional information may include information indicating the transmission format of the 3D image, information indicating whether the received image is a 2D or 3D image, information indicating whether the leftmost top pixel in the received frame belongs to the left image or the right image, information indicating whether at least one of the left and right images was inversely scanned, and information indicating whether at least one of the left and right images was sampled using a filter.
- FIG. 3 illustrates an embodiment of a PMT syntax structure in which 3D image additional information is included in a descriptor form according to the present invention.
- the table_id field is a table identifier, and an identifier for identifying the PMT may be set.
- the section_syntax_indicator field is an indicator that defines the section format of the PMT.
- the section_length field represents a section length of the PMT.
- the program_number field indicates information of a program as information corresponding to the PAT.
- the version_number field represents a version number of the PMT.
- the current_next_indicator field is an identifier indicating whether the current table section is applicable.
- the section_number field indicates the section number of the current PMT section when the PMT is transmitted divided into one or more sections.
- the last_section_number field represents the last section number of the PMT.
- the PCR_PID field indicates a PID of a packet carrying a program clock reference (PCR) of a current program.
- the program_info_length field represents descriptor length information immediately following the program_info_length field in number of bytes. That is, the length of the descriptors included in the first loop.
- the stream_type field indicates the type and encoding information of the element stream included in the packet having the PID value indicated by the following elementary_PID field. For example, if the corresponding stream is an MPEG-2 encoded video stream, 0x02 is indicated in the stream_type field value according to an embodiment.
- the elementary_PID field represents an identifier of the element stream (ES), that is, a PID value of a packet including the corresponding element stream. For example, if the stream_type field value is 0x02, the elementary_PID field indicates the PID of the MPEG-2 encoded video ES.
- the ES_Info_length field represents descriptor length information immediately after the ES_Info_length field in number of bytes. That is, the length of the descriptors included in the second loop.
- Program level descriptors are included in the descriptor () region of the first loop of the PMT, and stream level descriptors are included in the descriptor () region of the second loop. That is, the descriptors included in the first loop are descriptors applied to each program individually, and the descriptors included in the second loop are descriptors applied to each ES individually.
- When the program corresponding to the program_number field value of the PMT is a 3D image, that is, 3D content, the 3D image additional information is included in descriptor form in the descriptor() region of the second loop of the PMT.
- this descriptor will be referred to as a stereoscopic video format descriptor stereoscopic_video_format_descriptor ().
- the stereoscopic video format descriptor stereoscopic_video_format_descriptor () is included after the ES_Info_length field when the elementary_PID field value indicates the PID of the video ES.
- the stereoscopic video format descriptor stereoscopic_video_format_descriptor () may be included in the first loop of the PMT.
- In the receiving system, the stereoscopic video format descriptor is parsed to obtain the 3D image additional information.
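Finding the descriptor in the second (ES) loop amounts to scanning a list of tag/length/payload byte strings for the matching descriptor_tag. A sketch, with the tag value assumed for illustration (the patent does not fix a numeric tag here):

```python
def find_stereoscopic_descriptor(es_descriptors, tag):
    """Scan the descriptors of the second (ES) loop of the PMT and
    return the payload of the first one whose descriptor_tag matches.

    Each descriptor is modeled as raw bytes: tag, length, payload.
    """
    for desc in es_descriptors:
        if desc and desc[0] == tag:
            descriptor_length = desc[1]
            return desc[2:2 + descriptor_length]  # payload only
    return None

STEREO_TAG = 0xB0  # hypothetical tag value, for illustration only
descriptors = [
    bytes([0x0A, 0x04, 0x65, 0x6E, 0x67, 0x00]),  # ISO_639_language
    bytes([STEREO_TAG, 0x03, 0x01, 0x01, 0x80]),  # stereoscopic descriptor
]
payload = find_stereoscopic_descriptor(descriptors, STEREO_TAG)
```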
- FIG. 4 is an embodiment of a syntax structure of a stereoscopic video format descriptor stereoscopic_video_format_descriptor () according to the present invention.
- The descriptor_tag field is assigned 8 bits and indicates that the corresponding descriptor is stereoscopic_video_format_descriptor ().
- the descriptor_length field is allocated 8 bits and represents a byte length from this field to the end of the descriptor.
- The service_type field is allocated 8 bits and indicates whether the video ES indicated by the stream_type field is the video ES of a 2D image or the video ES of a 3D image. According to an embodiment of the present invention, the service_type field value is set to 0 for the video ES of a 2D image and to 1 for the video ES of a 3D image.
- the composition_type field is allocated 8 bits, and indicates in which transmission format the 3D video is received.
- By parsing the composition_type field value, the receiving system can know in which transmission format the 3D image was received: side by side, top/bottom, interlaced, frame sequential, checker board, anaglyph, full left/right, full left/half right, or 2D video/depth. For example, if the composition_type field value is 0x01, the receiving system determines that the 3D image was received in the side by side format.
- The LR_first_flag field is allocated 1 bit and indicates whether, when the stereo image was generated (or multiplexed), the leftmost top pixel in the frame belongs to the left image or to the right image. That is, it indicates whether the leftmost top pixel of the received frame should be displayed as part of the left image or the right image.
- When the LR_first_flag field value is '0', the leftmost top pixel in the frame is displayed as part of the left image; when it is '1', the leftmost top pixel is displayed as part of the right image.
- When the transmission format is the side by side format and the LR_first_flag field value is '0', the receiving system decodes the left half of the pixels in a frame and displays them as the left image, and decodes the right half and displays them as the right image.
- When the transmission format is the side by side format and the LR_first_flag field value is '1', the receiving system decodes the left half of the pixels in a frame and displays them as the right image, and decodes the right half and displays them as the left image.
- When the transmission format is the top/bottom format and the LR_first_flag field value is '0', the receiving system decodes the upper half of the pixels in a frame and displays them as the left image, and decodes the lower half and displays them as the right image.
- When the transmission format is the top/bottom format and the LR_first_flag field value is '1', the receiving system decodes the upper half of the pixels in a frame and displays them as the right image, and decodes the lower half and displays them as the left image.
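The four cases above reduce to a simple rule: the flag decides which half of the frame is the left image. A sketch for the side by side format, again using nested lists as toy images:

```python
def split_side_by_side(frame, lr_first_flag):
    """Split a side-by-side stereo frame into (left, right) images.

    lr_first_flag == 0: the leftmost top pixel belongs to the left
    image, so the left half of the frame is the left image.
    lr_first_flag == 1: the halves are swapped.
    """
    half = len(frame[0]) // 2
    first = [row[:half] for row in frame]
    second = [row[half:] for row in frame]
    if lr_first_flag == 0:
        return first, second   # (left image, right image)
    return second, first       # halves swapped

frame = [[1, 2, 9, 10], [5, 6, 13, 14]]
left, right = split_side_by_side(frame, lr_first_flag=0)
```

The top/bottom format follows the same rule applied to the upper and lower halves of the rows.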
- The spatial_flipping_flag field is allocated 1 bit and indicates whether at least one of the left and right images was inversely scanned before coding.
- the scanning direction of at least one image may be reversed for coding efficiency. That is, the left or right image may be coded by performing a reverse scan (or arrangement) in the vertical or horizontal direction according to coding efficiency.
- In the present invention, such an inversely scanned image will be referred to as a mirrored image.
- In one embodiment, when the transmission format is the side by side format, the left or right image is coded after a reverse scan in the horizontal direction, and when the transmission format is the top/bottom format, the left or right image is coded after a reverse scan in the vertical direction.
- In this case, the spatial_flipping_flag field value is set to '1' according to an embodiment. If the spatial_flipping_flag field value is '1', the receiving system rearranges the pixels of the mirrored image back into the original scanning order before display. If the spatial_flipping_flag field is '0', it means that the pixels of the left and right images are arranged and coded in the original scanning order.
- the image0_flipped_flag field is allocated one bit, and when the spatial_flipping_flag field value is '1', it indicates which image is flipped or mirrored or inverted. According to an embodiment of the present invention, when image0 is flipped, the image0_flipped_flag field value is displayed as '1', and when image1 is flipped, the image0_flipped_flag field value is displayed as '0'.
- image0 is the image to which the leftmost top pixel in one frame composed of the left and right images belongs
- image1 is the other image. That is, the mapping between the left and right images and image0/image1 is set by the LR_first_flag field. If the LR_first_flag field value is '0', the left image corresponds to image0 and the right image corresponds to image1.
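The image0/image1 mapping and the flip flags above combine as in the following sketch; the helper function and its return convention are illustrative assumptions, not defined in the patent.

```python
def flipped_image(lr_first, spatial_flipping, image0_flipped):
    """Return which image ('left', 'right', or None) was mirrored.

    image0 is the image containing the leftmost top pixel of the frame;
    LR_first_flag maps image0/image1 onto the left/right images.
    """
    if spatial_flipping == '0':
        return None  # both images coded in the original scanning order
    # LR_first_flag '0': image0 = left image, image1 = right image
    image0, image1 = ('left', 'right') if lr_first == '0' else ('right', 'left')
    # image0_flipped_flag '1': image0 is flipped; '0': image1 is flipped
    return image0 if image0_flipped == '1' else image1
```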
- the quincunx_filtering_flag field is allocated 1 bit and indicates whether the left and right images were sampled using a quincunx filter when the stereo image was generated.
- in one embodiment, the quincunx_filtering_flag field value indicates '1' if a quincunx filter was used and '0' otherwise. If the quincunx_filtering_flag field value is '1', the receiving system performs the reverse process of quincunx filtering on the corresponding video.
- that is, when a quincunx filter is used for the sub-sampling, the quincunx_filtering_flag field value is indicated as '1'.
- when a filter other than a quincunx filter is used to sub-sample the left or right image by 1/2 in the horizontal or vertical direction,
- the present invention may further include a field indicating the type of filter.
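Quincunx sub-sampling keeps pixels on a checkerboard lattice. The sketch below is a deliberately simplified illustration of that lattice: the anti-alias filtering a real encoder would apply before decimation is omitted, and the function itself is not taken from the patent.

```python
def quincunx_sample(image):
    """Keep pixels on a checkerboard (quincunx) lattice.

    Each row is halved by taking alternate samples, with the sampling
    phase alternating from line to line (the checkerboard pattern).
    Anti-alias filtering is intentionally omitted in this sketch.
    """
    return [row[(y % 2)::2] for y, row in enumerate(image)]
```

The receiving system would apply the reverse process (interpolating the discarded lattice positions) to restore the original image size.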
- the 3D image may be decoded and displayed with reference to other fields in the stereoscopic video format descriptor.
- the order, position, and meaning of the fields allocated to the stereoscopic video format descriptor stereoscopic_video_format_descriptor() shown in FIG. 4 are merely an example for the understanding of the present invention; since the order, position, and meaning of the fields assigned to the stereoscopic video format descriptor, as well as the number of additionally allocated fields, may be easily changed by those skilled in the art, the present invention is not limited to the above embodiment.
- the 3D video additional information according to the present invention may be included in the virtual channel table (VCT) and received.
- FIG. 5 is a block diagram illustrating an embodiment of a 3D imaging system according to the present invention and may include a content source 100, a decoding device 200, and a display device 300.
- for convenience of description, the present invention refers to the decoding device 200 and the display device 300 collectively as a receiving system.
- the content source 100 includes 3D content for 3D video, and may be, for example, a disk, a server, a terrestrial / satellite / cable broadcasting station, and the like.
- the decoding device 200 generates an image for display by receiving and decoding content from the content source 100. For example, if the received content is compression coded, decompression and / or interpolation are performed to reconstruct the image before compression coding.
- the decoding device 200 may be a DVD player, a set-top box, a digital TV, or the like.
- the display device 300 displays an image made by the decoding device 200 in a two-dimensional or three-dimensional form.
- the display device 300 may be a screen, a monitor, a projector, or the like.
- the display device 300 may be a device capable of displaying a general 2D image, a device capable of displaying a 3D image that requires glasses, a device capable of displaying a 3D image that does not require glasses, or the like.
- the reception system generates and displays 3D images in various ways using at least two images according to the characteristics of the display device 300.
- the display method may be, for example, a glasses method that requires wearing glasses or a glasses-free method that does not.
- the way of wearing glasses is divided into passive and active.
- the passive method distinguishes the left image from the right image using a polarization filter; wearing glasses with differently colored lenses (for example, blue and red) over the two eyes is also a passive method.
- the active method distinguishes the left image from the right image by alternately covering the left eye and the right eye in time. That is, the active method periodically repeats a time-divided screen and is viewed while wearing glasses with an electronic shutter synchronized to that period; it may also be referred to as a time-split type or shuttered-glass method.
- glasses-free methods include a lenticular method, in which a lenticular lens plate with vertically arranged cylindrical lens arrays is installed in front of the image panel, and a parallax barrier method, in which a barrier layer having periodic slits is provided on the image panel.
- the decoding device 200 and the display device 300 of the receiving system may be implemented separately or may be integrally implemented.
- the decoding device 200 and the display device 300 transmit / receive data using a digital interface.
- the digital interface may be a digital visual interface (DVI), a high definition multimedia interface (HDMI), or the like.
- the present invention describes, as an embodiment, the use of HDMI among these digital interfaces.
- the decoding device 200 and the display device 300 are connected by an HDMI cable.
- the HDMI transmits an uncompressed digital video signal and a digital audio signal with a bandwidth of 5 Gbps or more.
- FIG. 6 is a block diagram illustrating an example in which the decoding device 200 and the display device 300 of the receiving system according to the present invention are connected through an HDMI cable 400.
- when the decoding device 200 and the display device 300 are connected through an HDMI cable, the decoding device 200 is called an HDMI source and the display device 300 is called an HDMI sink.
- the HDMI source is a set-top box.
- the decoding device 200 may be referred to as an image receiving unit, and the display device 300 may be referred to as a display unit.
- the HDMI source 200 includes a demultiplexer 201, a video decoder 202, a data decoder 203, and an HDMI transmitter 204.
- the HDMI sink 300 includes an HDMI receiver 301, a 3D formatter 302, and a display engine 303.
- the present invention assumes that the MPEG-2 transport stream (TS) packet received from the content source 100 and demodulated is input to the demultiplexer 201.
- the demodulated TS packet may be a TS packet of a 2D image or a TS packet of a 3D image.
- the demultiplexer 201 receives a TS packet and performs demultiplexing.
- the TS packet is composed of a header and a payload, the header includes a PID, and the payload includes one of a video stream, an audio stream, and a data stream.
- the demultiplexer 201 separates a video stream, an audio stream, and a data stream from the TS packet by using a PID of an input TS packet.
- the separated video stream is output to the video decoder 202, and the data stream, which contains system information, is output to the data decoder 203.
- the separated audio stream is output to an audio decoder; since the audio decoder is not a feature of the present invention, a detailed description thereof is omitted.
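The PID-based demultiplexing described above can be sketched as follows. This is an illustrative sketch: in practice the PID-to-stream mapping is learned from the PAT/PMT, and adaptation fields are ignored here for brevity.

```python
TS_PACKET_SIZE = 188  # an MPEG-2 TS packet is 188 bytes: 4-byte header + payload

def demux(ts_packets, pid_map):
    """Route TS packet payloads to elementary streams by PID.

    pid_map: {pid: 'video' | 'audio' | 'data'} -- supplied directly here
    for illustration instead of being parsed from the PAT/PMT.
    """
    streams = {'video': bytearray(), 'audio': bytearray(), 'data': bytearray()}
    for pkt in ts_packets:
        assert len(pkt) == TS_PACKET_SIZE and pkt[0] == 0x47  # sync byte
        pid = ((pkt[1] & 0x1F) << 8) | pkt[2]  # 13-bit PID from the header
        if pid in pid_map:
            streams[pid_map[pid]] += pkt[4:]   # payload after the 4-byte header
    return streams
```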
- the video decoder 202 decodes the video stream using a predetermined video decoding algorithm to restore the video stream before compression.
- the video decoding algorithm may include an MPEG-2 video decoding algorithm, an MPEG-4 video decoding algorithm, an H.264 decoding algorithm, an SVC decoding algorithm, a VC-1 decoding algorithm, and the like. Since the present invention assumes that the video stream is MPEG-2 compression-encoded, the video decoder 202 uses the MPEG-2 video decoding algorithm.
- the video stream decoded by the video decoder 202 is output to the HDMI transmitter 204.
- the data decoder 203 classifies the tables using the table_id and section length of the system information, parses the sections of the classified tables, and either stores the parsing results in a database on a storage device or outputs them to the HDMI transmitter 204.
- the data decoder 203 forms a table by collecting sections having the same table identifier (table_id).
- the data decoder 203 parses the stereoscopic video format descriptor from the PMT among the tables separated from the system information, and outputs the result to the HDMI transmitter 204.
- the HDMI transmitter 204 receives the decoded video stream, performs transition-minimized differential signaling (TMDS) encoding on it, and outputs the result to the HDMI receiver 301 of the HDMI sink 300 through the HDMI cable 400.
- the HDMI transmitter 204 also generates (or converts) 3D signaling information (or 3D signaling data) using the 3D video additional information obtained from the stereoscopic video format descriptor, and then transmits it to the HDMI sink 300 through the HDMI cable 400.
- the TMDS channel is used to transmit video, audio, and auxiliary data.
- the HDMI transmitter 204 uses a packet structure to transmit auxiliary data.
- FIG. 7 is a table showing examples of various packet types used in the HDMI standard according to the present invention.
- the 3D signaling information according to the present invention is included in the AVI InfoFrame packet and transmitted to the HDMI sink 300.
- the HDMI link can largely be divided into a video data period, a data island period, and a control period.
- the active pixels of an active video line are transmitted during the video data period.
- during the data island period, audio and auxiliary data are transmitted in a series of packets. The control period is used when no video, audio, or auxiliary data needs to be transmitted.
- the HDMI transmitter 204 of the HDMI source 200 outputs an AVI InfoFrame packet including 3D signaling information to the HDMI receiver 301 of the HDMI sink 300 through a data island period.
- the AVI InfoFrame packet is composed of a header area and a content area.
- FIG. 8 shows an embodiment of a header structure of an AVI InfoFrame packet according to the present invention, and is an example composed of 3 bytes.
- the first byte HB0 indicates the packet type
- the second byte HB1 indicates the version information
- the lower five bits of the third byte HB2 indicate the content length of the AVI InfoFrame packet in bytes.
- in one embodiment, 0x82 is indicated as the packet type value.
- FIG. 9 illustrates an embodiment of the content structure of a general AVI InfoFrame packet, in which the third byte (HB2) in the header of the AVI InfoFrame packet indicates a content length value of 0x0D (i.e., 13 in decimal).
- additional video information is included in the first to fourteenth bytes PB0 to PB13 of the content of the AVI InfoFrame packet of FIG. 9.
- bar information is recorded in the seventh to fourteenth bytes PB6 to PB13.
- the 15th to 28th bytes PB14 to PB27 are unused areas.
- in the present invention, 3D signaling information is recorded using one of the unused bytes of the AVI InfoFrame packet content shown in FIG. 9.
- 3D signaling information is recorded using the fifteenth byte PB14 of the AVI InfoFrame packet content.
- the content length value in the AVI InfoFrame packet header of FIG. 8 is changed to 0x0E (that is, 14 in decimal).
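The 3-byte header of FIG. 8 and the length change from 0x0D to 0x0E can be sketched as below. The version value 0x02 is an assumption for illustration; only the packet type 0x82 and the length-in-lower-five-bits rule come from the description above.

```python
def avi_infoframe_header(content_length):
    """Build the 3-byte AVI InfoFrame header shown in FIG. 8.

    HB0 = packet type (0x82), HB1 = version (0x02 assumed here),
    HB2 = content length in bytes, carried in the lower five bits.
    """
    assert content_length < 32  # length occupies the lower five bits of HB2
    return bytes([0x82, 0x02, content_length & 0x1F])

# a general AVI InfoFrame carries 13 content bytes (0x0D); adding the
# 3D signaling byte PB14 extends the content length to 14 (0x0E)
```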
- FIG. 10 illustrates an embodiment of a content structure of an AVI InfoFrame packet according to the present invention.
- additional video information is recorded in the first to fourteenth bytes PB0 to PB13 of the content of the AVI InfoFrame packet, and the 3D signaling information according to the present invention is recorded in the fifteenth byte PB14.
- the 16th to 28th bytes PB15 to PB27 are unused areas reserved for future use.
- the 3D signaling information displayed in the fifteenth byte PB14 is generated based on 3D image additional information obtained from the stereoscopic video format descriptor included in the PMT.
- in one embodiment, the HDMI transmitter 204 indicates the 3D signaling information by allocating, within the fifteenth byte (PB14) region, a 1-bit SV field, a 3-bit CT field (CT2 to CT0), a 1-bit OR field, a 2-bit FL field (FL1, FL0), and a 1-bit QS field.
- that is, 3D signaling information is generated using the 3D image additional information obtained from the service_type field, composition_type field, LR_first_flag field, spatial_flipping_flag field, image0_flipped_flag field, and quincunx_filtering_flag field of the stereoscopic video format descriptor, and the generated 3D signaling information is recorded in the fifteenth byte PB14 of the content of the AVI InfoFrame packet.
- the HDMI transmitter 204 records the information obtained from the service_type field in the SV field, the information obtained from the composition_type field in the CT field, and the information obtained from the LR_first_flag field in the OR field.
- the information obtained from the spatial_flipping_flag field and the image0_flipped_flag field is recorded in the FL field, and the information obtained from the quincunx_filtering_flag field is recorded in the QS field.
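Packing the fields into the single byte PB14 can be sketched as follows. The field widths (SV:1, CT:3, OR:1, FL:2, QS:1) follow the description above, but the bit positions chosen here (SV in the MSB down to QS in the LSB) are an assumption; this passage does not fix them.

```python
def pack_pb14(sv, ct, orr, fl, qs):
    """Pack the 3D signaling fields into one byte for PB14.

    Bit layout (assumed): SV | CT2 CT1 CT0 | OR | FL1 FL0 | QS.
    """
    assert sv < 2 and ct < 8 and orr < 2 and fl < 4 and qs < 2
    return (sv << 7) | (ct << 4) | (orr << 3) | (fl << 1) | qs
```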
- FIGS. 11A to 11E are tables illustrating definitions of the values assigned to each field of the fifteenth byte PB14 according to an embodiment of the present invention.
- FIG. 11A illustrates an example of an SV field.
- when the service_type field value indicates a 2D image, 0 is indicated in the SV field; when it indicates a 3D image, 1 is indicated in the SV field.
- FIG. 11B illustrates an example of a CT field.
- a transmission format obtained from a composition_type field is displayed in the CT field. For example, if the transmission format is the top / bottom format, 000 is displayed in the CT field, and if the side-by-side format, 001 is displayed in the CT field.
- FIG. 11C illustrates an example of an OR field.
- when the LR_first_flag field indicates that the leftmost top pixel in the frame belongs to the left image, 0 is indicated in the OR field; when it indicates that the pixel belongs to the right image, 1 is indicated in the OR field.
- FIG. 11(d) shows an example of the FL field; the FL field value is set using the spatial_flipping_flag field and image0_flipped_flag field values.
- if the spatial_flipping_flag field value is '0', the upper bit FL1 of the FL field is represented as 1. If the spatial_flipping_flag field value is '1', that is, if at least one of the right image and the left image was reverse-scanned and coded, the upper bit FL1 of the FL field indicates 0. In this case, if image1 is flipped, 0 is indicated in the lower bit FL0 of the FL field; if image0 is flipped, 1 is indicated in the lower bit FL0 of the FL field.
- FIG. 11(e) shows an example of the QS field. In one embodiment, if the quincunx_filtering_flag field value does not indicate the use of a filter, 0 is indicated in the QS field; if it indicates the use of a filter, 1 is indicated in the QS field.
- the AVI InfoFrame packet in which the 3D signaling information is recorded is transmitted to the HDMI receiver 301 of the HDMI sink 300 through the HDMI cable 400 together with the decoded video data.
- the HDMI receiver 301 outputs the received video data to the 3D formatter 302 and parses each field of the fifteenth byte PB14 in the received AVI InfoFrame packet content to obtain 3D signaling information.
- the obtained 3D signaling information is output to the 3D formatter 302 and the display engine 303. That is, by parsing each field of the fifteenth byte PB14 in the content of the AVI InfoFrame packet, the receiving side can determine whether the video data received through the HDMI cable 400 is 2D video or 3D video.
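The sink-side parsing of PB14 can be sketched as the mirror of the source-side packing. As before, the bit layout (SV in the MSB down to QS in the LSB) is an illustrative assumption; only the field widths come from the description.

```python
def parse_pb14(pb14):
    """Recover the 3D signaling fields from byte PB14 on the sink side."""
    return {
        'SV': (pb14 >> 7) & 0x1,  # 0: 2D image, 1: 3D image
        'CT': (pb14 >> 4) & 0x7,  # transmission format (e.g. 001: side-by-side)
        'OR': (pb14 >> 3) & 0x1,  # 0: leftmost top pixel belongs to the left image
        'FL': (pb14 >> 1) & 0x3,  # flipping information (FL1, FL0)
        'QS': pb14 & 0x1,         # 1: quincunx filtering was used
    }
```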
- the 3D formatter 302 re-formats the video data output from the HDMI receiver 301 according to the display format of the HDMI sink 300 with reference to the 3D signaling information output from the HDMI receiver 301, and outputs the result to the display engine 303.
- the display engine 303 displays a 3D image output from the 3D formatter 302 according to a display method. That is, the display engine 303 creates and displays 3D images in various ways using the left image and the right image according to the display characteristics of the HDMI sink.
- the display method includes a method of wearing glasses and a glasses-free method of not wearing glasses.
- assume, for example, that the SV field value acquired from the fifteenth byte PB14 of the AVI InfoFrame packet content indicates a 3D image, the CT field value indicates the side-by-side format, the OR field value is '0', the FL field indicates that the right image is flipped, and the QS field value indicates '1'
- this means that the leftmost top pixel in the received video data frame belongs to the left image, that the right image was reverse-scanned during coding, and that a quincunx filter was used to sample the left and right images
- the 3D formatter 302 scans and decodes the right image of the video data in the reverse direction.
- the 3D formatter 302 restores the image to the original image size through a reverse process of the quincunx filter or an appropriate reverse filtering process.
- the display engine 303 displays the image decoded and reconstructed from the left half of the pixels as the left image, and displays the image decoded and reconstructed from the right half of the pixels as the right image.
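For the side-by-side example just described, undoing the horizontal reverse scan of the right image can be sketched as below. This is an illustrative sketch: interpolation back to full width (the reverse of the quincunx filtering) is omitted, and the function is not defined in the patent.

```python
def reformat_side_by_side(frame, right_flipped=True):
    """Rebuild left/right images from a side-by-side frame in which the
    right half was reverse-scanned horizontally (mirrored) before coding.
    """
    w = len(frame[0])
    left = [row[:w // 2] for row in frame]
    right = [row[w // 2:] for row in frame]
    if right_flipped:  # undo the horizontal reverse scan of the right image
        right = [row[::-1] for row in right]
    return left, right
```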
- FIG. 12 is a flowchart illustrating an embodiment of a process of acquiring 3D video additional information from PMT in the HDMI source 200 to generate 3D signaling information and transmitting the same through an HDMI cable.
- 3D image additional information is obtained from the stereoscopic video format descriptor of the PMT. That is, 3D image additional information is obtained from a service_type field, a composition_type field, an LR_first_flag field, a spatial_flipping_flag field, an image0_flipped_flag field, and a quincunx_filtering_flag field of the stereoscopic video format descriptor.
- the 3D signaling information is generated using the obtained 3D image additional information, the generated 3D signaling information is recorded in the fifteenth byte PB14 of the AVI InfoFrame packet content, and the packet is then transmitted to the HDMI sink 300 through the HDMI cable 400.
- that is, the information obtained from the service_type field is recorded in the SV field of the fifteenth byte (PB14) of the content of the AVI InfoFrame packet, the information obtained from the composition_type field in the CT field, and the information obtained from the LR_first_flag field in the OR field.
- the information obtained from the spatial_flipping_flag field and the image0_flipped_flag field is recorded in the FL field, and the information obtained from the quincunx_filtering_flag field is recorded in the QS field.
- FIG. 13 is a flowchart illustrating an embodiment of a process of receiving and displaying video data and 3D signaling information through an HDMI cable in the HDMI sink 300 according to the present invention.
- the video data and the AVI InfoFrame packet are received through the HDMI cable 400 (S501).
- the 3D signaling information is obtained from the fifteenth byte PB14 of the AVI InfoFrame packet.
- the video data is re-formatted according to the display format of the HDMI sink 300 based on the obtained 3D signaling information (S502).
- 3D images are generated and displayed in various ways using the left and right images of the reformatted video data (S503).
- the embodiment of the reception system and the data processing method according to the present invention can be used in the field of broadcasting and communication.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Databases & Information Systems (AREA)
- Computer Graphics (AREA)
- Physics & Mathematics (AREA)
- Computer Hardware Design (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Library & Information Science (AREA)
- Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)
- Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
Abstract
Description
Claims (12)
- A receiving system comprising: an image receiving unit which receives a 3D (three-dimensional) image and system information including additional information of the 3D image, generates 3D signaling information based on the 3D image additional information included in the system information, and transmits the 3D signaling information together with the 3D image through a digital interface; and a display unit which receives the 3D image and the 3D signaling information through the digital interface, and formats and displays the 3D image based on the received 3D signaling information.
- The receiving system of claim 1, wherein the 3D image additional information is received in the form of a descriptor included in a program map table (PMT) of the system information.
- The receiving system of claim 2, wherein the 3D image additional information includes a field indicating whether a 3D image is received, a field indicating a transmission format of the 3D image, a field indicating whether the leftmost top pixel in a received frame belongs to the left image or the right image, a field indicating whether at least one of the left image and the right image was reverse-scanned and coded, a field indicating which of the left image and the right image was reverse-scanned, and a field indicating whether at least one of the left image and the right image was sampled using a filter.
- The receiving system of claim 3, wherein the image receiving unit inserts the 3D signaling information generated from the 3D image additional information into an AVI InfoFrame packet and transmits the packet.
- The receiving system of claim 4, wherein the AVI InfoFrame packet is composed of a header and a content area, and one or more fields are allocated to at least one byte of the content area to record the 3D signaling information.
- The receiving system of claim 5, wherein the at least one byte of the content area of the AVI InfoFrame packet includes a field indicating whether a 3D image is received, a field indicating a transmission format of the 3D image, a field indicating whether the leftmost top pixel in a received frame belongs to the left image or the right image, a field indicating which of the left image and the right image was reverse-scanned, and a field indicating whether at least one of the left image and the right image was sampled using a filter.
- A data processing method of a receiving system, the method comprising: receiving a 3D (three-dimensional) image and system information including additional information of the 3D image, generating 3D signaling information based on the 3D image additional information included in the system information, and transmitting the 3D signaling information together with the 3D image through a digital interface; and receiving the 3D image and the 3D signaling information through the digital interface, and formatting and displaying the 3D image based on the received 3D signaling information.
- The data processing method of claim 7, wherein the 3D image additional information is received in the form of a descriptor included in a program map table (PMT) of the system information.
- The data processing method of claim 8, wherein the 3D image additional information includes a field indicating whether a 3D image is received, a field indicating a transmission format of the 3D image, a field indicating whether the leftmost top pixel in a received frame belongs to the left image or the right image, a field indicating whether at least one of the left image and the right image was reverse-scanned and coded, a field indicating which of the left image and the right image was reverse-scanned, and a field indicating whether at least one of the left image and the right image was sampled using a filter.
- The data processing method of claim 9, wherein the 3D signaling information generated from the 3D image additional information is inserted into an AVI InfoFrame packet and transmitted.
- The data processing method of claim 10, wherein the AVI InfoFrame packet is composed of a header and a content area, and one or more fields are allocated to at least one byte of the content area to record the 3D signaling information.
- The data processing method of claim 11, wherein the at least one byte of the content area of the AVI InfoFrame packet includes a field indicating whether a 3D image is received, a field indicating a transmission format of the 3D image, a field indicating whether the leftmost top pixel in a received frame belongs to the left image or the right image, a field indicating which of the left image and the right image was reverse-scanned, and a field indicating whether at least one of the left image and the right image was sampled using a filter.
Priority Applications (3)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| EP09819412A EP2343907A4 (en) | 2008-10-10 | 2009-10-09 | RECEPTION SYSTEM AND METHOD OF PROCESSING DATA |
| KR1020117010190A KR101664419B1 (ko) | 2008-10-10 | 2009-10-09 | Reception system and data processing method |
| CA2740139A CA2740139C (en) | 2008-10-10 | 2009-10-09 | Reception system and data processing method |
Applications Claiming Priority (6)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US10427408P | 2008-10-10 | 2008-10-10 | |
| US61/104,274 | 2008-10-10 | ||
| US17319609P | 2009-04-27 | 2009-04-27 | |
| US61/173,196 | 2009-04-27 | ||
| US24065709P | 2009-09-09 | 2009-09-09 | |
| US61/240,657 | 2009-09-09 |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| WO2010041905A2 true WO2010041905A2 (ko) | 2010-04-15 |
| WO2010041905A3 WO2010041905A3 (ko) | 2010-07-15 |
Family
ID=44114240
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/KR2009/005800 Ceased WO2010041905A2 (ko) | 2008-10-10 | 2009-10-09 | Reception system and data processing method |
Country Status (5)
| Country | Link |
|---|---|
| US (4) | US8427469B2 (ko) |
| EP (1) | EP2343907A4 (ko) |
| KR (1) | KR101664419B1 (ko) |
| CA (1) | CA2740139C (ko) |
| WO (1) | WO2010041905A2 (ko) |
Families Citing this family (32)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US8692695B2 (en) | 2000-10-03 | 2014-04-08 | Realtime Data, Llc | Methods for encoding and decoding data |
| US8427469B2 (en) * | 2008-10-10 | 2013-04-23 | Lg Electronics Inc. | Receiving system and method of processing data |
| IT1393713B1 (it) * | 2008-10-21 | 2012-05-08 | S I R I Société Internationale De Rech Ind S A | Sistema per codificare e decodificare immagini stereoscopiche |
| US8289998B2 (en) * | 2009-02-13 | 2012-10-16 | Samsung Electronics Co., Ltd. | Method and apparatus for generating three (3)-dimensional image data stream, and method and apparatus for receiving three (3)-dimensional image data stream |
| CN102804789B (zh) * | 2009-06-23 | 2015-04-29 | Lg电子株式会社 | 接收系统和提供3d图像的方法 |
| CN102474641B (zh) * | 2009-07-07 | 2015-05-13 | Lg电子株式会社 | 用于显示三维用户界面的方法 |
| US8493434B2 (en) * | 2009-07-14 | 2013-07-23 | Cable Television Laboratories, Inc. | Adaptive HDMI formatting system for 3D video transmission |
| US20120144441A1 (en) * | 2009-08-12 | 2012-06-07 | Chang Sik Yun | Method for diagnosing 3d state information, and broadcast receiver |
| WO2011028024A2 (en) * | 2009-09-03 | 2011-03-10 | Lg Electronics Inc. | Cable broadcast receiver and 3d video data processing method thereof |
| US9131215B2 (en) * | 2009-09-29 | 2015-09-08 | Samsung Electronics Co., Ltd. | Method and apparatus for transmitting and receiving uncompressed three-dimensional video data via digital data interface |
| WO2011046279A1 (en) * | 2009-10-16 | 2011-04-21 | Lg Electronics Inc. | Method for indicating a 3d contents and apparatus for processing a signal |
| KR20110062987A (ko) * | 2009-12-04 | 2011-06-10 | Samsung Electronics Co., Ltd. | Method of providing a 3D image, method of displaying a 3D image, and 3D image providing apparatus and 3D image display apparatus applying the same |
| KR20110063002A (ko) * | 2009-12-04 | 2011-06-10 | Samsung Electronics Co., Ltd. | 3D display apparatus and 3D image detection method applied thereto |
| US20110134217A1 (en) * | 2009-12-08 | 2011-06-09 | Darren Neuman | Method and system for scaling 3d video |
| EP2337362A3 (en) * | 2009-12-21 | 2013-07-17 | Samsung Electronics Co., Ltd. | Display apparatus and control method thereof |
| KR20110088334A (ko) * | 2010-01-28 | 2011-08-03 | Samsung Electronics Co., Ltd. | Method and apparatus for generating a data stream for providing a 3D multimedia service, and method and apparatus for receiving a data stream for providing a 3D multimedia service |
| US20110216083A1 (en) * | 2010-03-03 | 2011-09-08 | Vizio, Inc. | System, method and apparatus for controlling brightness of a device |
| US8830300B2 (en) * | 2010-03-11 | 2014-09-09 | Dolby Laboratories Licensing Corporation | Multiscalar stereo video format conversion |
| JP5515988B2 (ja) * | 2010-04-05 | 2014-06-11 | ソニー株式会社 | 信号処理装置、信号処理方法、表示装置及びプログラム |
| FR2965444B1 (fr) | 2010-09-24 | 2012-10-05 | St Microelectronics Grenoble 2 | Transmission de video 3d sur une infrastructure de transport historique |
| KR101750047B1 (ko) * | 2010-10-11 | 2017-06-22 | Samsung Electronics Co., Ltd. | Method and apparatus for providing and processing a 3D image |
| KR20120042313A (ko) * | 2010-10-25 | 2012-05-03 | Samsung Electronics Co., Ltd. | Stereoscopic image display device and image display method thereof |
| KR101734285B1 (ko) * | 2010-12-14 | 2017-05-11 | LG Electronics Inc. | Image processing apparatus of a mobile terminal and method thereof |
| KR20120105199A (ko) * | 2011-03-15 | 2012-09-25 | LG Display Co., Ltd. | Multi-viewable stereoscopic image display device and driving method thereof |
| KR20130002856A (ko) * | 2011-06-29 | 2013-01-08 | Samsung Electronics Co., Ltd. | Method and apparatus for clock generation in a multimedia system |
| US9294752B2 (en) | 2011-07-13 | 2016-03-22 | Google Technology Holdings LLC | Dual mode user interface system and method for 3D video |
| EP2781103A1 (en) * | 2011-11-14 | 2014-09-24 | Motorola Mobility LLC | Association of mvc stereoscopic views to left or right eye display for 3dtv |
| JPWO2013108531A1 (ja) * | 2012-01-19 | 2015-05-11 | Sony Corporation | Receiving device, receiving method, and electronic apparatus |
| US9253515B2 (en) * | 2012-03-11 | 2016-02-02 | Broadcom Corporation | Channel bonding synchronization |
| KR101917224B1 (ko) * | 2012-09-18 | 2018-11-13 | LG Innotek Co., Ltd. | Image data transmission apparatus |
| TWI711310B (zh) | 2013-06-21 | 2020-11-21 | 日商新力股份有限公司 | 送訊裝置、高動態範圍影像資料送訊方法、收訊裝置、高動態範圍影像資料收訊方法及程式 |
| WO2016129976A1 (ko) * | 2015-02-14 | 2016-08-18 | Samsung Electronics Co., Ltd. | Method and apparatus for decoding an audio bitstream including system data |
Family Cites Families (27)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2000011602A2 (en) * | 1998-08-20 | 2000-03-02 | Apple Computer, Inc. | Method and apparatus for generating texture |
| JP2000078611A | 1998-08-31 | 2000-03-14 | Toshiba Corp | Stereoscopic video receiving apparatus and stereoscopic video system |
| US20040218269A1 (en) * | 2002-01-14 | 2004-11-04 | Divelbiss Adam W. | General purpose stereoscopic 3D format conversion system and method |
| CA2380105A1 (en) * | 2002-04-09 | 2003-10-09 | Nicholas Routhier | Process and system for encoding and playback of stereoscopic video sequences |
| AU2003231510A1 (en) * | 2002-04-25 | 2003-11-10 | Sharp Kabushiki Kaisha | Image data creation device, image data reproduction device, and image data recording medium |
| JP4259884B2 (ja) * | 2003-01-20 | 2009-04-30 | Sharp Corporation | Image data creation apparatus and image data reproduction apparatus for reproducing the data |
| KR100697972B1 (ko) * | 2004-11-16 | 2007-03-23 | Electronics and Telecommunications Research Institute | Digital broadcast transmission apparatus and method for stereoscopic broadcast service |
| KR100657322B1 (ko) * | 2005-07-02 | 2006-12-14 | Samsung Electronics Co., Ltd. | Encoding/decoding method and apparatus for implementing local 3D video |
| KR100766496B1 (ko) * | 2005-07-08 | 2007-10-15 | Samsung Electronics Co., Ltd. | HDMI transmission system |
| US7929560B2 (en) * | 2005-07-15 | 2011-04-19 | Panasonic Corporation | Packet transmitting apparatus |
| WO2007067020A1 (en) * | 2005-12-09 | 2007-06-14 | Electronics And Telecommunications Research Institute | System and method for transmitting/receiving three dimensional video based on digital broadcasting |
| KR100747598B1 (ko) | 2005-12-09 | 2007-08-08 | Electronics and Telecommunications Research Institute | Digital broadcasting-based 3D stereoscopic video transmission/reception system and method |
| US7635189B2 (en) * | 2005-12-21 | 2009-12-22 | International Business Machines Corporation | Method and system for synchronizing opto-mechanical filters to a series of video synchronization pulses and derivatives thereof |
| JP4179387B2 (ja) * | 2006-05-16 | 2008-11-12 | Sony Corporation | Transfer method, transfer system, transmission method, transmission apparatus, reception method, and reception apparatus |
| KR101362941B1 (ko) * | 2006-11-01 | 2014-02-17 | Electronics and Telecommunications Research Institute | Method and apparatus for decoding metadata used for playing stereoscopic contents |
| WO2008054100A1 (en) | 2006-11-01 | 2008-05-08 | Electronics And Telecommunications Research Institute | Method and apparatus for decoding metadata used for playing stereoscopic contents |
| JP5153216B2 (ja) * | 2007-06-08 | 2013-02-27 | Canon Inc. | Image processing apparatus and image processing method |
| JP2009104359A (ja) * | 2007-10-23 | 2009-05-14 | Mitsubishi Chemicals Corp | Method and apparatus for calculating appropriate inventory levels, method and apparatus for determining demand trends for products, and program |
| KR101506219B1 (ko) * | 2008-03-25 | 2015-03-27 | Samsung Electronics Co., Ltd. | Method of providing 3D image content, reproduction method, apparatus therefor, and recording medium |
| KR101472332B1 (ko) * | 2008-04-15 | 2014-12-15 | Samsung Electronics Co., Ltd. | Method of providing and receiving 3D digital content, and apparatus therefor |
| US8427469B2 (en) * | 2008-10-10 | 2013-04-23 | Lg Electronics Inc. | Receiving system and method of processing data |
| JP5185794B2 (ja) * | 2008-12-01 | 2013-04-17 | Sharp Corporation | Wireless transmission system |
| JP5469911B2 (ja) * | 2009-04-22 | 2014-04-16 | Sony Corporation | Transmitting apparatus and method of transmitting stereoscopic image data |
| KR101677527B1 (ko) * | 2009-09-22 | 2016-11-21 | Samsung Electronics Co., Ltd. | Method and apparatus for displaying video signals from a plurality of input sources |
| JP5577805B2 (ja) * | 2010-04-08 | 2014-08-27 | Sony Corporation | Information processing apparatus, information recording medium, information processing method, and program |
| US9237366B2 (en) * | 2010-04-16 | 2016-01-12 | Google Technology Holdings LLC | Method and apparatus for distribution of 3D television program materials |
| JP5754080B2 (ja) * | 2010-05-21 | 2015-07-22 | Sony Corporation | Data transmitting apparatus, data receiving apparatus, data transmitting method, and data receiving method |
-
2009
- 2009-10-09 US US12/588,302 patent/US8427469B2/en not_active Expired - Fee Related
- 2009-10-09 WO PCT/KR2009/005800 patent/WO2010041905A2/ko not_active Ceased
- 2009-10-09 CA CA2740139A patent/CA2740139C/en active Active
- 2009-10-09 KR KR1020117010190A patent/KR101664419B1/ko active Active
- 2009-10-09 EP EP09819412A patent/EP2343907A4/en not_active Withdrawn
-
2012
- 2012-07-06 US US13/543,268 patent/US20130002819A1/en not_active Abandoned
-
2013
- 2013-04-11 US US13/861,096 patent/US9204125B2/en not_active Expired - Fee Related
-
2015
- 2015-11-04 US US14/932,491 patent/US9762888B2/en not_active Expired - Fee Related
Non-Patent Citations (2)
| Title |
|---|
| None |
| See also references of EP2343907A4 |
Also Published As
| Publication number | Publication date |
|---|---|
| US8427469B2 (en) | 2013-04-23 |
| KR20110090914A (ko) | 2011-08-10 |
| EP2343907A4 (en) | 2012-06-20 |
| US9762888B2 (en) | 2017-09-12 |
| US9204125B2 (en) | 2015-12-01 |
| US20130002819A1 (en) | 2013-01-03 |
| US20100225645A1 (en) | 2010-09-09 |
| CA2740139C (en) | 2014-05-13 |
| EP2343907A2 (en) | 2011-07-13 |
| CA2740139A1 (en) | 2010-04-15 |
| WO2010041905A3 (ko) | 2010-07-15 |
| KR101664419B1 (ko) | 2016-10-10 |
| US20160080723A1 (en) | 2016-03-17 |
| US20130293674A1 (en) | 2013-11-07 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| WO2010041905A2 (ko) | | Receiving system and data processing method |
| WO2010041896A2 (en) | | Receiving system and method of processing data |
| WO2010087574A2 (en) | | Broadcast receiver and video data processing method thereof |
| WO2010126221A2 (en) | | Broadcast transmitter, broadcast receiver and 3D video data processing method thereof |
| WO2010117129A2 (en) | | Broadcast transmitter, broadcast receiver and 3D video data processing method thereof |
| WO2010071291A1 (ko) | | Method for processing a 3D video signal and image display apparatus implementing the same |
| WO2010147289A1 (en) | | Broadcast transmitter, broadcast receiver and 3D video processing method thereof |
| WO2010093115A2 (en) | | Broadcast receiver and 3D subtitle data processing method thereof |
| WO2013019042A1 (ко) | | Transmitting apparatus and method, and receiving apparatus and method, for providing a 3D service by linking a reference image transmitted in real time with separately transmitted additional images and content |
| US20140078248A1 (en) | | Transmitting apparatus, transmitting method, receiving apparatus, and receiving method |
| WO2010085074A2 (en) | | Three-dimensional subtitle display method and three-dimensional display device for implementing the same |
| WO2011084021A2 (ко) | | Broadcast receiver and 3D image display method |
| WO2011046271A1 (en) | | Broadcast receiver and 3D video data processing method thereof |
| WO2011049372A2 (en) | | Method and apparatus for generating stream and method and apparatus for processing stream |
| WO2011028024A2 (en) | | Cable broadcast receiver and 3D video data processing method thereof |
| WO2012070715A1 (ко) | | Method for providing and recognizing transmission modes in digital broadcasting |
| WO2012070716A1 (ко) | | Service-compatible transmission method in digital broadcasting |
| CN104081767A (zh) | | Transmitting apparatus, transmitting method, receiving apparatus, and receiving method |
| EP2645724A1 (en) | | Transmitting apparatus, transmitting method, receiving apparatus and receiving method |
| WO2012057564A2 (ко) | | Receiving apparatus and method for receiving a three-dimensional broadcast signal in a mobile environment |
| WO2009108028A9 (ко) | | Method and apparatus for decoding free-viewpoint video |
| WO2011087303A2 (ко) | | Broadcast signal receiver and video data processing method |
| WO2012074331A2 (ко) | | Method and apparatus for transmitting stereoscopic image information |
| WO2013058455A1 (ко) | | Apparatus and method for synchronizing images by adding synchronization information to the ancillary data space of a video signal |
| EP2656614A2 (en) | | Improvements to subtitles for three dimensional video transmission |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 09819412; Country of ref document: EP; Kind code of ref document: A2 |
| | WWE | Wipo information: entry into national phase | Ref document number: 2740139; Country of ref document: CA |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | REEP | Request for entry into the european phase | Ref document number: 2009819412; Country of ref document: EP |
| | WWE | Wipo information: entry into national phase | Ref document number: 2009819412; Country of ref document: EP |
| | ENP | Entry into the national phase | Ref document number: 20117010190; Country of ref document: KR; Kind code of ref document: A |