
US20100098155A1 - Parallel CABAC Decoding Using Entropy Slices - Google Patents


Info

Publication number
US20100098155A1
Authority
US
United States
Prior art keywords
entropy
context
entropy slice
context model
video
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US12/572,854
Inventor
Mehmet Umut Demircin
Madhukar Budagavi
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Texas Instruments Inc
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual
Priority to US12/572,854
Assigned to TEXAS INSTRUMENTS INCORPORATED (assignment of assignors interest). Assignors: BUDAGAVI, MADHUKAR; DEMIRCIN, MEHMET UMUT
Publication of US20100098155A1
Legal status: Abandoned

Classifications

    • HELECTRICITY
    • H03ELECTRONIC CIRCUITRY
    • H03MCODING; DECODING; CODE CONVERSION IN GENERAL
    • H03M7/00Conversion of a code where information is represented by a given sequence or number of digits to a code where the same, similar or subset of information is represented by a different sequence or number of digits
    • H03M7/30Compression; Expansion; Suppression of unnecessary data, e.g. redundancy reduction
    • H03M7/46Conversion to or from run-length codes, i.e. by representing the number of consecutive digits, or groups of digits, of the same kind by a code word and a digit indicative of that kind
    • H03M7/48Conversion to or from run-length codes, i.e. by representing the number of consecutive digits, or groups of digits, of the same kind by a code word and a digit indicative of that kind alternating with other codes during the code conversion process, e.g. run-length coding being performed only as long as sufficiently long runs of digits of the same kind are present
    • HELECTRICITY
    • H03ELECTRONIC CIRCUITRY
    • H03MCODING; DECODING; CODE CONVERSION IN GENERAL
    • H03M7/00Conversion of a code where information is represented by a given sequence or number of digits to a code where the same, similar or subset of information is represented by a different sequence or number of digits
    • H03M7/30Compression; Expansion; Suppression of unnecessary data, e.g. redundancy reduction
    • H03M7/40Conversion to or from variable length codes, e.g. Shannon-Fano code, Huffman code, Morse code
    • H03M7/4006Conversion to or from arithmetic code
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/10Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
    • H04N19/102Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the element, parameter or selection affected or controlled by the adaptive coding
    • H04N19/13Adaptive entropy coding, e.g. adaptive variable length coding [AVLC] or context adaptive binary arithmetic coding [CABAC]
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/60Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using transform coding
    • H04N19/61Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using transform coding in combination with predictive coding

Definitions

  • digital video is used in a wide range of applications, e.g., video communication, security and surveillance, industrial automation, and entertainment (e.g., DV, HDTV, satellite TV, set-top boxes, Internet video streaming, digital cameras, video jukeboxes, high-end displays and personal video recorders).
  • video applications are becoming increasingly mobile as a result of higher computation power in handsets, advances in battery technology, and high-speed wireless connectivity.
  • Video compression and decompression is an essential enabler for digital video products.
  • Compression-decompression (CODEC) algorithms enable storage and transmission of digital video.
  • examples of codecs include industry standards such as MPEG-2, MPEG-4, H.264/AVC, etc.
  • transform coding is used to remove spatial redundancy within each block.
  • Block motion compensation schemes basically assume that between successive pictures, i.e., frames, in a video sequence, an object in a scene undergoes a displacement in the x- and y-directions and these displacements define the components of a motion vector.
  • an object in one picture can be predicted from the object in a prior picture by using the motion vector of the object.
  • each frame is tiled into blocks often referred to as macroblocks.
  • Block-based motion estimation algorithms are used to generate a set of vectors to describe block motion flow between frames, thereby constructing a motion-compensated prediction of a frame.
  • the vectors are determined using block-matching procedures that identify, for each block in the current frame, the most similar block among those already encoded in prior frames.
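The block-matching procedure described above can be sketched as a full search minimizing the sum of absolute differences (SAD). This is a generic illustration, not the patent's specific method; `block_size` and `search_range` are hypothetical parameters.

```python
# Generic full-search block matching (illustrative only). Frames are 2D lists
# of pixel values indexed as frame[y][x].

def sad(cur, ref, cx, cy, rx, ry, block_size):
    """Sum of absolute differences between a block in the current frame
    (top-left at cx, cy) and a candidate block in the reference frame."""
    total = 0
    for dy in range(block_size):
        for dx in range(block_size):
            total += abs(cur[cy + dy][cx + dx] - ref[ry + dy][rx + dx])
    return total

def best_motion_vector(cur, ref, cx, cy, block_size=4, search_range=2):
    """Return the (mvx, mvy) displacement minimizing SAD within the window."""
    h, w = len(ref), len(ref[0])
    best = (0, 0)
    best_cost = float("inf")
    for mvy in range(-search_range, search_range + 1):
        for mvx in range(-search_range, search_range + 1):
            rx, ry = cx + mvx, cy + mvy
            # Only consider candidates fully inside the reference frame.
            if 0 <= rx and rx + block_size <= w and 0 <= ry and ry + block_size <= h:
                cost = sad(cur, ref, cx, cy, rx, ry, block_size)
                if cost < best_cost:
                    best_cost, best = cost, (mvx, mvy)
    return best
```

Real encoders use faster search patterns and rate-distortion-aware cost functions, but the principle of matching a current block against prior-frame candidates is the same.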
  • Context-adaptive binary arithmetic coding is a form of entropy coding used in video coding such as H.264/MPEG-4 AVC. As such it is an inherently lossless compression technique. It is notable for providing considerably better compression than most other encoding algorithms used in video encoding and is considered one of the primary advantages of the H.264/AVC encoding scheme.
  • CABAC is only supported in Main and higher profiles and requires a considerable amount of processing to decode compared to other similar algorithms.
  • CABAC achieves 9%-14% better compression compared to CAVLC, at the cost of increased complexity.
  • CABAC has multiple probability models for different contexts, i.e., multiple context models. It first converts all non-binary symbols to binary. Then, for each bit, the CABAC coder selects which probability model to use, then uses information from nearby elements to optimize the probability estimate. Arithmetic coding is then applied to compress the data.
  • Efficient coding of syntax element values in a hybrid block-based video coder can be achieved by employing a binarization scheme as a kind of preprocessing unit for the subsequent stages of context modeling and binary arithmetic coding.
  • a binarization scheme defines a unique mapping of syntax element values to sequences of binary decisions, so-called bins, which can also be interpreted in terms of a binary code tree.
  • each bin value is encoded by using the regular binary arithmetic-coding engine, where the associated probability model is either determined by a fixed choice, without any context modeling, or adaptively chosen depending on a related context model.
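As an illustration of binarization, the unary and truncated-unary schemes map a syntax element value to a sequence of bins; these are among the basic binarization schemes used in CABAC, though the exact mapping chosen for each syntax element is defined by the standard.

```python
# Minimal sketch of two common binarization schemes.

def unary_binarize(value):
    """Unary binarization: value N maps to N ones followed by a terminating
    zero. Each position in the bin string is one binary decision ('bin')."""
    return [1] * value + [0]

def truncated_unary_binarize(value, c_max):
    """Truncated unary: like unary, but the terminating zero is omitted when
    value == c_max, since the decoder already knows the maximum."""
    if value == c_max:
        return [1] * c_max
    return [1] * value + [0]
```

Each bin produced this way is then routed to the arithmetic coding engine with its associated context model, as described above.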
  • Context models are identified by a context index that is selected from 460 possible values (except High 4:4:4 Intra and High 4:4:4 Predictive profiles).
  • each context model is determined by two parameters, a probability state index (pStateIdx) representing the current estimate of the probability of the least probable symbol (LPS) and the binary value of the current most probable symbol (MPS). These two parameters are referred to as context state variables or context variables. Default initial values defined in the H.264 standard are used to initialize the context variables for a context model, and the variable values are updated after each bin is encoded.
  • encoding of the given bin value depends on the actual state of the associated adaptive probability model that is passed along with the bin value to the multiplication-free Modulo (M) coder, which is a table-based binary arithmetic coding engine used in CABAC.
  • Probability estimation in CABAC is based on a table-driven estimator in which each probability model can take one of 64 different states with associated probability values p ranging in the interval 0.0-0.5.
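A simplified sketch of the per-bin context update follows. The real H.264 coder drives pStateIdx through 64-entry transition tables (transIdxMPS/transIdxLPS) defined in the standard; here those tables are approximated by a simple increment/decrement so that the shape of the update rule is visible.

```python
# Simplified CABAC context-state update (the real transition tables from the
# H.264 standard are replaced by an approximate increment/decrement).

class ContextModel:
    def __init__(self, p_state_idx=0, val_mps=0):
        self.p_state_idx = p_state_idx  # 0..63, higher = lower LPS probability
        self.val_mps = val_mps          # current most probable symbol, 0 or 1

    def update(self, bin_val):
        if bin_val == self.val_mps:
            # MPS observed: move toward a more confident state.
            self.p_state_idx = min(self.p_state_idx + 1, 63)
        else:
            # LPS observed: in state 0 the MPS/LPS roles swap, otherwise
            # move toward a less confident state.
            if self.p_state_idx == 0:
                self.val_mps = 1 - self.val_mps
            else:
                self.p_state_idx = max(self.p_state_idx - 2, 0)
```

The key property, preserved here, is that the estimator adapts after every bin, which is exactly why context state carried across entropy slices matters for coding efficiency.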
  • entropy slices are described in Zhao, "Parallel Entropy Decoding for High Resolution Video Coding."
  • an entropy slice as described in Zhao is similar to the slice concept used in H.264 but it is only applied to entropy coding and decoding.
  • the Zhao entropy slice includes a sequence of entropy encoded macroblocks. Syntax, i.e., an entropy header, is included in the output bit stream of the entropy encoder to identify the start of each entropy slice. Further, each entropy slice is defined such that it can be decoded by a CABAC entropy decoder independent of other entropy slices.
  • Other specific features of the Zhao entropy slice are that in the CABAC entropy decoder, all context models are reinitialized to their initial default states at the beginning of each entropy slice. Further, during decoding, the context state is only updated within an entropy slice, and context model updates are not made across entropy slice boundaries. In addition, macroblocks in other entropy slices are marked as unavailable for the purpose of entropy decoding.
  • FIG. 1 shows a block diagram of a video encoding/decoding system in accordance with one or more embodiments of the invention
  • FIG. 2 shows a block diagram of a video encoder in accordance with one or more embodiments of the invention
  • FIG. 3 shows a block diagram of a video decoder in accordance with one or more embodiments of the invention
  • FIG. 4 shows a flow diagram of a method of CABAC encoding in accordance with one or more embodiments of the invention
  • FIGS. 5 and 6 show flow diagrams of methods of parallel CABAC decoding in accordance with one or more embodiments of the invention.
  • FIGS. 7-9 show block diagrams of illustrative digital systems in accordance with one or more embodiments of the invention.
  • the second reason is that the selection of context models for some syntax elements in CABAC, such as motion vector difference, is improved by using information from neighboring macroblocks.
  • the top neighbors of blocks in the upper row of an entropy slice are not available. As the size of an entropy slice is reduced, the percentage of blocks in the top row will increase, thus reducing the accuracy of the context models that rely on information from neighboring macroblocks.
  • Embodiments of the invention provide CABAC encoding and decoding based on entropy slices that may reduce the observed bit rate increases attributed to the above two reasons. More specifically, in some embodiments of the invention, additional information for context model initialization is included in the entropy header of an entropy slice by the CABAC entropy encoder. As is described in more detail below, a CABAC entropy decoder can use this additional information to initialize the state of selected context models rather than resetting the context models to their default initial states. Further, in some embodiments of the invention, the parallel decoding of entropy slices is structured such that information from previously decoded entropy slices may be used to estimate the initial context states for context models in subsequent entropy slices.
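The header-driven initialization just described can be pictured as follows: contexts listed in the entropy slice header are initialized from the transmitted state, and the rest fall back to the standard default initial values. This is a minimal sketch; the dictionary-based data structures and field names are illustrative, not the patent's syntax.

```python
# Minimal sketch of selective context initialization at the start of an
# entropy slice. Context state is a (pStateIdx, valMPS) pair per context index.

def init_contexts(defaults, header_init):
    """defaults: {ctx_idx: (pStateIdx, valMPS)} default initial values from
    the standard; header_init: overrides carried in the entropy slice header
    for a selected subset of contexts."""
    contexts = dict(defaults)       # start from the default initial states
    contexts.update(header_init)    # override only the transmitted subset
    return contexts
```

Contexts not mentioned in the header behave exactly as in the Zhao scheme (default reinitialization), so the decoder remains able to decode each entropy slice independently.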
  • FIG. 1 shows a block diagram of a video encoding/decoding system in accordance with one or more embodiments of the invention.
  • the video encoding/decoding system performs encoding and decoding of digital video sequences using methods for CABAC entropy encoding and decoding as described herein.
  • the system includes a source digital system ( 100 ) that transmits encoded video sequences to a destination digital system ( 102 ) via a communication channel ( 116 ).
  • the source digital system ( 100 ) includes a video capture component ( 104 ), a video encoder component ( 106 ), and a transmitter component ( 108 ).
  • the video capture component ( 104 ) is configured to provide a video sequence to be encoded by the video encoder component ( 106 ).
  • the video capture component ( 104 ) may be for example, a video camera, a video archive, or a video feed from a video content provider.
  • the video capture component ( 104 ) may generate computer graphics as the video sequence, or a combination of live video and computer-generated video.
  • the video encoder component ( 106 ) receives a video sequence from the video capture component ( 104 ) and encodes it for transmission by the transmitter component ( 108 ).
  • the video encoder component ( 106 ) receives the video sequence from the video capture component ( 104 ) as a sequence of video frames, divides the frames into coding units which may be a whole frame or a slice of a frame, divides the coding units into blocks of pixels, and encodes the video data in the coding units based on these blocks.
  • the video encoder ( 106 ) includes functionality to perform one or more embodiments of methods for CABAC entropy encoding as described herein.
  • the transmitter component ( 108 ) transmits the encoded video data to the destination digital system ( 102 ) via the communication channel ( 116 ).
  • the communication channel ( 116 ) may be any communication medium, or combination of communication media suitable for transmission of the encoded video sequence, such as, for example, wired or wireless communication media, a local area network, or a wide area network.
  • the video capture and encoding may take place at a different location and time than the transmission. For example, television programs and movies may be produced, encoded and stored on a disc or other storage devices. The stored movie or program may then be transmitted at a later time.
  • the destination digital system ( 102 ) includes a receiver component ( 110 ), a video decoder component ( 112 ) and a display component ( 114 ).
  • the receiver component ( 110 ) receives the encoded video data from the source digital system ( 100 ) via the communication channel ( 116 ) and provides the encoded video data to the video decoder component ( 112 ) for decoding.
  • the video decoder component ( 112 ) reverses the encoding process performed by the video encoder component ( 106 ) to reconstruct the frames of the video sequence.
  • the video decoder component ( 112 ) includes functionality to perform one or more embodiments of methods for CABAC entropy decoding as described herein.
  • the reconstructed video sequence may then be displayed on the display component ( 114 ).
  • the display component ( 114 ) may be any suitable display device such as, for example, a plasma display, a liquid crystal display (LCD), a light emitting diode (LED) display, etc.
  • the source digital system ( 100 ) may also include a receiver component and a video decoder component and/or the destination digital system ( 102 ) may include a transmitter component and a video encoder component for transmission of video sequences in both directions for video streaming, video broadcasting, and video telephony.
  • the video encoder component ( 106 ) and the video decoder component ( 112 ) may perform encoding and decoding in accordance with a video compression standard such as, for example, the Moving Picture Experts Group (MPEG) video compression standards, e.g., MPEG-1, MPEG-2, and MPEG-4, the ITU-T video compression standards, e.g., H.263 and H.264, the Society of Motion Picture and Television Engineers (SMPTE) 421M video CODEC standard (commonly referred to as "VC-1"), the video compression standard defined by the Audio Video Coding Standard Workgroup of China (commonly referred to as "AVS"), etc.
  • the video encoder component ( 106 ) and the video decoder component ( 112 ) may be implemented in any suitable combination of software, firmware, and hardware, such as, for example, one or more digital signal processors (DSPs), microprocessors, discrete logic, application specific integrated circuits (ASICs), etc.
  • FIG. 2 shows a block diagram of a video encoder, e.g., the video encoder ( 106 ), in accordance with one or more embodiments of the invention.
  • input frames ( 200 ) for encoding are provided as one input of a motion estimation component ( 220 ), as one input of an intraframe prediction component ( 224 ), and to a positive input of a combiner ( 202 ) (e.g., adder or subtractor or the like).
  • the frame storage component ( 218 ) provides reference data to the motion estimation component ( 220 ) and to the motion compensation component ( 222 ).
  • the reference data may include one or more previously encoded and decoded frames.
  • the motion estimation component ( 220 ) provides motion estimation information to the motion compensation component ( 222 ) and the entropy encoders ( 234 ). More specifically, the motion estimation component ( 220 ) performs tests based on the prediction modes to choose the best motion vector(s)/prediction mode. The motion estimation component ( 220 ) provides the selected motion vector (MV) or vectors and the selected prediction mode to the motion compensation component ( 222 ) and the selected motion vector (MV) to the entropy encoder component ( 234 ).
  • the motion compensation component ( 222 ) provides motion compensated prediction information to a selector switch ( 226 ) that includes motion compensated interframe prediction macroblocks (MBs).
  • the intraframe prediction component also provides intraframe prediction information to switch ( 226 ) that includes intraframe prediction MBs and a prediction mode. That is, similar to the motion estimation component ( 220 ), the intraframe prediction component performs tests based on prediction modes to choose the best prediction mode for generating the intraframe prediction MBs.
  • the switch ( 226 ) selects between the motion-compensated interframe prediction MBs from the motion compensation component ( 222 ) and the intraframe prediction MBs from the intraprediction component ( 224 ) based on the selected prediction mode.
  • the output of the switch ( 226 ), i.e., the selected prediction MB, is provided to a negative input of the combiner ( 202 ) and to a delay component ( 230 ).
  • the output of the delay component ( 230 ) is provided to another combiner (i.e., an adder) ( 238 ).
  • the combiner ( 202 ) subtracts the selected prediction MB from the current MB of the current input frame to provide a residual MB to the transform component ( 204 ).
  • the resulting residual MB is a set of pixel difference values that quantify differences between pixel values of the original MB and the prediction MB.
  • the transform component ( 204 ) performs a block transform such as DCT, on the residual MB to convert the residual pixel values to transform coefficients and outputs the transform coefficients.
  • the transform coefficients are provided to a quantization component ( 206 ) which outputs quantized transform coefficients. Because the DCT transform redistributes the energy of the residual signal into the frequency domain, the quantized transform coefficients are taken out of their raster-scan ordering by a scan component ( 208 ) and arranged by significance, generally beginning with the more significant coefficients followed by the less significant. The ordered quantized transform coefficients are then provided to the entropy encoder component ( 234 ).
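The reordering by significance is conventionally done with a zigzag scan. The sketch below shows the familiar 4x4 zigzag order as an illustration of the scan component's role; it is not a claim about the particular scan used in the patent.

```python
# Zigzag scan of a 4x4 coefficient block: raster-order index of each scan
# position, walking the anti-diagonals so low-frequency (more significant)
# coefficients come first.
ZIGZAG_4x4 = [0, 1, 4, 8, 5, 2, 3, 6, 9, 12, 13, 10, 7, 11, 14, 15]

def zigzag_scan(block):
    """block: 16 quantized coefficients in raster order; returns scan order."""
    return [block[i] for i in ZIGZAG_4x4]
```

After this reordering, runs of trailing zeros cluster at the end of the scan, which is what makes the subsequent entropy coding effective.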
  • the entropy encoder component ( 234 ) performs entropy encoding on encoded macroblocks to generate a compressed bit stream ( 236 ) for transmission or storage.
  • the entropy encoder component ( 234 ) may include functionality to perform one or more of any suitable entropy encoding techniques, such as, for example, context adaptive variable length coding (CAVLC), context adaptive binary arithmetic coding (CABAC), run length coding, etc.
  • the entropy encoder component ( 234 ) includes functionality to perform one or more embodiments of methods for CABAC entropy encoding as described herein.
  • Inside every encoder is an embedded decoder. As any compliant decoder is expected to reconstruct an image from a compressed bit stream, the embedded decoder provides the same utility to the video encoder. Knowledge of the reconstructed input allows the video encoder to transmit the appropriate residual energy to compose subsequent frames. To determine the reconstructed input, the ordered quantized transform coefficients provided via the scan component ( 208 ) are returned to their original post-DCT arrangement by an inverse scan component ( 210 ), the output of which is provided to a dequantize component ( 212 ), which outputs estimated transformed information, i.e., an estimated or reconstructed version of the transform result from the transform component ( 204 ).
  • the estimated transformed information is provided to the inverse transform component ( 214 ), which outputs estimated residual information which represents a reconstructed version of the residual MB.
  • the reconstructed residual MB is provided to the combiner ( 238 ).
  • the combiner ( 238 ) adds the delayed selected predicted MB to the reconstructed residual MB to generate an unfiltered reconstructed MB, which becomes part of reconstructed frame information.
  • the reconstructed frame information is provided via a buffer ( 228 ) to the intraframe prediction component ( 224 ) and to a filter component ( 216 ).
  • the filter component ( 216 ) is a deblocking filter which filters the reconstructed frame information and provides filtered reconstructed frames to frame storage component ( 218 ).
  • FIG. 3 shows a block diagram of a video decoder, e.g., the video decoder ( 112 ), in accordance with one or more embodiments of the invention.
  • the entropy decoding component 300 receives an entropy encoded video bit stream and reverses the entropy encoding to recover the encoded macroblocks.
  • the entropy decoding performed by the entropy decoder component ( 300 ) may include functionality to perform one or more of any suitable entropy decoding techniques, such as, for example, context adaptive variable length decoding (CAVLC), context adaptive binary arithmetic decoding (CABAC), run length decoding, etc.
  • the entropy decoder component ( 300 ) includes functionality to perform one or more embodiments of methods for CABAC entropy decoding as described herein.
  • the inverse scan and dequantization component ( 302 ) assembles the macroblocks in the video bit stream in raster scan order and substantially recovers the original frequency domain data.
  • the inverse transform component ( 304 ) transforms the frequency domain data from inverse scan and dequantization component ( 302 ) back to the spatial domain. This spatial domain data supplies one input of the addition component ( 306 ).
  • the other input of addition component ( 306 ) comes from the macroblock mode switch ( 308 ).
  • the macroblock mode switch ( 308 ) selects the output of the motion compensation component ( 310 ).
  • the motion compensation component ( 310 ) receives reference frames from frame storage ( 312 ) and applies the motion compensation computed by the encoder and transmitted in the encoded video bit stream.
  • the macroblock mode switch ( 308 ) selects the output of the intra-prediction component ( 314 ).
  • the intra-prediction component ( 314 ) applies the intra-prediction computed by the encoder and transmitted in the encoded video bit stream.
  • the addition component ( 306 ) recovers the predicted frame.
  • the output of addition component ( 306 ) supplies the input of the deblocking filter component ( 316 ).
  • the deblocking filter component ( 316 ) smoothes artifacts created by the block and macroblock nature of the encoding process to improve the visual quality of the decoded frame.
  • the deblocking filter component ( 316 ) applies a macroblock-based loop filter for regular decoding to maximize performance and applies a frame-based loop filter for frames encoded using flexible macroblock ordering (FMO) and for frames encoded using arbitrary slice order (ASO).
  • the macroblock-based loop filter is performed after each macroblock is decoded, while the frame-based loop filter delays filtering until all macroblocks in the frame have been decoded.
  • a deblocking filter processes pixels across macroblock boundaries
  • the neighboring macroblocks are decoded before the filtering is applied.
  • performing the loop filter as each macroblock is decoded has the advantage of processing the pixels while they are in on-chip memory, rather than writing out pixels and reading them back in later, which consumes more power and adds delay.
  • macroblocks are decoded out of order, as with FMO or ASO, the pixels from neighboring macroblocks may not be available when the macroblock is decoded; in this case, macroblock-based loop filtering cannot be performed.
  • the loop filtering is delayed until after all macroblocks are decoded for the frame, and the pixels must be reread in a second pass to perform frame-based loop filtering.
  • the output of the deblocking filter component ( 316 ) is the decoded frames of the video bit stream. Each decoded frame is stored in frame storage ( 312 ) to be used as a reference frame.
  • FIG. 4 shows a flow diagram of a method of CABAC entropy encoding in accordance with one or more embodiments of the invention. More specifically, the method illustrates the generation of entropy slices as a part of CABAC entropy encoding.
  • slices of video data are partitioned into one or more entropy slices during CABAC entropy encoding of the syntax element values of the slices.
  • a slice may be a subset of macroblocks in a picture (frame) or may be the entire picture.
  • a syntax element value of a slice is entropy encoded using CABAC ( 400 ) to generate a bin string representing the syntax element.
  • Consecutive syntax element values are encoded until either sufficient syntax elements have been encoded to fulfill the size criteria for an entropy slice or the last syntax element value in a slice is encoded ( 402 ).
  • the size of an entropy slice may be set, for example, as some number of macroblocks or a maximum number of bits.
  • an entropy slice header is generated for the entropy slice ( 404 ).
  • the entropy slice header includes information identifying the header as that of an entropy slice and context model initialization information to be used by a CABAC entropy decoder to initialize context models prior to decoding the entropy slice.
  • the context model initialization information may be, for example, initial values for context variables or parameters that can be used to calculate the initial values for context variables for selected context models. The generation of the context model initialization information is described in more detail below.
  • the entropy slice header and the entropy encoded syntax element values are then output in a bit stream ( 406 ). The process is repeated until all syntax element values in the slice are entropy encoded and included in an entropy slice ( 408 ).
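The loop of FIG. 4 can be sketched roughly as follows. `encode_element`, the header fields, and the element-count size criterion are stand-ins for the patent's actual syntax, which (as noted above) may instead use a macroblock count or a maximum number of bits.

```python
# Hypothetical sketch of the entropy-slice partitioning loop of FIG. 4.
# Reference numerals in comments match the figure's steps.

def partition_into_entropy_slices(syntax_elements, max_elems_per_slice, encode_element):
    bitstream = []
    payload = []
    for elem in syntax_elements:
        payload.append(encode_element(elem))      # CABAC-encode one element (400)
        if len(payload) >= max_elems_per_slice:   # size criterion met (402)
            header = {"type": "entropy_slice",
                      "ctx_init": "..."}          # generate header (404)
            bitstream.append({"header": header,
                              "payload": payload})  # output to bit stream (406)
            payload = []
    if payload:                                   # last, possibly short, slice (408)
        bitstream.append({"header": {"type": "entropy_slice", "ctx_init": "..."},
                          "payload": payload})
    return bitstream
```

A usage example: partitioning five elements with a two-element limit yields three entropy slices, the last holding the single remaining element.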
  • the context model initialization information included in the entropy slice header is initialization information for a subset of the context models.
  • the size of the subset may be set to be very small compared to the entire context space and may be chosen as a tradeoff between providing more initialization information to improve the throughput of the entropy decoder and potential bit rate increase due to including initialization information for more context models.
  • the selection of which context models to include in the subset may be based on which of the context models is most frequently used. In other words, context initialization information for the more frequently used context models will be included in the subset in preference to context initialization information for less frequently used context models. For example, coefficient level syntax element bin strings account for a large portion of video bit streams. Therefore, context models corresponding to coefficient levels will be used most frequently in decoding of entropy slices and the corresponding context models may be given preference for inclusion in the subset of context models for which context initialization information is provided in an entropy slice header.
  • the context models included in the subset are statically defined. More specifically, when an entropy slice header is generated, the context model initialization information for a fixed, predefined subset of context models is included in the header.
  • the fixed, predefined subset of context models may vary by slice type (I-slice, B-slice, P-slice).
  • the subset of context models for a B-slice or a P-slice may include context models related to motion vectors while the subset for an I-slice may not.
  • the optimal context subsets may be determined empirically by statistical analysis of video sequences at various resolutions, bit rates, and configurations.
  • the context models included in the subset are dynamically selected.
  • the entropy encoder may keep track of how frequently each of the possible context models is used while encoding the syntax element values for an entropy slice. The most frequently used context models (based on a threshold) may then be selected for inclusion in the subset of context models for which context initialization information is provided in the entropy slice header.
  • context models in which the context variable values have deviated significantly (based on a threshold) from their initial default values during the encoding of the syntax element values in the entropy slice may be selected for inclusion in the subset.
  • the maximum size of the subset may be fixed even though the actual context models to be included in the subset may change dynamically.
  • the context indices for each of the selected context models are included in the entropy slice header along with the context model initialization information.
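The frequency-based dynamic selection described above might be sketched as follows; the subset size and usage threshold are hypothetical tuning parameters, and the usage counts would in practice be gathered while encoding the slice.

```python
# Sketch of dynamic context-subset selection: count how often each context
# index was used while encoding an entropy slice, then pick the most frequent
# indices for inclusion in the entropy slice header.
from collections import Counter

def select_context_subset(used_context_indices, subset_size=2, min_uses=1):
    """used_context_indices: one entry per bin encoded, giving the context
    index selected for that bin. Returns the sorted subset of indices whose
    initialization info would go in the header."""
    counts = Counter(used_context_indices)
    frequent = [idx for idx, n in counts.most_common() if n >= min_uses]
    return sorted(frequent[:subset_size])
```

A static subset (e.g., coefficient-level contexts, per slice type) could simply be a hard-coded list merged with this dynamically chosen one, matching the mixed static/dynamic option described below.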
  • the context models included in the subset may be both statically and dynamically defined. For example, a few context models may always be included in the subset while others may be dynamically selected.
  • the context initialization information included in the entropy slice header may be compressed.
  • the context variable values for a context model may be defined with a fixed number of bits, e.g., seven bits as in H.264.
  • the difference between the fixed bit value of the default initial context variable values defined for the context model and the fixed bit value of the context variable values after encoding of syntax element values in an entropy slice may be computed and only the difference included in the entropy slice header.
  • the difference values are also quantized and compressed using any appropriate known simple entropy encoding technique.
  • the context indices for the selected context models are also included in the entropy slice header.
  • the simple entropy encoding technique may also be used to compress the context indices.
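The difference coding of context initialization information described above might look like the following sketch. The seven-bit context state width follows H.264's context variable representation; the quantization step and the tuple layout of the header payload are illustrative assumptions, and the final simple entropy coding of the differences is omitted.

```python
# Illustrative sketch of difference coding of context initialization
# information for the entropy slice header. Only the difference between
# the default initial state and the final encoder state is signaled.

def encode_context_init(selected, final_states, default_states, q_step=2):
    """Produce (context index, quantized state difference) pairs; the
    pairs would subsequently be entropy coded into the header."""
    payload = []
    for ctx_idx in selected:
        diff = final_states[ctx_idx] - default_states[ctx_idx]
        payload.append((ctx_idx, diff // q_step))  # coarse quantization
    return payload

def decode_context_init(payload, default_states, q_step=2):
    """Reconstruct approximate context states from the header payload;
    contexts absent from the payload keep their default states."""
    states = dict(default_states)
    for ctx_idx, q_diff in payload:
        states[ctx_idx] = default_states[ctx_idx] + q_diff * q_step
    return states
```

Because the differences are quantized, the decoder recovers an approximation of the encoder's final context states, which is still expected to track the true probabilities more closely than the default initial states would.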
  • FIG. 5 shows a flow diagram of a method of parallel CABAC entropy decoding in accordance with one or more embodiments of the invention. More specifically, the method illustrates the parallel decoding of entropy slices in a slice as a part of CABAC entropy decoding.
  • the entropy slice headers include context model initialization information.
  • the method as depicted in FIG. 5 assumes that some number N of entropy slices may be decoded in parallel. The value of the number N depends on, among other things, the processing capabilities of a digital system, i.e., the number of processing units available for parallel processing, on which the method is implemented.
  • a bit stream of encoded video data is parsed to identify entropy slice headers in a slice ( 500 ).
  • the decoding of the entropy slice is delegated to one of the N processing units.
  • the delegation may be to an idle processing unit and/or the delegation may take the form of enqueuing the entropy slice for decoding on the next available processing unit. Accordingly, an initial entropy slice in the bit stream can be delegated to a processing unit for decoding and the decoding initiated before the slice header of the next entropy slice is identified and the associated entropy slice is delegated to another processing unit for decoding.
  • Each identified entropy slice is then entropy decoded.
  • context models are initialized using the context model initialization information in the respective entropy slice header ( 502 a , 502 b , 502 c ). More specifically, the context model initialization information in the entropy slice header is parsed and used to initialize corresponding context models.
  • the context model initialization information in the entropy slice header is for a statically defined subset of the context models. In some embodiments of the invention, the context model initialization information in the entropy slice header is for a subset of the context models dynamically selected by the entropy encoder. In such embodiments, the respective entropy slice headers are parsed to determine the context indices of the context models for which context model initialization information is provided. Further, in one or more embodiments of the invention, the context model initialization information and context indices, if present, are compressed as previously described. In such embodiments, the context model initialization information and context indices, if present, are decompressed.
  • context models that were not included in the subset of context models for which context model initialization information is provided in the entropy slice header are reset to default initial states.
  • estimates of context model initialization information of such context models are generated based on the context model initialization information included in the entropy slice header. The estimates may be generated, for example, based on identified correlations between the evolution of probability models in different contexts. Such correlations may be characterized empirically by analysis of CABAC encoding of video sequences at various resolutions, bit rates, and configurations.
  • the entropy encoded syntax element values in the respective entropy slices are decoded ( 504 a , 504 b , 504 c ) and the decoded syntax element values are output ( 506 a , 506 b , 506 c ). The process is repeated until all entropy slices in the slice are decoded ( 508 ).
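The flow of FIG. 5 can be sketched with a worker pool. The slice representation, the placeholder decode routine, and the pool size are assumptions for illustration; a real decoder would run the CABAC engine over each slice payload rather than the stand-in shown here.

```python
# Hypothetical sketch of the FIG. 5 flow: entropy slices identified in
# the bit stream are delegated to a pool of N workers, each of which
# first initializes its context models from the entropy slice header.
from concurrent.futures import ThreadPoolExecutor

def decode_entropy_slice(slice_data, default_states):
    # Initialize contexts from header info; contexts absent from the
    # signaled subset fall back to default initial states (steps 502).
    contexts = dict(default_states)
    contexts.update(slice_data["header_init"])
    # Placeholder for CABAC bin decoding of the payload (steps 504).
    return [f"syntax_element_{b}" for b in slice_data["payload"]]

def parallel_decode(entropy_slices, default_states, n_workers=4):
    with ThreadPoolExecutor(max_workers=n_workers) as pool:
        futures = [pool.submit(decode_entropy_slice, s, default_states)
                   for s in entropy_slices]
        # Collect decoded outputs in bit stream order (steps 506, 508).
        return [f.result() for f in futures]
```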
  • FIG. 6 shows a flow diagram of a method of parallel CABAC entropy decoding in accordance with one or more embodiments of the invention. More specifically, the method illustrates the parallel decoding of entropy slices in a slice as a part of CABAC entropy decoding.
  • the entropy slice headers do not include context model initialization information as previously described. In other embodiments of the invention, the entropy slice headers do include context model initialization information.
  • the method assumes a limitation M on the maximum number of entropy slices that may be entropy decoded in parallel. This limitation may be set by the video encoder that generated the bit stream to be decoded and communicated in the bit stream.
  • an entropy slice x+M can be allowed to have some dependency on one or more of the entropy slices 1 . . . x.
  • the context models for entropy slice x+M can be initialized with estimated values based on the final states of the context models from one or more, i.e., a subset, of entropy slices 1 . . . x.
  • the subset of the M entropy slices on which the context initialization of an entropy slice may depend is referred to as the reference entropy slice subset.
  • in addition to communicating the maximum number M of entropy slices that may be entropy decoded in parallel, the video encoder also identifies in the entropy slice header of an entropy slice x+M which of the preceding entropy slices 1 . . . x are included in the reference entropy slice subset.
  • N entropy slices may be decoded in parallel.
  • the value of the number N depends on, among other things, the processing capabilities of a digital system, i.e., the number of processing units available for parallel processing, on which the method is implemented.
  • the value of N is less than or equal to M. That is, even if the digital system has more than M processing units, only M entropy slices are permitted to be decoded in parallel.
  • a bit stream of encoded video data is parsed to identify entropy slice headers in a slice ( 600 ).
  • the decoding of the entropy slice is delegated to one of the N processing units.
  • the delegation may be to an idle processing unit and/or the delegation may take the form of enqueuing the entropy slice for decoding on the next available processing unit.
  • the delegation may also be scheduled to ensure that the decoding of an entropy slice is not started until the entropy slices in the reference entropy slice subset of the entropy slice are decoded.
  • Each identified entropy slice is then entropy decoded.
  • the context models are initialized ( 602 a , 602 b , 602 c ). If the entropy slice is not one of the first M entropy slices in the bit stream, the context models are initialized using context model initialization information estimated from the final states of the context models of the entropy slices in the reference entropy slice subset of the entropy slice.
  • the context model initialization information may be estimated, for example, by using the weighted combination of the probability models resulting from decoding the entropy slices in the reference entropy slice subset.
  • Combination weights can be calculated, for example, using the spatial pixel distance in a picture between an entropy slice and the reference entropy slices. If the entropy slice is one of the first M entropy slices, the context models are initialized in some other manner. In some embodiments of the invention, the context models for an entropy slice in the first M entropy slices are initialized using default initialization values. In some embodiments of the invention, the context models for an entropy slice in the first M entropy slices are initialized using context model initialization information from the respective entropy slice headers as previously described.
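The weighted combination described above might be sketched as follows, using inverse spatial distance as the combination weight. The inverse-distance weighting and the rounding of the blended state are illustrative assumptions.

```python
# Illustrative sketch: estimate context initialization for an entropy
# slice from the final context states of its reference entropy slice
# subset, weighting each reference by inverse spatial pixel distance.

def estimate_context_init(reference_states, distances):
    """reference_states: one {context index: state} dict per reference
    entropy slice; distances: spatial pixel distance to each reference."""
    weights = [1.0 / d for d in distances]
    total = sum(weights)
    weights = [w / total for w in weights]  # normalize to sum to 1
    estimate = {}
    for ctx_idx in reference_states[0]:
        blended = sum(w * states[ctx_idx]
                      for w, states in zip(weights, reference_states))
        estimate[ctx_idx] = round(blended)
    return estimate
```

Nearer reference entropy slices thus contribute more to the estimate, reflecting the assumption that spatially close regions have similar symbol statistics.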
  • the entropy encoded syntax element values in the entropy slice are decoded ( 604 a , 604 b , 604 c ) and the decoded syntax element values are output ( 606 a , 606 b , 606 c ). The process is repeated until all entropy slices in the slice are decoded ( 608 ).
  • FIG. 7 shows a digital system ( 700 ) (e.g., a personal computer) that includes a processor ( 702 ), associated memory ( 704 ), a storage device ( 706 ), and numerous other elements and functionalities typical of digital systems (not shown).
  • a digital system may include multiple processors and/or one or more of the processors may be digital signal processors.
  • the digital system ( 700 ) may also include input means, such as a keyboard ( 708 ) and a mouse ( 710 ) (or other cursor control device), and output means, such as a monitor ( 712 ) (or other display device).
  • the digital system ( 700 ) may also include an image capture device (not shown) that includes circuitry (e.g., optics, a sensor, readout electronics) for capturing video sequences.
  • the digital system ( 700 ) may be connected to a network ( 714 ) (e.g., a local area network (LAN), a wide area network (WAN) such as the Internet, a cellular network, any other similar type of network and/or any combination thereof) via a network interface connection ( 720 ).
  • Video image data may be received via the network.
  • these input and output means may take other forms.
  • the digital system ( 700 ) may include a video encoder and/or video decoder with functionality to perform, respectively, embodiments of methods for CABAC encoding and parallel CABAC decoding as described herein.
  • the video decoder may be configured to decode video image data received over a network or from storage media coupled to storage module 706 .
  • the digital system ( 700 ) may be further configured to display the decoded video data stream, such as a movie or other type of video images, on monitor 712 .
  • one or more elements of the aforementioned digital system ( 700 ) may be located at a remote location and connected to the other elements over a network. Further, embodiments of the invention may be implemented on a distributed system having a plurality of nodes, where each portion of the system and software instructions may be located on a different node within the distributed system.
  • the node may be a digital system.
  • the node may be a processor with associated physical memory.
  • the node may alternatively be a processor with shared memory and/or resources.
  • Software instructions to perform embodiments of the invention may be stored on a computer readable medium such as a compact disc (CD), a diskette, a tape, a file, or any other computer readable storage device.
  • the software instructions may be distributed to the digital system ( 700 ) via removable memory (e.g., floppy disk, optical disk, flash memory, USB key), via a transmission path, etc.
  • FIG. 8 is a block diagram of a digital system (e.g., a mobile cellular telephone) ( 800 ) that may be configured to perform the methods described herein.
  • the signal processing unit (SPU) ( 802 ) includes a digital signal processor (DSP) system that includes embedded memory and security features.
  • the analog baseband unit ( 804 ) receives a voice data stream from handset microphone ( 813 a ) and sends a voice data stream to the handset mono speaker ( 813 b ).
  • the analog baseband unit ( 804 ) also receives a voice data stream from the microphone ( 814 a ) and sends a voice data stream to the mono headset ( 814 b ).
  • the analog baseband unit ( 804 ) and the SPU ( 802 ) may be separate ICs.
  • the analog baseband unit ( 804 ) does not embed a programmable processor core, but performs processing based on the configuration of audio paths, filters, gains, etc., being set up by software running on the SPU ( 802 ).
  • the analog baseband processing is performed on the same processor and can send information to it for interaction with a user of the digital system ( 800 ) during call processing or other processing.
  • the display ( 820 ) may also display pictures and video streams received from the network, from a local camera ( 828 ), or from other sources such as the USB ( 826 ) or the memory ( 812 ).
  • the SPU ( 802 ) may also send a video stream to the display ( 820 ) that is received from various sources such as the cellular network via the RF transceiver ( 806 ) or the camera ( 828 ).
  • the SPU ( 802 ) may also send a video stream to an external video display unit via the encoder ( 822 ) over a composite output terminal ( 824 ).
  • the encoder unit ( 822 ) may provide encoding according to PAL/SECAM/NTSC video standards.
  • the SPU ( 802 ) includes functionality to perform the computational operations required for video compression and decompression.
  • the video compression standards supported may include, for example, one or more of the JPEG standards, the MPEG standards, and the H.26x standards.
  • the SPU ( 802 ) is configured to perform the computational operations of one or more of the methods described herein.
  • Software instructions implementing aspects of the methods may be stored in the memory ( 812 ) and executed by the SPU ( 802 ) during encoding and decoding of video sequences.
  • FIG. 9 shows a digital system suitable for an embedded system (e.g., a digital camera) in accordance with one or more embodiments of the invention that includes, among other components, a DSP-based image coprocessor (ICP) ( 902 ), a RISC processor ( 904 ), and a video processing engine (VPE) ( 906 ) that may be configured to perform methods for digital image data compression and decompression described herein.
  • the RISC processor ( 904 ) may be any suitably configured RISC processor.
  • the VPE ( 906 ) includes a configurable video processing front-end (Video FE) ( 908 ) input interface used for video capture from imaging peripherals such as image sensors, video decoders, etc., a configurable video processing back-end (Video BE) ( 910 ) output interface used for display devices such as SDTV displays, digital LCD panels, HDTV video encoders, etc., and a memory interface ( 924 ) shared by the Video FE ( 908 ) and the Video BE ( 910 ).
  • the digital system also includes peripheral interfaces ( 912 ) for various peripherals that may include a multi-media card, an audio serial port, a Universal Serial Bus (USB) controller, a serial port interface, etc.
  • the Video FE ( 908 ) includes an image signal processor (ISP) ( 916 ), and a 3A statistic generator (3A) ( 918 ).
  • the ISP ( 916 ) provides an interface to image sensors and digital video sources. More specifically, the ISP ( 916 ) may accept raw image/video data from a sensor (CMOS or CCD) and can accept YUV video data in numerous formats.
  • the ISP ( 916 ) also includes a parameterized image processing module with functionality to generate image data in a color format (e.g., RGB) from raw CCD/CMOS data.
  • the ISP ( 916 ) is customizable for each sensor type and supports video frame rates for preview displays of captured digital images and for video recording modes.
  • the ISP ( 916 ) also includes, among other functionality, an image resizer, statistics collection functionality, and a boundary signal calculator.
  • the 3A module ( 918 ) includes functionality to support control loops for auto focus, auto white balance, and auto exposure by collecting metrics on the raw image data from the ISP ( 916 ) or external memory.
  • the Video BE ( 910 ) includes an on-screen display engine (OSD) ( 920 ) and a video analog encoder (VAC) ( 922 ).
  • the OSD engine ( 920 ) includes functionality to manage display data in various formats for several different types of hardware display windows and it also handles gathering and blending of video data and display/bitmap data into a single display window before providing the data to the VAC ( 922 ) in YCbCr format.
  • the VAC ( 922 ) includes functionality to take the display frame from the OSD engine ( 920 ) and format it into the desired output format and output signals required to interface to display devices.
  • the VAC ( 922 ) may interface to composite NTSC/PAL video devices, S-Video devices, digital LCD devices, high-definition video encoders, DVI/HDMI devices, etc.
  • the memory interface ( 924 ) functions as the primary source and sink to modules in the Video FE ( 908 ) and the Video BE ( 910 ) that are requesting and/or transferring data to/from external memory.
  • the memory interface ( 924 ) includes read and write buffers and arbitration logic.
  • the ICP ( 902 ) includes functionality to perform the computational operations required for video encoding and decoding and other processing of captured images.
  • the video encoding standards supported may include one or more of the JPEG standards, the MPEG standards, and the H.26x standards.
  • the ICP ( 902 ) is configured to perform the computational operations of embodiments of the methods described herein.
  • video signals are received by the video FE ( 908 ) and converted to the input format needed to perform video encoding.
  • the video data generated by the video FE ( 908 ) is then stored in external memory.
  • the video data is then encoded by a video encoder.
  • As the video data is encoded, a method for CABAC entropy encoding as described herein is applied.
  • the resulting encoded video data is stored in the external memory.
  • the encoded video data may then be read from the external memory, decoded by a video decoder that implements a method for parallel CABAC entropy decoding as described herein, and post-processed by the video BE ( 910 ) to display the image/video sequence.
  • Embodiments of the decoders and methods described herein may be provided on any of several types of digital systems: digital signal processors (DSPs), general purpose programmable processors, application specific circuits, or systems on a chip (SoC) such as combinations of a DSP and a reduced instruction set (RISC) processor together with various specialized accelerators.
  • a stored program in an onboard or external (flash EEP) ROM or FRAM may be used to implement aspects of the video signal processing.
  • Analog-to-digital converters and digital-to-analog converters provide coupling to the real world, while modulators and demodulators (plus antennas for air interfaces) can provide coupling for waveform reception of video data being broadcast over the air by satellite, TV stations, cellular networks, etc., or via wired networks such as the Internet.
  • the techniques described in this disclosure may be implemented in hardware, software, firmware, or any combination thereof. If implemented in software, the software may be executed in one or more processors, such as a microprocessor, application specific integrated circuit (ASIC), field programmable gate array (FPGA), or digital signal processor (DSP).
  • the software that executes the techniques may be initially stored in a computer-readable medium such as a compact disc (CD), a diskette, a tape, a file, memory, or any other computer readable storage device and loaded and executed in the processor.
  • the software may also be sold in a computer program product, which includes the computer-readable medium and packaging materials for the computer-readable medium.
  • the software instructions may be distributed via removable computer readable media (e.g., floppy disk, optical disk, flash memory, USB key), via a transmission path from computer readable media on another digital system, etc.
  • Embodiments of the methods and video decoders for performing parallel bin decoding as described herein may be implemented for virtually any type of digital system (e.g., a desktop computer, a laptop computer, a set-top box for satellite or cable, a handheld device such as a mobile (i.e., cellular) phone, a personal digital assistant, a digital camera, etc.) with functionality to decode digital video images.


Abstract

A method of video encoding is provided that includes performing context-adaptive binary arithmetic coding (CABAC) on a plurality of syntax element values in a slice to generate a plurality of entropy-encoded syntax element values, generating an entropy slice header to identify the plurality of entropy-encoded syntax element values as an entropy slice, wherein the entropy slice header comprises context model initialization information, and outputting the entropy slice header and the plurality of entropy-encoded syntax element values.

Description

    CLAIM OF PRIORITY UNDER 35 U.S.C. 119(e)
  • The present application claims priority to U.S. Provisional Patent Application No. 61/106,323, filed Oct. 17, 2008, entitled “Method and Apparatus for Video Processing in Context-Adaptive Binary Arithmetic Coding,” which is incorporated by reference herein in its entirety.
  • BACKGROUND OF THE INVENTION
  • The demand for digital video products continues to increase. Some examples of applications for digital video include video communication, security and surveillance, industrial automation, and entertainment (e.g., DV, HDTV, satellite TV, set-top boxes, Internet video streaming, digital cameras, video jukeboxes, high-end displays and personal video recorders). Further, video applications are becoming increasingly mobile as a result of higher computation power in handsets, advances in battery technology, and high-speed wireless connectivity.
  • Video compression and decompression is an essential enabler for digital video products. Compression-decompression (CODEC) algorithms enable storage and transmission of digital video. Typically codecs are industry standards such as MPEG-2, MPEG-4, H.264/AVC, etc. At the core of all of these standards is the hybrid video coding technique of block motion compensation (prediction) plus transform coding of prediction error. Block motion compensation is used to remove temporal redundancy between successive pictures (frames or fields) by prediction from prior pictures, whereas transform coding is used to remove spatial redundancy within each block.
  • Many block motion compensation schemes basically assume that between successive pictures, i.e., frames, in a video sequence, an object in a scene undergoes a displacement in the x- and y-directions and these displacements define the components of a motion vector. Thus, an object in one picture can be predicted from the object in a prior picture by using the motion vector of the object. To track visual differences from frame-to-frame, each frame is tiled into blocks often referred to as macroblocks. Block-based motion estimation algorithms are used to generate a set of vectors to describe block motion flow between frames, thereby constructing a motion-compensated prediction of a frame. The vectors are determined using block-matching procedures that try to match blocks in the current frame with the most similar blocks that have already been encoded in prior frames.
  • Context-adaptive binary arithmetic coding (CABAC) is a form of entropy coding used in video coding standards such as H.264/MPEG-4 AVC. As such it is an inherently lossless compression technique. It is notable for providing considerably better compression than most other encoding algorithms used in video encoding and is considered one of the primary advantages of the H.264/AVC encoding scheme. CABAC is only supported in Main and higher profiles and requires a considerable amount of processing to decode compared to other similar algorithms. As a result, context-adaptive variable-length coding (CAVLC), a lower efficiency entropy encoding scheme, is sometimes used instead to increase performance on slower playback devices. CABAC achieves 9%-14% better compression compared to CAVLC, at the cost of increased complexity.
  • The theory and operation of CABAC encoding for H.264 is fully defined in the International Telecommunication Union, Telecommunication Standardization Sector (ITU-T) standard “Advanced video coding for generic audiovisual services” H.264, revision 03/2005 or later. General principles are explained in detail in “Context-Based Adaptive Binary Arithmetic Coding in the H.264/AVC Video Compression Standard,” Detlev Marpe, July 2003. In brief, CABAC has multiple probability models for different contexts, i.e., multiple context models. It first converts all non-binary symbols to binary. Then, for each bit, the CABAC coder selects which probability model to use, then uses information from nearby elements to optimize the probability estimate. Arithmetic coding is then applied to compress the data.
  • Efficient coding of syntax element values in a hybrid block-based video coder, such as components of motion vector differences or transform-coefficient level values, can be achieved by employing a binarization scheme as a kind of preprocessing unit for the subsequent stages of context modeling and binary arithmetic coding. In general, a binarization scheme defines a unique mapping of syntax element values to sequences of binary decisions, so-called bins, which can also be interpreted in terms of a binary code tree.
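As a concrete example of binarization, the unary scheme maps a value n to n ones followed by a terminating zero; this is one of the elementary binarization schemes used in CABAC (truncated unary and Exp-Golomb variants are also used). The sketch below is illustrative and does not reproduce the standard's full binarization process.

```python
# Illustrative unary binarization: value n maps to n ones followed by a
# terminating zero, forming the bin string passed to context modeling
# and binary arithmetic coding.

def unary_binarize(value):
    return [1] * value + [0]

def unary_debinarize(bins):
    value = 0
    for b in bins:
        if b == 0:  # terminating zero reached
            return value
        value += 1
    raise ValueError("bin string not terminated")
```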
  • By decomposing each syntax element value into a sequence of bins, further processing of each bin value in CABAC depends on the associated coding-mode decision which can be either the regular or the bypass mode. Bypass mode is typically used for bins that are assumed to be uniformly distributed. In the regular coding mode, each bin value is encoded by using the regular binary arithmetic-coding engine, where the associated probability model is either determined by a fixed choice, without any context modeling, or adaptively chosen depending on a related context model. Context models are identified by a context index that is selected from 460 possible values (except High 4:4:4 Intra and High 4:4:4 Predictive profiles). Further, each context model is determined by two parameters, a probability state index (pStateIdx) representing the current estimate of the probability of the least probable symbol (LPS) and the binary value of the current most probable symbol (MPS). These two parameters are referred to as context state variables or context variables. Default initial values defined in the H.264 standard are used to initialize the context variables for a context model and the variable values are updated after each bin is encoded.
  • For bypass mode, complexity of the arithmetic coding is significantly reduced. For regular arithmetic coding, encoding of the given bin value depends on the actual state of the associated adaptive probability model that is passed along with the bin value to the multiplication-free Modulo (M) coder, which is a table-based binary arithmetic coding engine used in CABAC. Probability estimation in CABAC is based on a table-driven estimator in which each probability model can take one of 64 different states with associated probability values p ranging in the interval 0.0-0.5. The distinction between the least probable symbol (LPS) and the most probable symbol (MPS) allows each state to be specified by means of the corresponding LPS-related probability pLPS.
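The table-driven estimator can be pictured as a 64-state machine in which the LPS probability decays roughly geometrically with the state index: coding an MPS advances the state (LPS becomes less probable), while coding an LPS retreats the state and may flip the MPS at the boundary. The sketch below approximates the shape of this behavior; the transition steps and the decay factor are illustrative assumptions, not the transition tables of the H.264 standard.

```python
# Hypothetical sketch of CABAC's table-driven probability estimation.
# The simple +1/-2 state transitions approximate, but do not reproduce,
# the standard's transIdxMPS/transIdxLPS tables.

N_STATES = 64

def update_context(state, mps, bin_value):
    if bin_value == mps:                      # most probable symbol coded
        state = min(state + 1, N_STATES - 1)  # LPS becomes less probable
    else:                                     # least probable symbol coded
        if state == 0:
            mps = 1 - mps                     # flip the MPS at the boundary
        else:
            state = max(state - 2, 0)         # LPS becomes more probable
    return state, mps

def lps_probability(state, p_max=0.5, alpha=0.95):
    # Geometric decay of the LPS probability across the 64 states,
    # mirroring the shape (not the exact values) of the H.264 table.
    return p_max * alpha ** state
```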
  • The use of CABAC can create a performance bottleneck for hardware decoders faced with high-definition (HD) video requirements. One possible solution to the CABAC throughput problem is to structure the output of a video encoder to create data partitions that are independently decodable by a CABAC decoder in a video decoder. These data partitions could then be decoded in parallel using multiple processors. One suggested partitioning approach for next generation video coding, referred to as entropy slices, is described in “Parallel Entropy Decoding for High Resolution Video Coding,” Jie Zhao and Andrew Segall, SPIE Vol. 7257, 725706-1-725706-11, January 2009 (“Zhao”).
  • In general, an entropy slice as described in Zhao is similar to the slice concept used in H.264 but it is only applied to entropy coding and decoding. The Zhao entropy slice includes a sequence of entropy encoded macroblocks. Syntax, i.e., an entropy header, is included in the output bit stream of the entropy encoder to identify the start of each entropy slice. Further, each entropy slice is defined such that it can be decoded by a CABAC entropy decoder independent of other entropy slices. Other specific features of the Zhao entropy slice are that in the CABAC entropy decoder, all context models are reinitialized to their initial default states at the beginning of each entropy slice. Further, during decoding, the context state is only updated within an entropy slice, and context model updates are not made across entropy slice boundaries. In addition, macroblocks in other entropy slices are marked as unavailable for the purpose of entropy decoding.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • Particular embodiments in accordance with the invention will now be described, by way of example only, and with reference to the accompanying drawings:
  • FIG. 1 shows a block diagram of a video encoding/decoding system in accordance with one or more embodiments of the invention;
  • FIG. 2 shows a block diagram of a video encoder in accordance with one or more embodiments of the invention;
  • FIG. 3 shows a block diagram of a video decoder in accordance with one or more embodiments of the invention;
  • FIG. 4 shows a flow diagram of a method of CABAC encoding in accordance with one or more embodiments of the invention;
  • FIGS. 5 and 6 show flow diagrams of methods of parallel CABAC decoding in accordance with one or more embodiments of the invention; and
  • FIGS. 7-9 show block diagrams of illustrative digital systems in accordance with one or more embodiments of the invention.
  • DETAILED DESCRIPTION OF EMBODIMENTS OF THE INVENTION
  • Specific embodiments of the invention will now be described in detail with reference to the accompanying figures. Like elements in the various figures are denoted by like reference numerals for consistency.
  • Certain terms are used throughout the following description and the claims to refer to particular system components. As one skilled in the art will appreciate, components in digital systems may be referred to by different names and/or may be combined in ways not shown herein without departing from the described functionality. This document does not intend to distinguish between components that differ in name but not function. In the following discussion and in the claims, the terms “including” and “comprising” are used in an open-ended fashion, and thus should be interpreted to mean “including, but not limited to . . . .” Also, the term “couple” and derivatives thereof are intended to mean an indirect, direct, optical, and/or wireless electrical connection. Thus, if a first device couples to a second device, that connection may be through a direct electrical connection, through an indirect electrical connection via other devices and connections, through an optical electrical connection, and/or through a wireless electrical connection.
  • In the following detailed description of embodiments of the invention, numerous specific details are set forth in order to provide a more thorough understanding of the invention. However, it will be apparent to one of ordinary skill in the art that the invention may be practiced without these specific details. In other instances, well-known features have not been described in detail to avoid unnecessarily complicating the description. In addition, although method steps may be presented and described herein in a sequential fashion, one or more of the steps shown and described may be omitted, repeated, performed concurrently, and/or performed in a different order than the order shown in the figures and/or described herein. Accordingly, embodiments of the invention should not be considered limited to the specific ordering of steps shown in the figures and/or described herein. Further, while various embodiments of the invention are described herein in accordance with the H.264 video coding standard, embodiments for other video coding standards will be understood by one of ordinary skill in the art. Accordingly, embodiments of the invention should not be considered limited to the H.264 video coding standard.
  • In the description herein, some terminology is used that is specifically defined in the H.264 video coding standard and/or is well understood by those of ordinary skill in the art in CABAC coding. Definitions of these terms are not provided in the interest of brevity. Further, this terminology is used for convenience of explanation and should not be considered as limiting embodiments of the invention to the H.264 standard. One of ordinary skill in the art will appreciate that different terminology may be used in other video encoding standards without departing from the described functionality.
• As previously mentioned, the use of entropy slices has been suggested for enabling parallel entropy decoding in video decoders. However, some studies have shown that using entropy slices as described in Zhao may cause a bit rate increase in video encoding under some circumstances, diminishing the desirability of using CABAC over other, less complex variable length coding (VLC) techniques. There are two primary reasons for the observed bit rate increase. The first reason is the requirement that the context model probability states used in CABAC be reset to their initial, default states at the beginning of each entropy slice. As the entropy slice size is decreased, the resets of the context models occur more frequently. Frequent resets of the context models to their default states reduce probability model accuracy and thus reduce compression efficiency. The second reason is that the selection of context models for some syntax elements in CABAC, such as motion vector difference, is improved by using information from neighboring macroblocks. The top neighbors of blocks in the upper row of a slice are not available. As the size of an entropy slice is reduced, the percentage of blocks in the top row increases, thus reducing the accuracy of the context models that rely on information from neighboring macroblocks.
• Embodiments of the invention provide CABAC encoding and decoding based on entropy slices that may reduce the observed bit rate increases attributed to the above two reasons. More specifically, in some embodiments of the invention, additional information for context model initialization is included in the entropy header of an entropy slice by the CABAC entropy encoder. As is described in more detail below, a CABAC entropy decoder can use this additional information to initialize the state of selected context models rather than resetting the context models to their default initial states. Further, in some embodiments of the invention, the parallel decoding of entropy slices is structured such that information from previously decoded entropy slices may be used to estimate the initial context states for context models in subsequent entropy slices.
• FIG. 1 shows a block diagram of a video encoding/decoding system in accordance with one or more embodiments of the invention. The video encoding/decoding system performs encoding and decoding of digital video sequences using methods for CABAC entropy encoding and decoding as described herein. The system includes a source digital system (100) that transmits encoded video sequences to a destination digital system (102) via a communication channel (116). The source digital system (100) includes a video capture component (104), a video encoder component (106), and a transmitter component (108). The video capture component (104) is configured to provide a video sequence to be encoded by the video encoder component (106). The video capture component (104) may be, for example, a video camera, a video archive, or a video feed from a video content provider. The video capture component (104) may generate computer graphics as the video sequence, or a combination of live video and computer-generated video.
  • The video encoder component (106) receives a video sequence from the video capture component (104) and encodes it for transmission by the transmitter component (108). In general, the video encoder component (106) receives the video sequence from the video capture component (104) as a sequence of video frames, divides the frames into coding units which may be a whole frame or a slice of a frame, divides the coding units into blocks of pixels, and encodes the video data in the coding units based on these blocks. The video encoder (106) includes functionality to perform one or more embodiments of methods for CABAC entropy encoding as described herein.
  • The transmitter component (108) transmits the encoded video data to the destination digital system (102) via the communication channel (116). The communication channel (116) may be any communication medium, or combination of communication media suitable for transmission of the encoded video sequence, such as, for example, wired or wireless communication media, a local area network, or a wide area network. The video capture and encoding may take place at a different location and time than the transmission. For example, television programs and movies may be produced, encoded and stored on a disc or other storage devices. The stored movie or program may then be transmitted at a later time.
  • The destination digital system (102) includes a receiver component (110), a video decoder component (112) and a display component (114). The receiver component (110) receives the encoded video data from the source digital system (100) via the communication channel (116) and provides the encoded video data to the video decoder component (112) for decoding. In general, the video decoder component (112) reverses the encoding process performed by the video encoder component (106) to reconstruct the frames of the video sequence. The video decoder component (112) includes functionality to perform one or more embodiments of methods for CABAC entropy decoding as described herein. The reconstructed video sequence may then be displayed on the display component (114). The display component (114) may be any suitable display device such as, for example, a plasma display, a liquid crystal display (LCD), a light emitting diode (LED) display, etc.
• In some embodiments of the invention, the source digital system (100) may also include a receiver component and a video decoder component and/or the destination digital system (102) may include a transmitter component and a video encoder component for transmission of video sequences in both directions for video streaming, video broadcasting, and video telephony. Further, the video encoder component (106) and the video decoder component (112) may perform encoding and decoding in accordance with a video compression standard such as, for example, the Moving Picture Experts Group (MPEG) video compression standards, e.g., MPEG-1, MPEG-2, and MPEG-4, the ITU-T video compression standards, e.g., H.263 and H.264, the Society of Motion Picture and Television Engineers (SMPTE) 421M video CODEC standard (commonly referred to as “VC-1”), the video compression standard defined by the Audio Video Coding Standard Workgroup of China (commonly referred to as “AVS”), etc. The video encoder component (106) and the video decoder component (112) may be implemented in any suitable combination of software, firmware, and hardware, such as, for example, one or more digital signal processors (DSPs), microprocessors, discrete logic, application specific integrated circuits (ASICs), etc.
  • FIG. 2 shows a block diagram of a video encoder, e.g., the video encoder (106), in accordance with one or more embodiments of the invention. In the video encoder of FIG. 2, input frames (200) for encoding are provided as one input of a motion estimation component (220), as one input of an intraframe prediction component (224), and to a positive input of a combiner (202) (e.g., adder or subtractor or the like). The frame storage component (218) provides reference data to the motion estimation component (220) and to the motion compensation component (222). The reference data may include one or more previously encoded and decoded frames. The motion estimation component (220) provides motion estimation information to the motion compensation component (222) and the entropy encoders (234). More specifically, the motion estimation component (220) performs tests based on the prediction modes to choose the best motion vector(s)/prediction mode. The motion estimation component (220) provides the selected motion vector (MV) or vectors and the selected prediction mode to the motion compensation component (222) and the selected motion vector (MV) to the entropy encoder component (234).
• The motion compensation component (222) provides motion compensated prediction information to a selector switch (226) that includes motion compensated interframe prediction macroblocks (MBs). The intraframe prediction component (224) also provides intraframe prediction information to the switch (226) that includes intraframe prediction MBs and a prediction mode. That is, similar to the motion estimation component (220), the intraframe prediction component (224) performs tests based on prediction modes to choose the best prediction mode for generating the intraframe prediction MBs.
  • The switch (226) selects between the motion-compensated interframe prediction MBs from the motion compensation component (222) and the intraframe prediction MBs from the intraprediction component (224) based on the selected prediction mode. The output of the switch (226) (i.e., the selected prediction MB) is provided to a negative input of the combiner (202) and to a delay component (230). The output of the delay component (230) is provided to another combiner (i.e., an adder) (238). The combiner (202) subtracts the selected prediction MB from the current MB of the current input frame to provide a residual MB to the transform component (204). The resulting residual MB is a set of pixel difference values that quantify differences between pixel values of the original MB and the prediction MB. The transform component (204) performs a block transform such as DCT, on the residual MB to convert the residual pixel values to transform coefficients and outputs the transform coefficients.
• The transform coefficients are provided to a quantization component (206), which outputs quantized transform coefficients. Because the DCT transform redistributes the energy of the residual signal into the frequency domain, a scan component (208) takes the quantized transform coefficients out of their raster-scan ordering and arranges them by significance, generally beginning with the more significant coefficients and ending with the less significant. The ordered quantized transform coefficients are provided via the scan component (208) to the entropy encoder component (234).
• The entropy encoder component (234) performs entropy encoding on encoded macroblocks to generate a compressed bit stream (236) for transmission or storage. The entropy encoder component (234) may include functionality to perform one or more of any suitable entropy encoding techniques, such as, for example, context adaptive variable length coding (CAVLC), context adaptive binary arithmetic coding (CABAC), run length coding, etc. In one or more embodiments of the invention, the entropy encoder component (234) includes functionality to perform one or more embodiments of methods for CABAC entropy encoding as described herein.
  • Inside every encoder is an embedded decoder. As any compliant decoder is expected to reconstruct an image from a compressed bit stream, the embedded decoder provides the same utility to the video encoder. Knowledge of the reconstructed input allows the video encoder to transmit the appropriate residual energy to compose subsequent frames. To determine the reconstructed input, the ordered quantized transform coefficients provided via the scan component (208) are returned to their original post-DCT arrangement by an inverse scan component (210), the output of which is provided to a dequantize component (212), which outputs estimated transformed information, i.e., an estimated or reconstructed version of the transform result from the transform component (204). The estimated transformed information is provided to the inverse transform component (214), which outputs estimated residual information which represents a reconstructed version of the residual MB. The reconstructed residual MB is provided to the combiner (238). The combiner (238) adds the delayed selected predicted MB to the reconstructed residual MB to generate an unfiltered reconstructed MB, which becomes part of reconstructed frame information. The reconstructed frame information is provided via a buffer (228) to the intraframe prediction component (224) and to a filter component (216). The filter component (216) is a deblocking filter which filters the reconstructed frame information and provides filtered reconstructed frames to frame storage component (218).
• FIG. 3 shows a block diagram of a video decoder, e.g., the video decoder (112), in accordance with one or more embodiments of the invention. In the video decoder of FIG. 3, the entropy decoding component (300) receives an entropy encoded video bit stream and reverses the entropy encoding to recover the encoded macroblocks. The entropy decoding performed by the entropy decoder component (300) may include functionality to perform one or more of any suitable entropy decoding techniques, such as, for example, context adaptive variable length decoding (CAVLC), context adaptive binary arithmetic decoding (CABAC), run length decoding, etc. In one or more embodiments of the invention, the entropy decoder component (300) includes functionality to perform one or more embodiments of methods for CABAC entropy decoding as described herein.
  • The inverse scan and dequantization component (302) assembles the macroblocks in the video bit stream in raster scan order and substantially recovers the original frequency domain data. The inverse transform component (304) transforms the frequency domain data from inverse scan and dequantization component (302) back to the spatial domain. This spatial domain data supplies one input of the addition component (306). The other input of addition component (306) comes from the macroblock mode switch (308). When inter-prediction mode is signaled in the encoded video stream, the macroblock mode switch (308) selects the output of the motion compensation component (310). The motion compensation component (310) receives reference frames from frame storage (312) and applies the motion compensation computed by the encoder and transmitted in the encoded video bit stream. When intra-prediction mode is signaled in the encoded video stream, the macroblock mode switch (308) selects the output of the intra-prediction component (314). The intra-prediction component (314) applies the intra-prediction computed by the encoder and transmitted in the encoded video bit stream.
  • The addition component (306) recovers the predicted frame. The output of addition component (306) supplies the input of the deblocking filter component (316). The deblocking filter component (316) smoothes artifacts created by the block and macroblock nature of the encoding process to improve the visual quality of the decoded frame. In one or more embodiments of the invention, the deblocking filter component (316) applies a macroblock-based loop filter for regular decoding to maximize performance and applies a frame-based loop filter for frames encoded using flexible macroblock ordering (FMO) and for frames encoded using arbitrary slice order (ASO). The macroblock-based loop filter is performed after each macroblock is decoded, while the frame-based loop filter delays filtering until all macroblocks in the frame have been decoded.
  • More specifically, because a deblocking filter processes pixels across macroblock boundaries, the neighboring macroblocks are decoded before the filtering is applied. In some embodiments of the invention, performing the loop filter as each macroblock is decoded has the advantage of processing the pixels while they are in on-chip memory, rather than writing out pixels and reading them back in later, which consumes more power and adds delay. However, if macroblocks are decoded out of order, as with FMO or ASO, the pixels from neighboring macroblocks may not be available when the macroblock is decoded; in this case, macroblock-based loop filtering cannot be performed. For FMO or ASO, the loop filtering is delayed until after all macroblocks are decoded for the frame, and the pixels must be reread in a second pass to perform frame-based loop filtering. The output of the deblocking filter component (316) is the decoded frames of the video bit stream. Each decoded frame is stored in frame storage (312) to be used as a reference frame.
• FIG. 4 shows a flow diagram of a method of CABAC entropy encoding in accordance with one or more embodiments of the invention. More specifically, the method illustrates the generation of entropy slices as a part of CABAC entropy encoding. In general, slices of video data are partitioned into one or more entropy slices during CABAC entropy encoding of the syntax element values of the slices. As is well known in the art, a slice may be a subset of macroblocks in a picture (frame) or may be the entire picture. As shown in FIG. 4, a syntax element value of a slice is entropy encoded using CABAC (400) to generate a bin string representing the syntax element. Consecutive syntax element values are encoded until either sufficient syntax elements have been encoded to fulfill the size criterion for an entropy slice or the last syntax element value in a slice is encoded (402). The size of an entropy slice may be set, for example, as some number of macroblocks or a maximum number of bits.
  • When sufficient syntax element values have been encoded (or the last syntax element value in a slice has been encoded), an entropy slice header is generated for the entropy slice (404). The entropy slice header includes information identifying the header as that of an entropy slice and context model initialization information to be used by a CABAC entropy decoder to initialize context models prior to decoding the entropy slice. The context model initialization information may be, for example, initial values for context variables or parameters that can be used to calculate the initial values for context variables for selected context models. The generation of the context model initialization information is described in more detail below. The entropy slice header and the entropy encoded syntax element values are then output in a bit stream (406). The process is repeated until all syntax element values in the slice are entropy encoded and included in an entropy slice (408).
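• For illustration only, the entropy slice generation loop of FIG. 4 (steps 400-408) may be sketched as follows. The toy CABAC engine, the header fields, and the size criterion are hypothetical stand-ins introduced for this sketch and do not reflect any particular embodiment:

```python
class ToyCabac:
    """Hypothetical stand-in for a real CABAC engine (illustration only)."""
    def __init__(self):
        # single 7-bit context state, default mid-point (assumption)
        self.contexts = {"coeff_level": 64}

    def encode(self, element):
        # a real engine would update context states per encoded bin
        self.contexts["coeff_level"] = (self.contexts["coeff_level"] + 1) % 128
        return f"bins({element})"

    def snapshot_selected_contexts(self):
        # context model initialization information for the entropy slice header
        return dict(self.contexts)

def encode_slice_as_entropy_slices(syntax_elements, cabac, max_elements=4):
    """Partition a slice's syntax elements into entropy slices (steps 400-408)."""
    bitstream = []
    pending = []  # entropy-encoded bin strings for the current entropy slice
    for i, element in enumerate(syntax_elements):
        pending.append(cabac.encode(element))              # step 400
        slice_full = len(pending) >= max_elements          # step 402
        last_element = i == len(syntax_elements) - 1
        if slice_full or last_element:
            header = {                                     # step 404
                "is_entropy_slice": True,
                "context_init_info": cabac.snapshot_selected_contexts(),
            }
            bitstream.append((header, pending))            # step 406
            pending = []
    return bitstream
```

The loop emits one (header, payload) pair per entropy slice, repeating (408) until the slice's syntax element values are exhausted.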
  • In one or more embodiments of the invention, the context model initialization information included in the entropy slice header is initialization information for a subset of the context models. The size of the subset may be set to be very small compared to the entire context space and may be chosen as a tradeoff between providing more initialization information to improve the throughput of the entropy decoder and potential bit rate increase due to including initialization information for more context models. The selection of which context models to include in the subset may be based on which of the context models is most frequently used. In other words, context initialization information for the more frequently used context models will be included in the subset in preference to context initialization information for less frequently used context models. For example, coefficient level syntax element bin strings account for a large portion of video bit streams. Therefore, context models corresponding to coefficient levels will be used most frequently in decoding of entropy slices and the corresponding context models may be given preference for inclusion in the subset of context models for which context initialization information is provided in an entropy slice header.
  • In some embodiments of the invention, the context models included in the subset are statically defined. More specifically, when an entropy slice header is generated, the context model initialization information for a fixed, predefined subset of context models is included in the header. The fixed, predefined subset of context models may vary by slice type (I-slice, B-slice, P-slice). For example, the subset of context models for a B-slice or a P-slice may include context models related to motion vectors while the subset for an I-slice may not. The optimal context subsets may be determined empirically by statistical analysis of video sequences at various resolutions, bit rates, and configurations.
• In some embodiments of the invention, the context models included in the subset are dynamically selected. For example, the entropy encoder may keep track of how frequently each of the possible context models is used while encoding the syntax element values for an entropy slice. The most frequently used context models (based on a threshold) may then be selected for inclusion in the subset of context models for which context initialization information is provided in the entropy slice header. In another example, context models in which the context variable values have deviated significantly (based on a threshold) from their initial default values during the encoding of the syntax element values in the entropy slice may be selected for inclusion in the subset. The maximum size of the subset may be fixed even though the actual context models to be included in the subset may change dynamically. When the context models in the subset are dynamically selected, the context indices for each of the selected context models are included in the entropy slice header along with the context model initialization information.
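• The dynamic selection described above may be sketched as follows. The thresholds, the 7-bit state representation, and all function names are illustrative assumptions, not part of any embodiment:

```python
DEFAULT_STATE = 64  # assumed 7-bit mid-point default for every context model

def select_context_subset(usage_counts, states,
                          use_threshold=10, drift_threshold=8, max_subset=4):
    """Pick (context_index, state) pairs worth signaling in the header.

    A model qualifies if it was used frequently or if its state drifted far
    from the default; the most frequently used models are preferred and the
    subset is capped at a fixed maximum size.
    """
    candidates = []
    for ctx_idx, count in usage_counts.items():
        drift = abs(states[ctx_idx] - DEFAULT_STATE)
        if count >= use_threshold or drift >= drift_threshold:
            candidates.append((count, drift, ctx_idx))
    # prefer the most frequently used models, capped at the subset size limit
    candidates.sort(reverse=True)
    return [(ctx_idx, states[ctx_idx]) for _, _, ctx_idx in candidates[:max_subset]]
```

Because the subset is dynamic, a real encoder would also write the selected context indices into the entropy slice header, as the paragraph above notes.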
  • In one or more embodiments of the invention, the context models included in the subset may be both statically and dynamically defined. For example, a few context models may be always included in the subset while others may be dynamically selected.
  • In one or more embodiments of the invention, the context initialization information included in the entropy slice header may be compressed. For example, the context variable values for a context model may be defined with a fixed number of bits, e.g., seven bits as in H.264. For each context model included in the subset, the difference between the fixed bit value of the default initial context variable values defined for the context model and the fixed bit value of the context variable values after encoding of syntax element values in an entropy slice may be computed and only the difference included in the entropy slice header. In some embodiments of the invention, the difference values are also quantized and compressed using any appropriate known simple entropy encoding technique. As was previously described, if the context models included in the subset are dynamically selected, the context indices for the selected context models are also included in the entropy slice header. The simple entropy encoding technique may also be used to compress the context indices.
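• The difference coding of context initialization information described above might be sketched as follows, assuming 7-bit context states and a hypothetical quantization step (further entropy coding of the deltas is omitted):

```python
def encode_context_deltas(default_states, current_states, q_step=2):
    """Map each context state to a quantized delta from its default value.

    Signaling only the (quantized) difference keeps the header small;
    q_step=2 is an illustrative assumption, not a standardized value.
    """
    return {ctx: (current_states[ctx] - default_states[ctx]) // q_step
            for ctx in current_states}

def decode_context_deltas(default_states, deltas, q_step=2):
    """Reconstruct approximate states from the signaled deltas."""
    return {ctx: default_states[ctx] + d * q_step for ctx, d in deltas.items()}
```

Quantization makes the reconstruction approximate; the decoder's initialized state may differ from the encoder's final state by up to roughly one quantization step.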
  • FIG. 5 shows a flow diagram of a method of parallel CABAC entropy decoding in accordance with one or more embodiments of the invention. More specifically, the method illustrates the parallel decoding of entropy slices in a slice as a part of CABAC entropy decoding. In this method, the entropy slice headers include context model initialization information. The method as depicted in FIG. 5 assumes that some number N of entropy slices may be decoded in parallel. The value of the number N depends on, among other things, the processing capabilities of a digital system, i.e., the number of processing units available for parallel processing, on which the method is implemented.
  • As shown in FIG. 5, a bit stream of encoded video data is parsed to identify entropy slice headers in a slice (500). In one or more embodiments of the invention, as each entropy slice header is identified, the decoding of the entropy slice is delegated to one of the N processing units. For example, the delegation may be to an idle processing unit and/or the delegation may take the form of enqueuing the entropy slice for decoding on the next available processing unit. Accordingly, an initial entropy slice in the bit stream can be delegated to a processing unit for decoding and the decoding initiated before the slice header of the next entropy slice is identified and the associated entropy slice is delegated to another processing unit for decoding.
  • Each identified entropy slice is then entropy decoded. For each of the entropy slices, context models are initialized using the context model initialization information in the respective entropy slice header (502 a, 502 b, 502 c). More specifically, the context model initialization information in the entropy slice header is parsed and used to initialize corresponding context models.
  • In some embodiments of the invention, the context model initialization information in the entropy slice header is for a statically defined subset of the context models. In some embodiments of the invention, the context model initialization information in the entropy slice header is for a subset of the context models dynamically selected by the entropy encoder. In such embodiments, the respective entropy slice headers are parsed to determine the context indices of the context models for which context model initialization information is provided. Further, in one or more embodiments of the invention, the context model initialization information and context indices, if present, are compressed as previously described. In such embodiments, the context model initialization information and context indices, if present, are decompressed.
  • In one or more embodiments of the invention, context models that were not included in the subset of context models for which context model initialization information is provided in the entropy slice header are reset to default initial states. In some embodiments of the invention, estimates of context model initialization information of such context models are generated based on the context model initialization information included in the entropy slice header. The estimates may be generated, for example, based on identified correlations between the evolution of probability models in different contexts. Such correlations may be characterized empirically by analysis of CABAC encoding of video sequences at various resolutions, bit rates, and configurations.
  • After initialization of the context models, the entropy encoded syntax element values in the respective entropy slices are decoded (504 a, 504 b, 504 c) and the decoded syntax element values are output (506 a, 506 b, 506 c). The process is repeated until all entropy slices in the slice are decoded (508).
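• The parallel decoding of FIG. 5 might be sketched with a thread pool standing in for the N processing units. The header format, the stand-in decode routine, and all names below are illustrative placeholders:

```python
from concurrent.futures import ThreadPoolExecutor

N_PROCESSING_UNITS = 3  # assumption: N parallel entropy decoders available

def decode_entropy_slice(header, bins):
    """Initialize contexts from the header (502), then decode the bins (504)."""
    contexts = dict(header["context_init_info"])  # init from header, not defaults
    # a real CABAC decoder would consume `bins` while updating `contexts`;
    # this stand-in just tags each bin string as decoded
    return [f"decoded({b})" for b in bins]

def decode_slice(entropy_slices):
    """Delegate each identified entropy slice (500) to a processing unit."""
    with ThreadPoolExecutor(max_workers=N_PROCESSING_UNITS) as pool:
        futures = [pool.submit(decode_entropy_slice, hdr, bins)
                   for hdr, bins in entropy_slices]
        return [f.result() for f in futures]  # decoded outputs (506)
```

Collecting the futures in submission order preserves the bit stream order of the decoded syntax element values even though the slices decode concurrently.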
  • FIG. 6 shows a flow diagram of a method of parallel CABAC entropy decoding in accordance with one or more embodiments of the invention. More specifically, the method illustrates the parallel decoding of entropy slices in a slice as a part of CABAC entropy decoding. In some embodiments of the invention, the entropy slice headers do not include context model initialization information as previously described. In other embodiments of the invention, the entropy slice headers do include context model initialization information. The method assumes a limitation M on the maximum number of entropy slices that may be entropy decoded in parallel. This limitation may be set by the video encoder that generated the bit stream to be decoded and communicated in the bit stream.
  • Limiting the maximum number of entropy slices decoded in parallel to M ensures that entropy slices 1 . . . x and an entropy slice x+M are never decoded in parallel. In such a case, an entropy slice x+M can be allowed to have some dependency on one or more of the entropy slices 1 . . . x. For example, the context models for entropy slice x+M can be initialized with estimated values based on the final states of the context models from one or more, i.e., a subset, of entropy slices 1 . . . x. The subset of the M entropy slices on which the context initialization of an entropy slice may depend is referred to as the reference entropy slice subset. In one or more embodiments of the invention, in addition to communicating the maximum number M of entropy slices that may be entropy decoded in parallel, the video encoder also identifies in the entropy slice header of an entropy slice x+M which of the 1 . . . x preceding entropy slices are included in the reference entropy slice subset.
• The method as depicted in FIG. 6 assumes that N entropy slices may be decoded in parallel. The value of the number N depends on, among other things, the processing capabilities of a digital system, i.e., the number of processing units available for parallel processing, on which the method is implemented. In addition, the value of N is less than or equal to M. That is, even if the digital system has more than M processing units, only M entropy slices are permitted to be decoded in parallel.
  • As shown in FIG. 6, a bit stream of encoded video data is parsed to identify entropy slice headers in a slice (600). In one or more embodiments of the invention, as each entropy slice header is identified, the decoding of the entropy slice is delegated to one of the N processing units. For example, the delegation may be to an idle processing unit and/or the delegation may take the form of enqueuing the entropy slice for decoding on the next available processing unit. The delegation may also be scheduled to ensure that the decoding of an entropy slice is not started until the entropy slices in the reference entropy slice subset of the entropy slice are decoded.
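• The scheduling constraint described above (decoding of an entropy slice is not started until the slices in its reference entropy slice subset are decoded) may be sketched as a simple dependency-ordering routine; the data shapes are illustrative assumptions:

```python
def schedule_order(num_slices, reference_subsets):
    """Return a decode order that respects reference-subset dependencies.

    reference_subsets maps an entropy slice index to the indices of the
    slices it depends on for context initialization (empty if absent).
    """
    done, order = set(), []
    pending = list(range(num_slices))
    while pending:
        for s in pending:
            # a slice is ready once all of its reference slices are decoded
            if set(reference_subsets.get(s, ())) <= done:
                order.append(s)
                done.add(s)
                pending.remove(s)
                break
        else:
            raise ValueError("circular dependency among entropy slices")
    return order
```

In a real decoder this ordering would drive the delegation to processing units; the sketch only computes a feasible sequential order.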
  • Each identified entropy slice is then entropy decoded. To entropy decode an entropy slice, the context models are initialized (602 a, 602 b, 602 c). If the entropy slice is not one of the first M entropy slices in the bit stream, the context models are initialized using context model initialization information estimated from the final states of the context models of the entropy slices in the reference entropy slice subset of the entropy slice. The context model initialization information may be estimated, for example, by using the weighted combination of the probability models resulting from decoding the entropy slices in the reference entropy slice subset. Combination weights can be calculated, for example, using the spatial pixel distance in a picture between an entropy slice and the reference entropy slices. If the entropy slice is one of the first M entropy slices, the context models are initialized in some other manner. In some embodiments of the invention, the context models for an entropy slice in the first M entropy slices are initialized using default initialization values. In some embodiments of the invention, the context models for an entropy slice in the first M entropy slices are initialized using context model initialization information from the respective entropy slice headers as previously described.
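• The weighted-combination estimate described above might be sketched as follows, assuming inverse-spatial-distance weights and integer context states; the weighting rule is an illustrative assumption:

```python
def estimate_initial_contexts(reference_final_states, distances):
    """Estimate initial context states for an entropy slice x+M.

    reference_final_states: final context states of each reference entropy
    slice (list of dicts); distances: spatial pixel distance from the current
    entropy slice to each reference slice (closer slices weigh more).
    """
    weights = [1.0 / (d + 1) for d in distances]  # inverse-distance weights
    total = sum(weights)
    contexts = {}
    for ctx in reference_final_states[0]:
        acc = sum(w * states[ctx]
                  for w, states in zip(weights, reference_final_states))
        contexts[ctx] = round(acc / total)  # back to an integer state
    return contexts
```

With two reference slices at distances 0 and 1, the nearer slice's state contributes twice the weight of the farther one, matching the intuition that spatially closer slices have more similar statistics.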
• After initialization of the context models, the entropy encoded syntax element values in each entropy slice are decoded (604 a, 604 b, 604 c) and the decoded syntax element values are output (606 a, 606 b, 606 c). The process is repeated until all entropy slices in the slice are decoded (608).
• FIG. 7 shows a digital system (700) (e.g., a personal computer) that includes a processor (702), associated memory (704), a storage device (706), and numerous other elements and functionalities typical of digital systems (not shown). In one or more embodiments of the invention, a digital system may include multiple processors and/or one or more of the processors may be digital signal processors. The digital system (700) may also include input means, such as a keyboard (708) and a mouse (710) (or other cursor control device), and output means, such as a monitor (712) (or other display device). The digital system (700) may also include an image capture device (not shown) that includes circuitry (e.g., optics, a sensor, readout electronics) for capturing video sequences. The digital system (700) may be connected to a network (714) (e.g., a local area network (LAN), a wide area network (WAN) such as the Internet, a cellular network, any other similar type of network and/or any combination thereof) via a network interface connection (720). Video image data may be received via the network. Those skilled in the art will appreciate that these input and output means may take other forms.
  • The digital system (700) may include a video encoder and/or video decoder with functionality to perform, respectively, embodiments of methods for CABAC encoding and parallel CABAC decoding as described herein. The video decoder may be configured to decode video image data received over a network or from storage media coupled to storage module 706. The digital system (700) may be further configured to display the decoded video data stream, such as a movie or other type of video images, on monitor 712.
  • Further, those skilled in the art will appreciate that one or more elements of the aforementioned digital system (700) may be located at a remote location and connected to the other elements over a network. Further, embodiments of the invention may be implemented on a distributed system having a plurality of nodes, where each portion of the system and software instructions may be located on a different node within the distributed system. In one embodiment of the invention, the node may be a digital system. Alternatively, the node may be a processor with associated physical memory. The node may alternatively be a processor with shared memory and/or resources.
  • Software instructions to perform embodiments of the invention may be stored on a computer readable medium such as a compact disc (CD), a diskette, a tape, a file, or any other computer readable storage device. The software instructions may be distributed to the digital system (700) via removable memory (e.g., floppy disk, optical disk, flash memory, USB key), via a transmission path, etc.
  • FIG. 8 is a block diagram of a digital system (e.g., a mobile cellular telephone) (800) that may be configured to perform the methods described herein. The signal processing unit (SPU) (802) includes a digital signal processor (DSP) system that includes embedded memory and security features. The analog baseband unit (804) receives a voice data stream from the handset microphone (813 a) and sends a voice data stream to the handset mono speaker (813 b). The analog baseband unit (804) also receives a voice data stream from the microphone (814 a) and sends a voice data stream to the mono headset (814 b). The analog baseband unit (804) and the SPU (802) may be separate ICs. In many embodiments, the analog baseband unit (804) does not embed a programmable processor core, but performs processing based on the configuration of audio paths, filters, gains, etc., set up by software running on the SPU (802). In some embodiments, the analog baseband processing is performed on the same processor and can send information to it for interaction with a user of the digital system (800) during call processing or other processing.
  • The display (820) may also display pictures and video streams received from the network, from a local camera (828), or from other sources such as the USB (826) or the memory (812). The SPU (802) may also send a video stream to the display (820) that is received from various sources such as the cellular network via the RF transceiver (806) or the camera (828). The SPU (802) may also send a video stream to an external video display unit via the encoder (822) over a composite output terminal (824). The encoder unit (822) may provide encoding according to PAL/SECAM/NTSC video standards.
  • The SPU (802) includes functionality to perform the computational operations required for video compression and decompression. The video compression standards supported may include, for example, one or more of the JPEG standards, the MPEG standards, and the H.26x standards. In one or more embodiments of the invention, the SPU (802) is configured to perform the computational operations of one or more of the methods described herein. Software instructions implementing aspects of the methods may be stored in the memory (812) and executed by the SPU (802) during encoding and decoding of video sequences.
  • FIG. 9 shows a digital system suitable for an embedded system (e.g., a digital camera) in accordance with one or more embodiments of the invention that includes, among other components, a DSP-based image coprocessor (ICP) (902), a RISC processor (904), and a video processing engine (VPE) (906) that may be configured to perform methods for digital image data compression and decompression described herein. The RISC processor (904) may be any suitably configured RISC processor. The VPE (906) includes a configurable video processing front-end (Video FE) (908) input interface used for video capture from imaging peripherals such as image sensors, video decoders, etc., a configurable video processing back-end (Video BE) (910) output interface used for display devices such as SDTV displays, digital LCD panels, HDTV video encoders, etc., and a memory interface (924) shared by the Video FE (908) and the Video BE (910). The digital system also includes peripheral interfaces (912) for various peripherals that may include a multi-media card, an audio serial port, a Universal Serial Bus (USB) controller, a serial port interface, etc.
  • The Video FE (908) includes an image signal processor (ISP) (916), and a 3A statistic generator (3A) (918). The ISP (916) provides an interface to image sensors and digital video sources. More specifically, the ISP (916) may accept raw image/video data from a sensor (CMOS or CCD) and can accept YUV video data in numerous formats. The ISP (916) also includes a parameterized image processing module with functionality to generate image data in a color format (e.g., RGB) from raw CCD/CMOS data. The ISP (916) is customizable for each sensor type and supports video frame rates for preview displays of captured digital images and for video recording modes. The ISP (916) also includes, among other functionality, an image resizer, statistics collection functionality, and a boundary signal calculator. The 3A module (918) includes functionality to support control loops for auto focus, auto white balance, and auto exposure by collecting metrics on the raw image data from the ISP (916) or external memory.
  • The Video BE (910) includes an on-screen display engine (OSD) (920) and a video analog encoder (VAC) (922). The OSD engine (920) includes functionality to manage display data in various formats for several different types of hardware display windows and it also handles gathering and blending of video data and display/bitmap data into a single display window before providing the data to the VAC (922) in YCbCr format. The VAC (922) includes functionality to take the display frame from the OSD engine (920) and format it into the desired output format and output signals required to interface to display devices. The VAC (922) may interface to composite NTSC/PAL video devices, S-Video devices, digital LCD devices, high-definition video encoders, DVI/HDMI devices, etc.
  • The memory interface (924) functions as the primary source and sink to modules in the Video FE (908) and the Video BE (910) that are requesting and/or transferring data to/from external memory. The memory interface (924) includes read and write buffers and arbitration logic.
  • The ICP (902) includes functionality to perform the computational operations required for video encoding and decoding and other processing of captured images. The video encoding standards supported may include one or more of the JPEG standards, the MPEG standards, and the H.26x standards. In one or more embodiments of the invention, the ICP (902) is configured to perform the computational operations of embodiments of the methods described herein.
  • In operation, to capture an image or video sequence, video signals are received by the video FE (908) and converted to the input format needed to perform video encoding. The video data generated by the video FE (908) is then stored in external memory. The video data is then encoded by a video encoder. As the video data is encoded, a method for CABAC entropy encoding as described herein is applied. The resulting encoded video data is stored in the external memory. The encoded video data may then be read from the external memory, decoded by a video decoder that implements a method for parallel CABAC entropy decoding as described herein, and post-processed by the video BE (910) to display the image/video sequence.
  • While the invention has been described with respect to a limited number of embodiments, those skilled in the art, having benefit of this disclosure, will appreciate that other embodiments can be devised which do not depart from the scope of the invention as disclosed herein. Accordingly, the scope of the invention should be limited only by the attached claims.
  • Embodiments of the decoders and methods described herein may be provided on any of several types of digital systems: digital signal processors (DSPs), general purpose programmable processors, application specific circuits, or systems on a chip (SoC) such as combinations of a DSP and a reduced instruction set (RISC) processor together with various specialized accelerators. A stored program in an onboard or external (flash EEP) ROM or FRAM may be used to implement aspects of the video signal processing. Analog-to-digital converters and digital-to-analog converters provide coupling to the real world, and modulators and demodulators (plus antennas for air interfaces) can provide coupling for waveform reception of video data being broadcast over the air by satellite, TV stations, cellular networks, etc., or via wired networks such as the Internet.
  • The techniques described in this disclosure may be implemented in hardware, software, firmware, or any combination thereof. If implemented in software, the software may be executed in one or more processors, such as a microprocessor, application specific integrated circuit (ASIC), field programmable gate array (FPGA), or digital signal processor (DSP). The software that executes the techniques may be initially stored in a computer-readable medium such as a compact disc (CD), a diskette, a tape, a file, memory, or any other computer readable storage device and loaded and executed in the processor. In some cases, the software may also be sold in a computer program product, which includes the computer-readable medium and packaging materials for the computer-readable medium. In some cases, the software instructions may be distributed via removable computer readable media (e.g., floppy disk, optical disk, flash memory, USB key), via a transmission path from computer readable media on another digital system, etc.
  • Embodiments of the methods and video decoders for performing parallel bin decoding as described herein may be implemented for virtually any type of digital system (e.g., a desk top computer, a laptop computer, a set-top box for satellite or cable, a handheld device such as a mobile (i.e., cellular) phone, a personal digital assistant, a digital camera, etc.) with functionality to decode digital video images.
  • It is therefore contemplated that the appended claims will cover any such modifications of the embodiments as fall within the true scope and spirit of the invention.

Claims (19)

1. A method of video encoding comprising:
performing context-adaptive binary arithmetic coding (CABAC) on a plurality of syntax element values in a slice to generate a plurality of entropy-encoded syntax element values;
generating an entropy slice header to identify the plurality of entropy-encoded syntax element values as an entropy slice, wherein the entropy slice header comprises context model initialization information; and
outputting the entropy slice header and the plurality of entropy encoded syntax element values.
2. The method of claim 1, wherein the context model initialization information comprises values for setting an initial state of a selected context model.
3. The method of claim 1, wherein the context model initialization information comprises values for setting an initial state of each of a subset of context models selected from a plurality of context models.
4. The method of claim 3, wherein at least one context model included in the subset is empirically selected.
5. The method of claim 3, wherein at least one context model in the subset is dynamically selected based on use of the context model during the CABAC coding of the plurality of syntax element values.
6. The method of claim 5, wherein a context index of the at least one context model in the subset is comprised in the entropy slice header.
7. The method of claim 3, wherein at least one context model in the subset is selected based on a type of the slice.
8. The method of claim 3, wherein generating an entropy slice header comprises compressing the values.
9. A method of video decoding comprising:
identifying a plurality of entropy slice headers in a slice; and
decoding entropy slices corresponding to the plurality of entropy slice headers in parallel, wherein each entropy slice comprises a plurality of entropy-encoded syntax element values encoded using context-adaptive binary arithmetic coding (CABAC), and wherein decoding each entropy slice comprises:
initializing a context model of a plurality of context models using context model initialization information comprised in the entropy slice header corresponding to the entropy slice;
decoding the plurality of entropy-encoded syntax element values comprised in the entropy slice to generate a plurality of syntax element values, wherein the initialized context model is used; and
outputting the plurality of syntax element values.
10. The method of claim 9, wherein the context model initialization information comprises values for setting an initial state of each of a subset of context models selected from the plurality of context models.
11. The method of claim 10, wherein at least one context model included in the subset was empirically selected.
12. The method of claim 10, wherein at least one context model in the subset was dynamically selected based on use of the context model during the CABAC encoding of the plurality of syntax element values.
13. The method of claim 12, wherein a context index of the at least one context model in the subset is comprised in the entropy slice header corresponding to the entropy slice.
14. The method of claim 10, wherein at least one context model in the subset was selected based on a type of the slice.
15. The method of claim 9, further comprising decompressing the context model initialization information.
16. A method for video decoding comprising:
identifying a plurality of entropy slice headers in a slice; and
decoding entropy slices corresponding to the plurality of entropy slice headers in parallel, wherein each entropy slice comprises a plurality of entropy-encoded syntax element values encoded using context-adaptive binary arithmetic coding (CABAC), and wherein decoding each entropy slice comprises:
estimating context model initialization information for a context model of a plurality of context models based on decoding of a reference entropy slice subset;
initializing the context model using the estimated context model initialization information;
decoding the plurality of entropy-encoded syntax element values comprised in the entropy slice to generate a plurality of syntax element values, wherein the initialized context model is used; and
outputting the plurality of syntax element values.
17. The method of claim 16, wherein the reference entropy slice subset is identified in the entropy slice header corresponding to the entropy slice.
18. The method of claim 16, wherein the reference entropy slice subset comprises at least one of M sequential entropy slices decoded prior to decoding the entropy slice, wherein M is a maximum number of entropy slices that can be decoded in parallel.
19. The method of claim 18, wherein the value of M is set by a video encoder that generates the plurality of entropy slices.
US12/572,854 2008-10-17 2009-10-02 Parallel CABAC Decoding Using Entropy Slices Abandoned US20100098155A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US12/572,854 US20100098155A1 (en) 2008-10-17 2009-10-02 Parallel CABAC Decoding Using Entropy Slices

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US10632308P 2008-10-17 2008-10-17
US12/572,854 US20100098155A1 (en) 2008-10-17 2009-10-02 Parallel CABAC Decoding Using Entropy Slices

Publications (1)

Publication Number Publication Date
US20100098155A1 true US20100098155A1 (en) 2010-04-22

Family

ID=42108234

Family Applications (3)

Application Number Title Priority Date Filing Date
US12/572,217 Active US7932843B2 (en) 2008-10-17 2009-10-01 Parallel CABAC decoding for video decompression
US12/572,854 Abandoned US20100098155A1 (en) 2008-10-17 2009-10-02 Parallel CABAC Decoding Using Entropy Slices
US12/580,876 Active US8068043B2 (en) 2008-10-17 2009-10-16 Method and apparatus for video processing in context-adaptive binary arithmetic coding

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US12/572,217 Active US7932843B2 (en) 2008-10-17 2009-10-01 Parallel CABAC decoding for video decompression

Family Applications After (1)

Application Number Title Priority Date Filing Date
US12/580,876 Active US8068043B2 (en) 2008-10-17 2009-10-16 Method and apparatus for video processing in context-adaptive binary arithmetic coding

Country Status (1)

Country Link
US (3) US7932843B2 (en)

Cited By (70)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090245349A1 (en) * 2008-03-28 2009-10-01 Jie Zhao Methods and Systems for Parallel Video Encoding and Decoding
US20100135416A1 (en) * 2008-12-03 2010-06-03 Yu-Wen Huang Method for performing parallel coding with ordered entropy slices, and associated apparatus
US20100322317A1 (en) * 2008-12-08 2010-12-23 Naoki Yoshimatsu Image decoding apparatus and image decoding method
US20110001643A1 (en) * 2009-06-30 2011-01-06 Massachusetts Institute Of Technology System and method for providing high throughput entropy coding using syntax element partitioning
US20110090954A1 (en) * 2009-10-21 2011-04-21 Cohen Robert A Video Codes with Directional Transforms
US20110228858A1 (en) * 2010-03-16 2011-09-22 Madhukar Budagavi CABAC Decoder with Decoupled Arithmetic Decoding and Inverse Binarization
US20110243226A1 (en) * 2010-04-05 2011-10-06 Samsung Electronics Co., Ltd. Low complexity entropy-encoding/decoding method and apparatus
CN102263948A (en) * 2010-05-27 2011-11-30 飞思卡尔半导体公司 Video processing system, computer program product and method for decoding an encoded video stream
US20120007992A1 (en) * 2010-07-08 2012-01-12 Texas Instruments Incorporated Method and Apparatus for Sub-Picture Based Raster Scanning Coding Order
US20120014431A1 (en) * 2010-07-14 2012-01-19 Jie Zhao Methods and Systems for Parallel Video Encoding and Parallel Video Decoding
US20120082218A1 (en) * 2010-10-01 2012-04-05 Kiran Misra Methods and Systems for Entropy Coder Initialization
US20120201294A1 (en) * 2009-10-14 2012-08-09 Segall Christopher A Methods for parallel video encoding and decoding
US20120224643A1 (en) * 2011-03-04 2012-09-06 Vixs Systems, Inc. Video decoder with pipeline processing and methods for use therewith
US20120230397A1 (en) * 2011-03-10 2012-09-13 Canon Kabushiki Kaisha Method and device for encoding image data, and method and device for decoding image data
US20120281754A1 (en) * 2010-01-06 2012-11-08 Sony Corporation Device and method for processing image
US8344917B2 (en) 2010-09-30 2013-01-01 Sharp Laboratories Of America, Inc. Methods and systems for context initialization in video coding and decoding
US20130003830A1 (en) * 2011-06-30 2013-01-03 Kiran Misra Context initialization based on decoder picture buffer
US20130003827A1 (en) * 2011-06-30 2013-01-03 Kiran Misra Context initialization based on decoder picture buffer
US20130058407A1 (en) * 2011-03-08 2013-03-07 Qualcomm Incorporated Coding of transform coefficients for video coding
EP2533538A3 (en) * 2011-06-10 2013-03-20 Research In Motion Limited Method and system to reduce modelling overhead for data compression
US20130083858A1 (en) * 2010-05-24 2013-04-04 Nec Corporation Video image delivery system, video image transmission device, video image delivery method, and video image delivery program
US20130101047A1 (en) * 2011-10-19 2013-04-25 Sony Corporation Context reduction of significance map coding of 4x4 and 8x8 transform coefficient in hm4.0
US20130114671A1 (en) * 2011-11-08 2013-05-09 Qualcomm Incorporated Context reduction for context adaptive binary arithmetic coding
US20130142253A1 (en) * 2011-02-24 2013-06-06 Hisao Sasai Arithmetic decoding method and arithmetic coding method
US20130188700A1 (en) * 2012-01-19 2013-07-25 Qualcomm Incorporated Context adaptive entropy coding with a reduced initialization value set
US20130202051A1 (en) * 2012-02-02 2013-08-08 Texas Instruments Incorporated Sub-Pictures for Pixel Rate Balancing on Multi-Core Platforms
EP2575366A3 (en) * 2011-09-27 2013-09-11 Broadcom Corporation Signaling of coding unit prediction and prediction unit partition mode for video coding
US20140016700A1 (en) * 2011-03-07 2014-01-16 Orange Method of coding and decoding images, coding and decoding device and computer programs corresponding thereto
CN103563375A (en) * 2011-06-27 2014-02-05 松下电器产业株式会社 Image decoding method, image encoding method, image decoding device, image encoding device, and image encoding/decoding device
CN103583048A (en) * 2011-06-30 2014-02-12 松下电器产业株式会社 Image decoding method, image encoding method, image decoding device, image encoding device, and image encoding/decoding device
CN103609113A (en) * 2011-06-23 2014-02-26 松下电器产业株式会社 Image decoding method, image encoding method, image decoding device, image encoding device, and image encoding/decoding device
US20140086325A1 (en) * 2012-09-27 2014-03-27 Qualcomm Incorporated Scalable extensions to hevc and temporal motion vector prediction
US20140105293A1 (en) * 2011-07-15 2014-04-17 Fraunhofer-Gesellschaft Zur Foerderung Der Angewandten Forschung E.V. Sample array coding for low-delay
WO2014107065A1 (en) * 2013-01-04 2014-07-10 삼성전자 주식회사 Method for entropy-encoding slice segment and apparatus therefor, and method for entropy-decoding slice segment and apparatus therefor
US20140341306A1 (en) * 2012-02-04 2014-11-20 Lg Electronics Inc. Video encoding method, video decoding method, and device using same
US20150023409A1 (en) * 2012-04-13 2015-01-22 Fraunhofer-Gesellschaft Zur Foerderung Der Angewandten Forschung E.V. Low delay picture coding
US9042440B2 (en) 2010-12-03 2015-05-26 Qualcomm Incorporated Coding the position of a last significant coefficient within a video block based on a scanning order for the block in video coding
US20150195538A1 (en) * 2011-06-24 2015-07-09 Dolby International Ab Method for Encoding and Decoding Images, Encoding and Decoding Device, and Corresponding Computer Programs
US20150237367A1 (en) * 2008-11-11 2015-08-20 Samsung Electronics Co., Ltd. Moving picture encoding/decoding apparatus and method for processing of moving picture divided in units of slices
US20150264331A1 (en) * 2009-04-07 2015-09-17 Lg Electronics Inc. Broadcast transmitter, broadcast receiver and 3d video data processing method thereof
US9167253B2 (en) 2011-06-28 2015-10-20 Qualcomm Incorporated Derivation of the position in scan order of the last significant transform coefficient in video coding
US20150334418A1 (en) * 2012-12-27 2015-11-19 Nippon Telegraph And Telephone Corporation Image encoding method, image decoding method, image encoding apparatus, image decoding apparatus, image encoding program, and image decoding program
US9197890B2 (en) 2011-03-08 2015-11-24 Qualcomm Incorporated Harmonized scan order for coding transform coefficients in video coding
US20150365917A1 (en) * 2010-11-23 2015-12-17 Lg Electronics Inc. Broadcast signal transmitting apparatus, broadcast signal receiving apparatus, and broadcast signal transceiving method in broadcast signal transmitting and receiving apparatuses
US9264727B2 (en) 2011-06-29 2016-02-16 Panasonic Intellectual Property Corporation Of America Image decoding method including determining a context for a current block according to a signal type under which a control parameter for the current block is classified
US9271002B2 (en) 2011-06-24 2016-02-23 Panasonic Intellectual Property Corporation Of America Coding method and coding apparatus
US9282330B1 (en) * 2011-07-13 2016-03-08 Google Inc. Method and apparatus for data compression using content-based features
US9363525B2 (en) 2011-06-28 2016-06-07 Sun Patent Trust Image decoding method, image coding method, image decoding apparatus, image coding apparatus, and image coding and decoding apparatus
US9413387B2 (en) * 2014-09-19 2016-08-09 Imagination Technologies Limited Data compression using entropy encoding
US20160234501A1 (en) * 2015-02-11 2016-08-11 Futurewei Technologies, Inc. Apparatus and Method for Compressing Color Index Map
US9462282B2 (en) 2011-07-11 2016-10-04 Sun Patent Trust Image decoding method, image coding method, image decoding apparatus, image coding apparatus, and image coding and decoding apparatus
US9484952B2 (en) 2011-11-03 2016-11-01 Qualcomm Incorporated Context state and probability initialization for context adaptive entropy coding
US9549221B2 (en) * 2013-12-26 2017-01-17 Sony Corporation Signal switching apparatus and method for controlling operation thereof
US9769635B2 (en) 2010-11-23 2017-09-19 Lg Electronics Inc. Broadcast signal transmitting apparatus, broadcast signal receiving apparatus, and broadcast signal transceiving method in broadcasting signal transmitting and receiving apparatuses
US9788078B2 (en) * 2014-03-25 2017-10-10 Samsung Electronics Co., Ltd. Enhanced distortion signaling for MMT assets and ISOBMFF with improved MMT QoS descriptor having multiple QoE operating points
US9794578B2 (en) 2011-06-24 2017-10-17 Sun Patent Trust Coding method and coding apparatus
US9973781B2 (en) 2012-06-29 2018-05-15 Ge Video Compression, Llc Video data stream concept
US10033406B2 (en) 2008-12-03 2018-07-24 Hfi Innovation Inc. Method for performing parallel coding with ordered entropy slices, and associated apparatus
US10049427B1 (en) * 2017-08-15 2018-08-14 Apple Inc. Image data high throughput predictive compression systems and methods
USRE47366E1 (en) 2011-06-23 2019-04-23 Sun Patent Trust Image decoding method and apparatus based on a signal type of the control parameter of the current block
US10306238B2 (en) * 2013-04-16 2019-05-28 Fastvdo Llc Adaptive coding, transmission and efficient display of multimedia (ACTED)
US10439637B2 (en) 2011-06-30 2019-10-08 Sun Patent Trust Image decoding method, image coding method, image decoding apparatus, image coding apparatus, and image coding and decoding apparatus
US10455244B2 (en) * 2014-11-14 2019-10-22 Lg Electronics Inc. Method and device for entropy encoding or entropy decoding video signal for high-capacity parallel processing
US20220132180A1 (en) * 2011-09-14 2022-04-28 Tivo Corporation Fragment server directed device fragment caching
US11330272B2 (en) 2010-12-22 2022-05-10 Qualcomm Incorporated Using a most probable scanning order to efficiently code scanning order information for a video block in video coding
US11647197B2 (en) 2011-06-30 2023-05-09 Velos Media, Llc Context initialization based on slice header flag and slice type
US11895293B2 (en) 2010-09-24 2024-02-06 Sun Patent Trust Image coding method, image decoding method, image coding apparatus, and image decoding apparatus
US11997319B2 (en) 2012-01-20 2024-05-28 Ge Video Compression, Llc Coding concept allowing parallel processing, transport demultiplexer and video bitstream
US12267618B2 (en) * 2020-06-23 2025-04-01 Huawei Technologies Co., Ltd. Video transmission method, apparatus, and system
US20250159103A1 (en) * 2023-11-10 2025-05-15 Canon Kabushiki Kaisha Communication apparatus, method of controlling communication apparatus, and storage medium

Families Citing this family (55)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8576918B2 (en) * 2007-07-09 2013-11-05 Broadcom Corporation Method and apparatus for signaling and decoding AVS1-P2 bitstreams of different versions
US9602821B2 (en) * 2008-10-01 2017-03-21 Nvidia Corporation Slice ordering for video encoding
TWI428023B (en) * 2008-11-18 2014-02-21 Ind Tech Res Inst Decoding method and device
BR112012009446B1 (en) 2009-10-20 2023-03-21 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V DATA STORAGE METHOD AND DEVICE
BR112012017256B1 (en) 2010-01-12 2021-08-31 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V Audio encoder, audio decoder, encoding method and audio information and method of decoding an audio information using a hash table describing both significant state values and range boundaries
US8319672B2 (en) * 2010-04-09 2012-11-27 Korea Electronics Technology Institute Decoding device for context-based adaptive binary arithmetic coding (CABAC) technique
US20120014433A1 (en) * 2010-07-15 2012-01-19 Qualcomm Incorporated Entropy coding of bins across bin groups using variable length codewords
US10349070B2 (en) 2010-09-30 2019-07-09 Texas Instruments Incorporated Simplified binary arithmetic coding engine
US8902988B2 (en) * 2010-10-01 2014-12-02 Qualcomm Incorporated Zero-out of high frequency coefficients and entropy coding retained coefficients using a joint context model
WO2012048055A1 (en) 2010-10-05 2012-04-12 General Instrument Corporation Coding and decoding utilizing adaptive context model selection with zigzag scan
US8976861B2 (en) 2010-12-03 2015-03-10 Qualcomm Incorporated Separately coding the position of a last significant coefficient of a video block in video coding
US8913662B2 (en) * 2011-01-06 2014-12-16 Qualcomm Incorporated Indicating intra-prediction mode selection for video coding using CABAC
US8938001B1 (en) * 2011-04-05 2015-01-20 Google Inc. Apparatus and method for coding using combinations
US8446301B2 (en) 2011-04-15 2013-05-21 Research In Motion Limited Methods and devices for coding and decoding the position of the last significant coefficient
MY170940A (en) 2011-06-16 2019-09-19 Ge Video Compression Llc Entropy coding of motion vector differences
AU2011203174B2 (en) * 2011-06-29 2014-09-04 Canon Kabushiki Kaisha Method, apparatus and system for determining the value of a coefficient flag
UA114674C2 (en) 2011-07-15 2017-07-10 ДЖ.І. ВІДІЕУ КЕМПРЕШН, ЛЛСі CONTEXT INITIALIZATION IN ENTHROPIC CODING
RU2601167C2 (en) * 2011-07-18 2016-10-27 Сан Пэтент Траст Image encoding method, image decoding method, image encoding device, image decoding device and image encoding and decoding device
US9756360B2 (en) 2011-07-19 2017-09-05 Qualcomm Incorporated Coefficient scanning in video coding
US8891616B1 (en) 2011-07-27 2014-11-18 Google Inc. Method and apparatus for entropy encoding based on encoding cost
US9578336B2 (en) 2011-08-31 2017-02-21 Texas Instruments Incorporated Hybrid video and graphics system with automatic content detection process, and other circuits, processes, and systems
PL3754982T3 (en) 2011-09-29 2024-09-02 Sharp Kabushiki Kaisha Image decoding device, image decoding method, image encoding method and image encoding device for performing bi-prediction to uni-prediction conversion
CN103858430B (en) * 2011-09-29 2017-05-03 夏普株式会社 Image decoding apparatus, image decoding method and image encoding apparatus
US9088796B2 (en) 2011-11-07 2015-07-21 Sharp Kabushiki Kaisha Video decoder with enhanced CABAC decoding
US20130114667A1 (en) * 2011-11-08 2013-05-09 Sony Corporation Binarisation of last position for higher throughput
WO2013074088A1 (en) * 2011-11-15 2013-05-23 Intel Corporation Video encoder with 2-bin per clock cabac encoding
US9247257B1 (en) 2011-11-30 2016-01-26 Google Inc. Segmentation based entropy encoding and decoding
US9503717B2 (en) 2012-01-09 2016-11-22 Texas Instruments Incorporated Context adaptive binary arithmetic coding (CABAC) with scalable throughput and coding efficiency
US11039138B1 (en) 2012-03-08 2021-06-15 Google Llc Adaptive coding of prediction modes using probability distributions
WO2013140722A1 (en) * 2012-03-21 2013-09-26 Panasonic Corporation Image encoding method, image decoding method, image encoding device, image decoding device, and image encoding/decoding device
US9681133B2 (en) 2012-03-29 2017-06-13 Intel Corporation Two bins per clock CABAC decoding
US10129540B2 (en) 2012-04-10 2018-11-13 Texas Instruments Incorporated Reduced complexity coefficient transmission for adaptive loop filtering (ALF) in video coding
US9942571B2 (en) 2012-05-29 2018-04-10 Hfi Innovations Inc. Method and apparatus for coding of sample adaptive offset information
US9774856B1 (en) 2012-07-02 2017-09-26 Google Inc. Adaptive stochastic entropy coding
US9712817B1 (en) 2013-02-15 2017-07-18 Apple Inc. Lossless video coding systems and methods
US9509998B1 (en) 2013-04-04 2016-11-29 Google Inc. Conditional predictive multi-symbol run-length coding
US9392292B2 (en) 2013-09-27 2016-07-12 Apple Inc. Parallel encoding of bypass binary symbols in CABAC encoder
US9351003B2 (en) * 2013-09-27 2016-05-24 Apple Inc. Context re-mapping in CABAC encoder
US9392288B2 (en) 2013-10-17 2016-07-12 Google Inc. Video coding using scatter-based scan tables
US9179151B2 (en) 2013-10-18 2015-11-03 Google Inc. Spatial proximity context entropy coding
US9455743B2 (en) * 2014-05-27 2016-09-27 Qualcomm Incorporated Dedicated arithmetic encoding instruction
EP3269141B1 (en) * 2015-05-19 2021-06-23 MediaTek Inc. Method and apparatus for multi-table based context adaptive binary arithmetic coding
US20170064321A1 (en) * 2015-08-27 2017-03-02 Stmicroelectronics International N.V. System and method for decoding a video digital data stream using a table of range values and probable symbols
CN112929659B (en) * 2015-10-13 2023-12-26 Samsung Electronics Co., Ltd. Methods and devices for encoding or decoding images
US10264264B2 (en) * 2016-09-24 2019-04-16 Apple Inc. Multi-bin decoding systems and methods
US10694202B2 (en) * 2016-12-01 2020-06-23 Qualcomm Incorporated Indication of bilateral filter usage in video coding
US9992496B1 (en) * 2017-11-15 2018-06-05 Google Llc Bin string coding based on a most probable symbol
WO2019140083A1 (en) 2018-01-12 2019-07-18 Futurewei Technologies, Inc. Adaptive multi-hypothesis context-adaptive binary arithmetic coding (mcabac)
CN110324625B (en) * 2018-03-31 2023-04-18 Huawei Technologies Co., Ltd. Video decoding method and device
US10922026B2 (en) 2018-11-01 2021-02-16 Fungible, Inc. Data processing unit having hardware-based range encoding and decoding
US10511324B1 (en) * 2018-11-01 2019-12-17 Fungible, Inc. Data processing unit having hardware-based range encoding and decoding
US10778980B2 (en) * 2018-11-14 2020-09-15 Mediatek Inc. Entropy decoding apparatus with context pre-fetch and miss handling and associated entropy decoding method
CN110058932B (en) * 2019-04-19 2021-08-27 Shenzhen Institutes of Advanced Technology, Chinese Academy of Sciences Storage method and storage system for data stream driving calculation
CN111181568A (en) * 2020-01-10 2020-05-19 Shenzhen Huaguo Gongshe Business Service Co., Ltd. Data compression device and method, data decompression device and method
FR3157037A1 (en) * 2023-12-19 2025-06-20 Stmicroelectronics International N.V. Computer system and method for arithmetic decoding

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060233240A1 (en) * 2005-04-13 2006-10-19 Samsung Electronics Co., Ltd. Context-based adaptive arithmetic coding and decoding methods and apparatuses with improved coding efficiency and video coding and decoding methods and apparatuses using the same
US20090245349A1 (en) * 2008-03-28 2009-10-01 Jie Zhao Methods and Systems for Parallel Video Encoding and Decoding

Family Cites Families (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3132456B2 (en) * 1998-03-05 2001-02-05 NEC Corporation Hierarchical image coding method and hierarchical image decoding method
WO2003092169A1 (en) * 2002-04-26 2003-11-06 Ntt Docomo, Inc. Signal encoding method, signal decoding method, signal encoding device, signal decoding device, signal encoding program, and signal decoding program
US6900748B2 (en) * 2003-07-17 2005-05-31 Fraunhofer-Gesellschaft Zur Foerderung Der Angewandten Forschung E.V. Method and apparatus for binarization and arithmetic coding of a data value
US20060104521A1 (en) * 2004-11-15 2006-05-18 Shu-Wen Teng Image processing devices and methods
US20060104351A1 (en) * 2004-11-15 2006-05-18 Shu-Wen Teng Video/image processing devices and methods
US7242328B1 (en) * 2006-02-03 2007-07-10 Cisco Technology, Inc. Variable length coding for sparse coefficients
JP2007300517A (en) * 2006-05-02 2007-11-15 Sony Corp Moving image processing method, moving image processing method program, recording medium storing moving image processing method program, and moving image processing apparatus
US8306125B2 (en) * 2006-06-21 2012-11-06 Digital Video Systems, Inc. 2-bin parallel decoder for advanced video processing
US7525459B2 (en) * 2007-04-19 2009-04-28 Analog Devices, Inc. Simplified programmable compute system for executing an H.264 binary decode symbol instruction
US7710296B2 (en) 2007-09-19 2010-05-04 Texas Instruments Incorporated N-bin arithmetic coding for context adaptive binary arithmetic coding
US7522076B1 (en) * 2007-10-16 2009-04-21 Mediatek Inc. Parallel context adaptive binary arithmetic coding
JP4893657B2 (en) * 2008-02-29 2012-03-07 Sony Corporation Arithmetic decoding device

Cited By (298)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9503745B2 (en) 2008-03-28 2016-11-22 Dolby International Ab Methods, devices and systems for parallel video encoding and decoding
US9681143B2 (en) 2008-03-28 2017-06-13 Dolby International Ab Methods, devices and systems for parallel video encoding and decoding
US10652585B2 (en) 2008-03-28 2020-05-12 Dolby International Ab Methods, devices and systems for parallel video encoding and decoding
US20140241438A1 (en) 2008-03-28 2014-08-28 Sharp Kabushiki Kaisha Methods, devices and systems for parallel video encoding and decoding
US8542748B2 (en) 2008-03-28 2013-09-24 Sharp Laboratories Of America, Inc. Methods and systems for parallel video encoding and decoding
US20110026604A1 (en) * 2008-03-28 2011-02-03 Jie Zhao Methods, devices and systems for parallel video encoding and decoding
US11438634B2 (en) 2008-03-28 2022-09-06 Dolby International Ab Methods, devices and systems for parallel video encoding and decoding
US12231699B2 (en) 2008-03-28 2025-02-18 Dolby International Ab Methods, devices and systems for parallel video encoding and decoding
US10958943B2 (en) 2008-03-28 2021-03-23 Dolby International Ab Methods, devices and systems for parallel video encoding and decoding
US9681144B2 (en) 2008-03-28 2017-06-13 Dolby International Ab Methods, devices and systems for parallel video encoding and decoding
US20100027680A1 (en) * 2008-03-28 2010-02-04 Segall Christopher A Methods and Systems for Parallel Video Encoding and Decoding
US9930369B2 (en) 2008-03-28 2018-03-27 Dolby International Ab Methods, devices and systems for parallel video encoding and decoding
US9473772B2 (en) 2008-03-28 2016-10-18 Dolby International Ab Methods, devices and systems for parallel video encoding and decoding
US8824541B2 (en) * 2008-03-28 2014-09-02 Sharp Kabushiki Kaisha Methods, devices and systems for parallel video encoding and decoding
US10484720B2 (en) * 2008-03-28 2019-11-19 Dolby International Ab Methods, devices and systems for parallel video encoding and decoding
US10284881B2 (en) 2008-03-28 2019-05-07 Dolby International Ab Methods, devices and systems for parallel video encoding and decoding
US20090245349A1 (en) * 2008-03-28 2009-10-01 Jie Zhao Methods and Systems for Parallel Video Encoding and Decoding
US11838558B2 (en) 2008-03-28 2023-12-05 Dolby International Ab Methods, devices and systems for parallel video encoding and decoding
US20150237368A1 (en) * 2008-11-11 2015-08-20 Samsung Electronics Co., Ltd. Moving picture encoding/decoding apparatus and method for processing of moving picture divided in units of slices
US20150237367A1 (en) * 2008-11-11 2015-08-20 Samsung Electronics Co., Ltd. Moving picture encoding/decoding apparatus and method for processing of moving picture divided in units of slices
US9432687B2 (en) 2008-11-11 2016-08-30 Samsung Electronics Co., Ltd. Moving picture encoding/decoding apparatus and method for processing of moving picture divided in units of slices
US20150237369A1 (en) * 2008-11-11 2015-08-20 Samsung Electronics Co., Ltd. Moving picture encoding/decoding apparatus and method for processing of moving picture divided in units of slices
US9467699B2 (en) * 2008-12-03 2016-10-11 Hfi Innovation Inc. Method for performing parallel coding with ordered entropy slices, and associated apparatus
US10033406B2 (en) 2008-12-03 2018-07-24 Hfi Innovation Inc. Method for performing parallel coding with ordered entropy slices, and associated apparatus
US20100135416A1 (en) * 2008-12-03 2010-06-03 Yu-Wen Huang Method for performing parallel coding with ordered entropy slices, and associated apparatus
US20100322317A1 (en) * 2008-12-08 2010-12-23 Naoki Yoshimatsu Image decoding apparatus and image decoding method
US20150264331A1 (en) * 2009-04-07 2015-09-17 Lg Electronics Inc. Broadcast transmitter, broadcast receiver and 3d video data processing method thereof
US10129525B2 (en) 2009-04-07 2018-11-13 Lg Electronics Inc. Broadcast transmitter, broadcast receiver and 3D video data processing method thereof
US9762885B2 (en) 2009-04-07 2017-09-12 Lg Electronics Inc. Broadcast transmitter, broadcast receiver and 3D video data processing method thereof
US9756311B2 (en) * 2009-04-07 2017-09-05 Lg Electronics Inc. Broadcast transmitter, broadcast receiver and 3D video data processing method thereof
US8294603B2 (en) * 2009-06-30 2012-10-23 Massachusetts Institute Of Technology System and method for providing high throughput entropy coding using syntax element partitioning
US20110001643A1 (en) * 2009-06-30 2011-01-06 Massachusetts Institute Of Technology System and method for providing high throughput entropy coding using syntax element partitioning
US20120201294A1 (en) * 2009-10-14 2012-08-09 Segall Christopher A Methods for parallel video encoding and decoding
US20110090954A1 (en) * 2009-10-21 2011-04-21 Cohen Robert A Video Codes with Directional Transforms
US20120281754A1 (en) * 2010-01-06 2012-11-08 Sony Corporation Device and method for processing image
US9973768B2 (en) 2010-03-16 2018-05-15 Texas Instruments Incorporated CABAC decoder with decoupled arithmetic decoding and inverse binarization
US10944979B2 (en) 2010-03-16 2021-03-09 Texas Instruments Incorporated CABAC decoder with decoupled arithmetic decoding and inverse binarization
US12425619B2 (en) 2010-03-16 2025-09-23 Texas Instruments Incorporated CABAC decoder with decoupled arithmetic decoding and inverse binarization
US20110228858A1 (en) * 2010-03-16 2011-09-22 Madhukar Budagavi CABAC Decoder with Decoupled Arithmetic Decoding and Inverse Binarization
US11843794B2 (en) 2010-03-16 2023-12-12 Texas Instruments Incorporated CABAC decoder with decoupled arithmetic decoding and inverse binarization
US11546620B2 (en) 2010-03-16 2023-01-03 Texas Instruments Incorporated CABAC decoder with decoupled arithmetic decoding and inverse binarization
US20160261891A1 (en) * 2010-04-05 2016-09-08 Samsung Electronics Co., Ltd. Low complexity entropy-encoding/decoding method and apparatus
US9369736B2 (en) * 2010-04-05 2016-06-14 Samsung Electronics Co., Ltd. Low complexity entropy-encoding/decoding method and apparatus
US9866875B2 (en) 2010-04-05 2018-01-09 Samsung Electronics Co., Ltd. Low complexity entropy-encoding/decoding method and apparatus
US9602845B2 (en) * 2010-04-05 2017-03-21 Samsung Electronics Co., Ltd. Low complexity entropy-encoding/decoding method and apparatus
US20110243226A1 (en) * 2010-04-05 2011-10-06 Samsung Electronics Co., Ltd. Low complexity entropy-encoding/decoding method and apparatus
US10027991B2 (en) 2010-04-05 2018-07-17 Samsung Electronics Co., Ltd. Low complexity entropy-encoding/decoding method and apparatus
US10158890B2 (en) 2010-04-05 2018-12-18 Samsung Electronics Co., Ltd. Low complexity entropy-encoding/decoding method and apparatus
US20130083858A1 (en) * 2010-05-24 2013-04-04 Nec Corporation Video image delivery system, video image transmission device, video image delivery method, and video image delivery program
CN102263948A (en) * 2010-05-27 2011-11-30 飞思卡尔半导体公司 Video processing system, computer program product and method for decoding an encoded video stream
US20110293019A1 (en) * 2010-05-27 2011-12-01 Freescale Semiconductor Inc. Video processing system, computer program product and method for decoding an encoded video stream
US10623741B2 (en) 2010-07-08 2020-04-14 Texas Instruments Incorporated Method and apparatus for sub-picture based raster scanning coding order
US11425383B2 (en) 2010-07-08 2022-08-23 Texas Instruments Incorporated Method and apparatus for sub-picture based raster scanning coding order
US8988531B2 (en) * 2010-07-08 2015-03-24 Texas Instruments Incorporated Method and apparatus for sub-picture based raster scanning coding order
US11800109B2 (en) * 2010-07-08 2023-10-24 Texas Instruments Incorporated Method and apparatus for sub-picture based raster scanning coding order
US20120007992A1 (en) * 2010-07-08 2012-01-12 Texas Instruments Incorporated Method and Apparatus for Sub-Picture Based Raster Scanning Coding Order
US20240048707A1 (en) * 2010-07-08 2024-02-08 Texas Instruments Incorporated Sub-picture based raster scanning coding order
US12192465B2 (en) * 2010-07-08 2025-01-07 Texas Instruments Incorporated Sub-picture based raster scanning coding order
US10939113B2 (en) 2010-07-08 2021-03-02 Texas Instruments Incorporated Method and apparatus for sub-picture based raster scanning coding order
US10574992B2 (en) 2010-07-08 2020-02-25 Texas Instruments Incorporated Method and apparatus for sub-picture based raster scanning coding order
US10110901B2 (en) 2010-07-08 2018-10-23 Texas Instruments Incorporated Method and apparatus for sub-picture based raster scanning coding order
US20220329808A1 (en) * 2010-07-08 2022-10-13 Texas Instruments Incorporated Method and apparatus for sub-picture based raster scanning coding order
US20120014431A1 (en) * 2010-07-14 2012-01-19 Jie Zhao Methods and Systems for Parallel Video Encoding and Parallel Video Decoding
US12294698B2 (en) 2010-09-24 2025-05-06 Sun Patent Trust Image coding method, image decoding method, image coding apparatus, and image decoding apparatus
US11895293B2 (en) 2010-09-24 2024-02-06 Sun Patent Trust Image coding method, image decoding method, image coding apparatus, and image decoding apparatus
US8344917B2 (en) 2010-09-30 2013-01-01 Sharp Laboratories Of America, Inc. Methods and systems for context initialization in video coding and decoding
US10341662B2 (en) 2010-10-01 2019-07-02 Velos Media, Llc Methods and systems for entropy coder initialization
US20120082218A1 (en) * 2010-10-01 2012-04-05 Kiran Misra Methods and Systems for Entropy Coder Initialization
US10999579B2 (en) 2010-10-01 2021-05-04 Velos Media, Llc Methods and systems for decoding a video bitstream
US9313514B2 (en) * 2010-10-01 2016-04-12 Sharp Kabushiki Kaisha Methods and systems for entropy coder initialization
US10659786B2 (en) 2010-10-01 2020-05-19 Velos Media, Llc Methods and systems for decoding a video bitstream
US9756610B2 (en) * 2010-11-23 2017-09-05 Lg Electronics Inc. Broadcast signal transmitting apparatus, broadcast signal receiving apparatus, and broadcast signal transceiving method in broadcast signal transmitting and receiving apparatuses
US20150365917A1 (en) * 2010-11-23 2015-12-17 Lg Electronics Inc. Broadcast signal transmitting apparatus, broadcast signal receiving apparatus, and broadcast signal transceiving method in broadcast signal transmitting and receiving apparatuses
US9769635B2 (en) 2010-11-23 2017-09-19 Lg Electronics Inc. Broadcast signal transmitting apparatus, broadcast signal receiving apparatus, and broadcast signal transceiving method in broadcasting signal transmitting and receiving apparatuses
US9055290B2 (en) 2010-12-03 2015-06-09 Qualcomm Incorporated Coding the position of a last significant coefficient within a video block based on a scanning order for the block in video coding
US9042440B2 (en) 2010-12-03 2015-05-26 Qualcomm Incorporated Coding the position of a last significant coefficient within a video block based on a scanning order for the block in video coding
US11330272B2 (en) 2010-12-22 2022-05-10 Qualcomm Incorporated Using a most probable scanning order to efficiently code scanning order information for a video block in video coding
US9930336B2 (en) * 2011-02-24 2018-03-27 Sun Patent Trust Arithmetic decoding method and arithmetic coding method
US9319675B2 (en) * 2011-02-24 2016-04-19 Panasonic Intellectual Property Corporation Of America Arithmetic decoding method and arithmetic coding method
US10462463B2 (en) 2011-02-24 2019-10-29 Sun Patent Trust Arithmetic decoding method and arithmetic coding method
US20130142253A1 (en) * 2011-02-24 2013-06-06 Hisao Sasai Arithmetic decoding method and arithmetic coding method
US10757414B2 (en) 2011-02-24 2020-08-25 Sun Patent Trust Arithmetic decoding method and arithmetic coding method
US9247261B2 (en) * 2011-03-04 2016-01-26 Vixs Systems, Inc. Video decoder with pipeline processing and methods for use therewith
US20120224643A1 (en) * 2011-03-04 2012-09-06 Vixs Systems, Inc. Video decoder with pipeline processing and methods for use therewith
US11736723B2 (en) 2011-03-07 2023-08-22 Dolby International Ab Method of coding and decoding images, coding and decoding device and computer programs corresponding thereto
US11343535B2 (en) 2011-03-07 2022-05-24 Dolby International Ab Method of coding and decoding images, coding and decoding device and computer programs corresponding thereto
US10382784B2 (en) 2011-03-07 2019-08-13 Dolby International Ab Method of coding and decoding images, coding and decoding device and computer programs corresponding thereto
US9560380B2 (en) * 2011-03-07 2017-01-31 Dolby International Ab Coding and decoding images using probability data
US10681376B2 (en) 2011-03-07 2020-06-09 Dolby International Ab Method of coding and decoding images, coding and decoding device and computer programs corresponding thereto
US9628818B2 (en) 2011-03-07 2017-04-18 Dolby International Ab Method of coding and decoding images, coding and decoding device and computer programs corresponding thereto
US12177478B2 (en) 2011-03-07 2024-12-24 Dolby International Ab Method of coding and decoding images, coding and decoding device and computer programs corresponding thereto
US20140016700A1 (en) * 2011-03-07 2014-01-16 Orange Method of coding and decoding images, coding and decoding device and computer programs corresponding thereto
US9271012B2 (en) 2011-03-07 2016-02-23 Dolby International Ab Method of coding and decoding images, coding and decoding device and computer programs corresponding thereto
US9338449B2 (en) 2011-03-08 2016-05-10 Qualcomm Incorporated Harmonized scan order for coding transform coefficients in video coding
US10499059B2 (en) 2011-03-08 2019-12-03 Velos Media, Llc Coding of transform coefficients for video coding
US9197890B2 (en) 2011-03-08 2015-11-24 Qualcomm Incorporated Harmonized scan order for coding transform coefficients in video coding
US10397577B2 (en) 2011-03-08 2019-08-27 Velos Media, Llc Inverse scan order for significance map coding of transform coefficients in video coding
US20130058407A1 (en) * 2011-03-08 2013-03-07 Qualcomm Incorporated Coding of transform coefficients for video coding
US9106913B2 (en) * 2011-03-08 2015-08-11 Qualcomm Incorporated Coding of transform coefficients for video coding
US11006114B2 (en) 2011-03-08 2021-05-11 Velos Media, Llc Coding of transform coefficients for video coding
US11405616B2 (en) 2011-03-08 2022-08-02 Qualcomm Incorporated Coding of transform coefficients for video coding
US9445114B2 (en) * 2011-03-10 2016-09-13 Canon Kabushiki Kaisha Method and device for determining slice boundaries based on multiple video encoding processes
US20120230397A1 (en) * 2011-03-10 2012-09-13 Canon Kabushiki Kaisha Method and device for encoding image data, and method and device for decoding image data
EP2533538A3 (en) * 2011-06-10 2013-03-20 Research In Motion Limited Method and system to reduce modelling overhead for data compression
EP4404558A3 (en) * 2011-06-23 2024-10-23 Sun Patent Trust Image decoding device, image encoding device
USRE49906E1 (en) 2011-06-23 2024-04-02 Sun Patent Trust Image decoding method and apparatus based on a signal type of the control parameter of the current block
EP2725791A4 (en) * 2011-06-23 2014-12-03 Panasonic Ip Corp America Image decoding method, image encoding method, image encoding device, image decoding device, and image encoding/decoding device
EP4228264A1 (en) * 2011-06-23 2023-08-16 Sun Patent Trust Image decoding device, image encoding device
USRE47366E1 (en) 2011-06-23 2019-04-23 Sun Patent Trust Image decoding method and apparatus based on a signal type of the control parameter of the current block
CN106878731A (en) * 2011-06-23 2017-06-20 Sun Patent Trust Image encoding method and image encoding device
USRE48810E1 (en) 2011-06-23 2021-11-02 Sun Patent Trust Image decoding method and apparatus based on a signal type of the control parameter of the current block
USRE47537E1 (en) 2011-06-23 2019-07-23 Sun Patent Trust Image decoding method and apparatus based on a signal type of the control parameter of the current block
USRE47547E1 (en) 2011-06-23 2019-07-30 Sun Patent Trust Image decoding method and apparatus based on a signal type of the control parameter of the current block
CN103609113A (en) * 2011-06-23 2014-02-26 Panasonic Corporation Image decoding method, image encoding method, image decoding device, image encoding device, and image encoding/decoding device
US9635361B2 (en) 2011-06-24 2017-04-25 Sun Patent Trust Decoding method and decoding apparatus
US10033999B2 (en) 2011-06-24 2018-07-24 Dolby International Ab Method of coding and decoding images, coding and decoding device and computer programs corresponding thereto
US9654783B2 (en) 2011-06-24 2017-05-16 Dolby International Ab Method for encoding and decoding images, encoding and decoding device, and corresponding computer programs
US10200696B2 (en) 2011-06-24 2019-02-05 Sun Patent Trust Coding method and coding apparatus
US9661335B2 (en) 2011-06-24 2017-05-23 Dolby International Ab Method of coding and decoding images, coding and decoding device and computer programs corresponding thereto
US10182246B2 (en) 2011-06-24 2019-01-15 Sun Patent Trust Image decoding method, image coding method, image decoding apparatus, image coding apparatus, and image coding and decoding apparatus
US9271002B2 (en) 2011-06-24 2016-02-23 Panasonic Intellectual Property Corporation Of America Coding method and coding apparatus
US10638164B2 (en) 2011-06-24 2020-04-28 Sun Patent Trust Image decoding method, image coding method, image decoding apparatus, image coding apparatus, and image coding and decoding apparatus
US11758158B2 (en) 2011-06-24 2023-09-12 Sun Patent Trust Coding method and coding apparatus
US9319693B2 (en) * 2011-06-24 2016-04-19 Dolby International Ab Method of coding and decoding images, coding and decoding device and computer programs corresponding thereto
US11109043B2 (en) 2011-06-24 2021-08-31 Sun Patent Trust Coding method and coding apparatus
US9319692B2 (en) 2011-06-24 2016-04-19 Dolby International Ab Method for encoding and decoding images, encoding and decoding device, and corresponding computer programs
US12231656B2 (en) 2011-06-24 2025-02-18 Sun Patent Trust Coding method and coding apparatus
US9380308B2 (en) 2011-06-24 2016-06-28 Dolby International Ab Method of coding and decoding images, coding and decoding device and computer programs corresponding thereto
US20150195538A1 (en) * 2011-06-24 2015-07-09 Dolby International Ab Method for Encoding and Decoding Images, Encoding and Decoding Device, and Corresponding Computer Programs
US12273524B2 (en) * 2011-06-24 2025-04-08 Dolby International Ab Method of coding and decoding images, coding and decoding device and computer programs corresponding thereto
US11457225B2 (en) 2011-06-24 2022-09-27 Sun Patent Trust Coding method and coding apparatus
US10694186B2 (en) 2011-06-24 2020-06-23 Dolby International Ab Method of coding and decoding images, coding and decoding device and computer programs corresponding thereto
US9794578B2 (en) 2011-06-24 2017-10-17 Sun Patent Trust Coding method and coding apparatus
US20150195537A1 (en) * 2011-06-24 2015-07-09 Dolby International Ab Method of Coding and Decoding Images, Coding and Decoding Device and Computer Programs Corresponding Thereto
US20230353740A1 (en) * 2011-06-24 2023-11-02 Dolby International Ab Method of Coding and Decoding Images, Coding and Decoding Device and Computer Programs Corresponding Thereto
US9319694B2 (en) * 2011-06-24 2016-04-19 Dolby International Ab Method for encoding and decoding images, encoding and decoding device, and corresponding computer programs
US9848196B2 (en) 2011-06-24 2017-12-19 Dolby International Ab Method of coding and decoding images, coding and decoding device and computer programs corresponding thereto
US10362311B2 (en) 2011-06-24 2019-07-23 Dolby International Ab Method of coding and decoding images, coding and decoding device and computer programs corresponding thereto
US9591311B2 (en) 2011-06-27 2017-03-07 Sun Patent Trust Image decoding method, image coding method, image decoding apparatus, image coding apparatus, and image coding and decoding apparatus
US9912961B2 (en) 2011-06-27 2018-03-06 Sun Patent Trust Image decoding method, image coding method, image decoding apparatus, image coding apparatus, and image coding and decoding apparatus
CN103563375A (en) * 2011-06-27 2014-02-05 Panasonic Corporation Image decoding method, image encoding method, image decoding device, image encoding device, and image encoding/decoding device
RU2608244C2 (en) * 2011-06-27 2017-01-17 Sun Patent Trust Image encoding method, image decoding method, image encoding device, image decoding device and apparatus for encoding and decoding images
US10687074B2 (en) 2011-06-27 2020-06-16 Sun Patent Trust Image decoding method, image coding method, image decoding apparatus, image coding apparatus, and image coding and decoding apparatus
US9154783B2 (en) 2011-06-27 2015-10-06 Panasonic Intellectual Property Corporation Of America Image decoding method, image coding method, image decoding apparatus, image coding apparatus, and image coding and decoding apparatus
CN106878715A (en) * 2011-06-27 2017-06-20 Sun Patent Trust Coding method and coding device
US9363525B2 (en) 2011-06-28 2016-06-07 Sun Patent Trust Image decoding method, image coding method, image decoding apparatus, image coding apparatus, and image coding and decoding apparatus
US10750184B2 (en) 2011-06-28 2020-08-18 Sun Patent Trust Image decoding method, image coding method, image decoding apparatus, image coding apparatus, and image coding and decoding apparatus
US10154264B2 (en) 2011-06-28 2018-12-11 Sun Patent Trust Image decoding method, image coding method, image decoding apparatus, image coding apparatus, and image coding and decoding apparatus
US9167253B2 (en) 2011-06-28 2015-10-20 Qualcomm Incorporated Derivation of the position in scan order of the last significant transform coefficient in video coding
US10237579B2 (en) 2011-06-29 2019-03-19 Sun Patent Trust Image decoding method including determining a context for a current block according to a signal type under which a control parameter for the current block is classified
US9264727B2 (en) 2011-06-29 2016-02-16 Panasonic Intellectual Property Corporation Of America Image decoding method including determining a context for a current block according to a signal type under which a control parameter for the current block is classified
US10652584B2 (en) 2011-06-29 2020-05-12 Sun Patent Trust Image decoding method including determining a context for a current block according to a signal type under which a control parameter for the current block is classified
US11647197B2 (en) 2011-06-30 2023-05-09 Velos Media, Llc Context initialization based on slice header flag and slice type
CN103583048A (en) * 2011-06-30 2014-02-12 Panasonic Corporation Image decoding method, image encoding method, image decoding device, image encoding device, and image encoding/decoding device
RU2645270C1 (en) * 2011-06-30 2018-02-19 Velos Media International Limited Context initialization based on decoder picture buffer
US11973950B2 (en) 2011-06-30 2024-04-30 Velos Media, Llc Context initialization based on slice header flag and slice type
US10439637B2 (en) 2011-06-30 2019-10-08 Sun Patent Trust Image decoding method, image coding method, image decoding apparatus, image coding apparatus, and image coding and decoding apparatus
US9338465B2 (en) * 2011-06-30 2016-05-10 Sharp Kabushiki Kaisha Context initialization based on decoder picture buffer
US10595022B2 (en) 2011-06-30 2020-03-17 Sun Patent Trust Image decoding method, image coding method, image decoding apparatus, image coding apparatus, and image coding and decoding apparatus
US11412226B2 (en) 2011-06-30 2022-08-09 Velos Media, Llc Context initialization based on slice header flag and slice type
US12489902B2 (en) 2011-06-30 2025-12-02 Velos Media, Llc Context initialization based on slice header flag and slice type
US9794571B2 (en) 2011-06-30 2017-10-17 Sun Patent Trust Image decoding method, image coding method, image decoding apparatus, image coding apparatus, and image coding and decoding apparatus
US10531091B2 (en) * 2011-06-30 2020-01-07 Velos Media, Llc Context initialization based on slice header flag and slice type
US10903848B2 (en) 2011-06-30 2021-01-26 Sun Patent Trust Image decoding method, image coding method, image decoding apparatus, image coding apparatus, and image coding and decoding apparatus
US10382760B2 (en) 2011-06-30 2019-08-13 Sun Patent Trust Image decoding method, image coding method, image decoding apparatus, image coding apparatus, and image coding and decoding apparatus
US10931951B2 (en) 2011-06-30 2021-02-23 Velos Media, Llc Context initialization based on slice header flag and slice type
US9060173B2 (en) * 2011-06-30 2015-06-16 Sharp Kabushiki Kaisha Context initialization based on decoder picture buffer
US10165277B2 (en) 2011-06-30 2018-12-25 Sun Patent Trust Image decoding method, image coding method, image decoding apparatus, image coding apparatus, and image coding and decoding apparatus
US20130003827A1 (en) * 2011-06-30 2013-01-03 Kiran Misra Context initialization based on decoder picture buffer
US9154780B2 (en) 2011-06-30 2015-10-06 Panasonic Intellectual Property Corporation Of America Image decoding method, image coding method, image decoding apparatus, image coding apparatus, and image coding and decoding apparatus
US12212753B2 (en) 2011-06-30 2025-01-28 Sun Patent Trust Image decoding method, image coding method, image decoding apparatus, image coding apparatus, and image coding and decoding apparatus
US10205948B2 (en) 2011-06-30 2019-02-12 Velos Media, Llc Context initialization based on slice header flag and slice type
US20130003830A1 (en) * 2011-06-30 2013-01-03 Kiran Misra Context initialization based on decoder picture buffer
US9525881B2 (en) 2011-06-30 2016-12-20 Sun Patent Trust Image decoding method, image coding method, image decoding apparatus, image coding apparatus, and image coding and decoding apparatus
US12375675B2 (en) 2011-06-30 2025-07-29 Velos Media, Llc Context initialization based on slice header flag and slice type
US11356666B2 (en) 2011-06-30 2022-06-07 Sun Patent Trust Image decoding method, image coding method, image decoding apparatus, image coding apparatus, and image coding and decoding apparatus
RU2597523C2 (en) * 2011-06-30 2016-09-10 Шарп Кабусики Кайся Context initialisation based on decoder picture buffer
US11792400B2 (en) 2011-06-30 2023-10-17 Sun Patent Trust Image decoding method, image coding method, image decoding apparatus, image coding apparatus, and image coding and decoding apparatus
EP2728869A4 (en) * 2011-06-30 2014-12-03 Panasonic Ip Corp America IMAGE DECODING METHOD, IMAGE ENCODING METHOD, IMAGE DECODING DEVICE, IMAGE ENCODING DEVICE, AND IMAGE ENCODING / DECODING DEVICE
US20190208207A1 (en) * 2011-06-30 2019-07-04 Velos Media, LLC> Context initialization based on slice header flag and slice type
US11343518B2 (en) 2011-07-11 2022-05-24 Sun Patent Trust Image decoding method, image coding method, image decoding apparatus, image coding apparatus, and image coding and decoding apparatus
US9462282B2 (en) 2011-07-11 2016-10-04 Sun Patent Trust Image decoding method, image coding method, image decoding apparatus, image coding apparatus, and image coding and decoding apparatus
US10154270B2 (en) 2011-07-11 2018-12-11 Sun Patent Trust Image decoding method, image coding method, image decoding apparatus, image coding apparatus, and image coding and decoding apparatus
US10575003B2 (en) 2011-07-11 2020-02-25 Sun Patent Trust Image decoding method, image coding method, image decoding apparatus, image coding apparatus, and image coding and decoding apparatus
US12108059B2 (en) 2011-07-11 2024-10-01 Sun Patent Trust Image decoding method, image coding method, image decoding apparatus, image coding apparatus, and image coding and decoding apparatus
US9854257B2 (en) 2011-07-11 2017-12-26 Sun Patent Trust Image decoding method, image coding method, image decoding apparatus, image coding apparatus, and image coding and decoding apparatus
US11770544B2 (en) 2011-07-11 2023-09-26 Sun Patent Trust Image decoding method, image coding method, image decoding apparatus, image coding apparatus, and image coding and decoding apparatus
US9282330B1 (en) * 2011-07-13 2016-03-08 Google Inc. Method and apparatus for data compression using content-based features
US10085035B2 (en) 2011-07-15 2018-09-25 Ge Video Compression, Llc Sample array coding for low-delay
US9596469B2 (en) * 2011-07-15 2017-03-14 Ge Video Compression, Llc Sample array coding for low-delay
US11019352B2 (en) * 2011-07-15 2021-05-25 Ge Video Compression, Llc Sample array coding for low-delay
AU2016200182B2 (en) * 2011-07-15 2017-05-04 Dolby Video Compression, Llc Sample array coding for low-delay
US11595675B2 (en) * 2011-07-15 2023-02-28 Ge Video Compression, Llc Sample array coding for low-delay
US20140105293A1 (en) * 2011-07-15 2014-04-17 Fraunhofer-Gesellschaft Zur Foerderung Der Angewandten Forschung E.V. Sample array coding for low-delay
US9729891B2 (en) 2011-07-15 2017-08-08 Ge Video Compression, Llc Sample array coding for low-delay
US10924754B2 (en) 2011-07-15 2021-02-16 Ge Video Compression, Llc Sample array coding for low-delay based on position information
US20240292010A1 (en) * 2011-07-15 2024-08-29 Ge Video Compression, Llc Sample array coding for low-delay
US10771800B2 (en) 2011-07-15 2020-09-08 Ge Video Compression, Llc Sample array coding for low-delay
US11949897B2 (en) 2011-07-15 2024-04-02 Ge Video Compression, Llc Sample array coding for low-delay
CN107959854A (en) * 2011-07-15 2018-04-24 Ge视频压缩有限责任公司 Decoder and method, encoder and method and storage medium
US10652564B2 (en) * 2011-07-15 2020-05-12 Ge Video Compression, Llc Sample array coding for low-delay
CN103797793A (en) * 2011-07-15 2014-05-14 弗兰霍菲尔运输应用研究公司 Low latency sample array encoding
US10659798B2 (en) * 2011-07-15 2020-05-19 Ge Video Compression, Llc Sample array coding for low-delay
US9860544B2 (en) 2011-07-15 2018-01-02 Ge Video Compression, Llc Sample array coding for low-delay
US9860547B2 (en) 2011-07-15 2018-01-02 Ge Video Compression, Llc Sample array coding for low-delay
US12389024B2 (en) * 2011-07-15 2025-08-12 Dolby Video Compression, Llc Sample array coding for low-delay
US9866857B2 (en) 2011-07-15 2018-01-09 Ge Video Compression, Llc Sample array coding for low-delay
US12052450B2 (en) * 2011-09-14 2024-07-30 Tivo Corporation Fragment server directed device fragment caching
US20240015343A1 (en) * 2011-09-14 2024-01-11 Tivo Corporation Fragment server directed device fragment caching
US20250133127A1 (en) * 2011-09-14 2025-04-24 Adeia Media Holdings Llc Fragment server directed device fragment caching
US11743519B2 (en) * 2011-09-14 2023-08-29 Tivo Corporation Fragment server directed device fragment caching
US20220132180A1 (en) * 2011-09-14 2022-04-28 Tivo Corporation Fragment server directed device fragment caching
US9332283B2 (en) 2011-09-27 2016-05-03 Broadcom Corporation Signaling of prediction size unit in accordance with video coding
EP2575366A3 (en) * 2011-09-27 2013-09-11 Broadcom Corporation Signaling of coding unit prediction and prediction unit partition mode for video coding
US20130101047A1 (en) * 2011-10-19 2013-04-25 Sony Corporation Context reduction of significance map coding of 4x4 and 8x8 transform coefficient in hm4.0
US9484952B2 (en) 2011-11-03 2016-11-01 Qualcomm Incorporated Context state and probability initialization for context adaptive entropy coding
US9277241B2 (en) 2011-11-08 2016-03-01 Qualcomm Incorporated Context reduction for context adaptive binary arithmetic coding
US9237358B2 (en) * 2011-11-08 2016-01-12 Qualcomm Incorporated Context reduction for context adaptive binary arithmetic coding
US9451287B2 (en) 2011-11-08 2016-09-20 Qualcomm Incorporated Context reduction for context adaptive binary arithmetic coding
US9288508B2 (en) 2011-11-08 2016-03-15 Qualcomm Incorporated Context reduction for context adaptive binary arithmetic coding
US20130114671A1 (en) * 2011-11-08 2013-05-09 Qualcomm Incorporated Context reduction for context adaptive binary arithmetic coding
US9172976B2 (en) 2011-11-08 2015-10-27 Qualcomm Incorporated Context reduction for context adaptive binary arithmetic coding
CN104067524A (en) * 2012-01-19 2014-09-24 高通股份有限公司 Context adaptive entropy coding with a reduced initialization value set
CN104067524B (en) * 2012-01-19 2017-06-13 高通股份有限公司 Context-adaptive entropy coding with the initialization value set simplified
US9654772B2 (en) * 2012-01-19 2017-05-16 Qualcomm Incorporated Context adaptive entropy coding with a reduced initialization value set
US20130188700A1 (en) * 2012-01-19 2013-07-25 Qualcomm Incorporated Context adaptive entropy coding with a reduced initialization value set
US11997319B2 (en) 2012-01-20 2024-05-28 Ge Video Compression, Llc Coding concept allowing parallel processing, transport demultiplexer and video bitstream
US10893284B2 (en) 2012-02-02 2021-01-12 Texas Instruments Incorporated Sub-pictures for pixel rate balancing on multi-core platforms
US20130202051A1 (en) * 2012-02-02 2013-08-08 Texas Instruments Incorporated Sub-Pictures for Pixel Rate Balancing on Multi-Core Platforms
US10244246B2 (en) * 2012-02-02 2019-03-26 Texas Instruments Incorporated Sub-pictures for pixel rate balancing on multi-core platforms
US12470727B2 (en) 2012-02-02 2025-11-11 Texas Instruments Incorporated Sub-pictures for pixel rate balancing
US11350117B2 (en) 2012-02-02 2022-05-31 Texas Instruments Incorporated Sub-pictures for pixel rate balancing on multicore platforms
US11758163B2 (en) 2012-02-02 2023-09-12 Texas Instruments Incorporated Sub-pictures for pixel rate balancing on multi-core platforms
US10681364B2 (en) * 2012-02-04 2020-06-09 Lg Electronics Inc. Video encoding method, video decoding method, and device using same
US12120332B2 (en) * 2012-02-04 2024-10-15 Lg Electronics Inc. Video encoding method, video decoding method, and device using same
US11778212B2 (en) * 2012-02-04 2023-10-03 Lg Electronics Inc. Video encoding method, video decoding method, and device using same
US9106930B2 (en) * 2012-02-04 2015-08-11 Lg Electronics Inc. Video encoding method, video decoding method, and device using same
US10091520B2 (en) * 2012-02-04 2018-10-02 Lg Electronics Inc. Video encoding method, video decoding method, and device using same
US20220053203A1 (en) * 2012-02-04 2022-02-17 Lg Electronics Inc. Video encoding method, video decoding method, and device using same
US20140341306A1 (en) * 2012-02-04 2014-11-20 Lg Electronics Inc. Video encoding method, video decoding method, and device using same
US9635386B2 (en) 2012-02-04 2017-04-25 Lg Electronics Inc. Video encoding method, video decoding method, and device using same
US11218713B2 (en) * 2012-02-04 2022-01-04 Lg Electronics Inc. Video encoding method, video decoding method, and device using same
US20220264127A1 (en) * 2012-04-13 2022-08-18 Ge Video Compression, Llc Low delay picture coding
US12192492B2 (en) * 2012-04-13 2025-01-07 Ge Video Compression, Llc Low delay picture coding
AU2022201459B2 (en) * 2012-04-13 2022-08-18 Dolby Video Compression, Llc Low delay picture coding
US11343517B2 (en) * 2012-04-13 2022-05-24 Ge Video Compression, Llc Low delay picture coding
US11259034B2 (en) 2012-04-13 2022-02-22 Ge Video Compression, Llc Scalable data stream and network entity
TWI752680B (en) * 2012-04-13 2022-01-11 美商Ge影像壓縮有限公司 Decoder and method for reconstructing a picture from a datastream, encoder and method for coding a picture into a datastream, and related computer program and machine accessible medium
US11122278B2 (en) * 2012-04-13 2021-09-14 Ge Video Compression, Llc Low delay picture coding
US12495150B2 (en) 2012-04-13 2025-12-09 Dolby Video Compression, Llc Scalable data stream and network entity
AU2021201682B2 (en) * 2012-04-13 2022-07-07 Dolby Video Compression, Llc Low delay picture coding
AU2019202551B2 (en) * 2012-04-13 2020-12-17 Dolby Video Compression, Llc Low delay picture coding
TWI711298B (en) * 2012-04-13 2020-11-21 美商Ge影像壓縮有限公司 Decoder and method for reconstructing a picture from a datastream, encoder and method for coding a picture into a datastream, and related computer program and machine accessible medium
TWI586179B (en) * 2012-04-13 2017-06-01 Ge影像壓縮有限公司 Decoder and method for reconstructing images from data stream, encoder and method for encoding images into data stream, and related computer programs and machine-accessible media
US10694198B2 (en) 2012-04-13 2020-06-23 Ge Video Compression, Llc Scalable data stream and network entity
US10674164B2 (en) * 2012-04-13 2020-06-02 Ge Video Compression, Llc Low delay picture coding
TWI860819B (en) * 2012-04-13 2024-11-01 美商Ge影像壓縮有限公司 Decoder and method for decoding information from a datastream to reconstruct a picture, encoder for encoding a picture into a datastream, and related non-transitory computer-readable medium
TWI816249B (en) * 2012-04-13 2023-09-21 美商Ge影像壓縮有限公司 Decoder and method for reconstructing a picture from a datastream, encoder and method for coding a picture into a datastream, and related computer program and machine accessible medium
US10045017B2 (en) 2012-04-13 2018-08-07 Ge Video Compression, Llc Scalable data stream and network entity
US11876985B2 (en) 2012-04-13 2024-01-16 Ge Video Compression, Llc Scalable data stream and network entity
TWI634794B (en) * 2012-04-13 2018-09-01 Ge影像壓縮有限公司 Decoder and method for reconstructing images from data stream, encoder and method for encoding images into data stream, and related computer programs and machine-accessible media
US20150023409A1 (en) * 2012-04-13 2015-01-22 Fraunhofer-Gesellschaft Zur Foerderung Der Angewandten Forschung E.V. Low delay picture coding
US10123006B2 (en) * 2012-04-13 2018-11-06 Ge Video Compression, Llc Low delay picture coding
US20190045201A1 (en) * 2012-04-13 2019-02-07 Ge Video Compression, Llc Low delay picture coding
US11856229B2 (en) 2012-06-29 2023-12-26 Ge Video Compression, Llc Video data stream concept
US11025958B2 (en) 2012-06-29 2021-06-01 Ge Video Compression, Llc Video data stream concept
US10484716B2 (en) 2012-06-29 2019-11-19 Ge Video Compression, Llc Video data stream concept
US9973781B2 (en) 2012-06-29 2018-05-15 Ge Video Compression, Llc Video data stream concept
US11956472B2 (en) 2012-06-29 2024-04-09 Ge Video Compression, Llc Video data stream concept
US10743030B2 (en) 2012-06-29 2020-08-11 Ge Video Compression, Llc Video data stream concept
US9491461B2 (en) * 2012-09-27 2016-11-08 Qualcomm Incorporated Scalable extensions to HEVC and temporal motion vector prediction
US20140086325A1 (en) * 2012-09-27 2014-03-27 Qualcomm Incorporated Scalable extensions to hevc and temporal motion vector prediction
US20150334418A1 (en) * 2012-12-27 2015-11-19 Nippon Telegraph And Telephone Corporation Image encoding method, image decoding method, image encoding apparatus, image decoding apparatus, image encoding program, and image decoding program
US9924197B2 (en) * 2012-12-27 2018-03-20 Nippon Telegraph And Telephone Corporation Image encoding method, image decoding method, image encoding apparatus, image decoding apparatus, image encoding program, and image decoding program
WO2014107065A1 (en) * 2013-01-04 2014-07-10 삼성전자 주식회사 Method for entropy-encoding slice segment and apparatus therefor, and method for entropy-decoding slice segment and apparatus therefor
RU2609750C1 (en) * 2013-01-04 2017-02-02 Самсунг Электроникс Ко., Лтд. Method of entropy encoding slice segment and device therefor and method of entropy decoding segment slice and device therefor
US9826253B2 (en) 2013-01-04 2017-11-21 Samsung Electronics Co., Ltd. Method for entropy-encoding slice segment and apparatus therefor, and method for entropy-decoding slice segment and apparatus therefor
US9826254B2 (en) 2013-01-04 2017-11-21 Samsung Electronics Co., Ltd. Method for entropy-encoding slice segment and apparatus therefor, and method for entropy-decoding slice segment and apparatus therefor
AU2014204151B2 (en) * 2013-01-04 2016-11-03 Samsung Electronics Co., Ltd. Method for entropy-encoding slice segment and apparatus therefor, and method for entropy-decoding slice segment and apparatus therefor
US9866873B2 (en) 2013-01-04 2018-01-09 Samsung Electronics Co., Ltd. Method for entropy-encoding slice segment and apparatus therefor, and method for entropy-decoding slice segment and apparatus therefor
US10271071B2 (en) 2013-01-04 2019-04-23 Samsung Electronics Co., Ltd. Method for entropy-encoding slice segment and apparatus therefor, and method for entropy-decoding slice segment and apparatus therefor
CN105357525A (en) * 2013-01-04 2016-02-24 三星电子株式会社 Video decoding method and device
US9877049B2 (en) 2013-01-04 2018-01-23 Samsung Electronics Co., Ltd. Method for entropy-encoding slice segment and apparatus therefor, and method for entropy-decoding slice segment and apparatus therefor
RU2608263C1 (en) * 2013-01-04 2017-01-17 Самсунг Электроникс Ко., Лтд. Method of entropy encoding slice segment and device therefor and method of entropy decoding segment slice and device therefor
RU2609749C1 (en) * 2013-01-04 2017-02-02 Самсунг Электроникс Ко., Лтд. Method of entropy encoding slice segment and device therefor and method of entropy decoding segment slice and device therefor
US9866874B2 (en) 2013-01-04 2018-01-09 Samsung Electronics Co., Ltd. Method for entropy-encoding slice segment and apparatus therefor, and method for entropy-decoding slice segment and apparatus therefor
RU2608353C1 (en) * 2013-01-04 2017-01-18 Самсунг Электроникс Ко., Лтд. Method for entropy coding slice segment and device therefor, and method for entropy decoding slice segment and device therefor
RU2608465C1 (en) * 2013-01-04 2017-01-18 Самсунг Электроникс Ко., Лтд. Method of entropy encoding slice segment and device therefor and method of entropy decoding segment slice and device therefor
US10306238B2 (en) * 2013-04-16 2019-05-28 Fastvdo Llc Adaptive coding, transmission and efficient display of multimedia (ACTED)
US9549221B2 (en) * 2013-12-26 2017-01-17 Sony Corporation Signal switching apparatus and method for controlling operation thereof
US9788078B2 (en) * 2014-03-25 2017-10-10 Samsung Electronics Co., Ltd. Enhanced distortion signaling for MMT assets and ISOBMFF with improved MMT QoS descriptor having multiple QoE operating points
US9413387B2 (en) * 2014-09-19 2016-08-09 Imagination Technologies Limited Data compression using entropy encoding
US10455244B2 (en) * 2014-11-14 2019-10-22 Lg Electronics Inc. Method and device for entropy encoding or entropy decoding video signal for high-capacity parallel processing
US9729885B2 (en) * 2015-02-11 2017-08-08 Futurewei Technologies, Inc. Apparatus and method for compressing color index map
US20160234501A1 (en) * 2015-02-11 2016-08-11 Futurewei Technologies, Inc. Apparatus and Method for Compressing Color Index Map
US10049427B1 (en) * 2017-08-15 2018-08-14 Apple Inc. Image data high throughput predictive compression systems and methods
US12267618B2 (en) * 2020-06-23 2025-04-01 Huawei Technologies Co., Ltd. Video transmission method, apparatus, and system
US20250159103A1 (en) * 2023-11-10 2025-05-15 Canon Kabushiki Kaisha Communication apparatus, method of controlling communication apparatus, and storage medium

Also Published As

Publication number Publication date
US7932843B2 (en) 2011-04-26
US8068043B2 (en) 2011-11-29
US20100097250A1 (en) 2010-04-22
US20100097248A1 (en) 2010-04-22

Similar Documents

Publication Publication Date Title
US11843794B2 (en) CABAC decoder with decoupled arithmetic decoding and inverse binarization
US20100098155A1 (en) Parallel CABAC Decoding Using Entropy Slices
US20230421808A1 (en) Line-based compression for digital image data
US12375694B2 (en) Rate control in video coding
US8588536B2 (en) Guaranteed-rate tiled image data compression
US8615043B2 (en) Fixed length coding based image data compression
US8160136B2 (en) Probabilistic bit-rate and rate-distortion cost estimation for video coding
US8885714B2 (en) Method and system for intracoding in video encoding
US9083984B2 (en) Adaptive coding structure and adaptive FCode determination in video coding
US9161058B2 (en) Method and system for detecting global brightness change for weighted prediction in video encoding
US20110255597A1 (en) Method and System for Reducing Flicker Artifacts
US10291913B2 (en) Entropy encoder, video encoder including the same and electronic system including the same
US20100118948A1 (en) Method and apparatus for video processing using macroblock mode refinement
US8363722B2 (en) Method and apparatus for hierarchical bi-directional intra-prediction in a video encoder
US20110142135A1 (en) Adaptive Use of Quarter-Pel Motion Compensation

Legal Events

Date Code Title Description
AS Assignment

Owner name: TEXAS INSTRUMENTS INCORPORATED, TEXAS

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:DEMIRCIN, MEHMET UMUT;BUDAGAVI, MADHUKAR;REEL/FRAME:023322/0213

Effective date: 20091002

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION