US20070147517A1 - Video processing system capable of error resilience and video processing method for same - Google Patents
- Publication number
- US20070147517A1 (application US11/464,845)
- Authority
- US
- United States
- Prior art keywords
- frame
- time
- decoded
- decoding
- displaying
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11B—INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
- G11B20/00—Signal processing not specific to the method of recording or reproducing; Circuits therefor
- G11B20/10—Digital recording or reproducing
- G11B20/10527—Audio or video recording; Data buffering arrangements
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11B—INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
- G11B20/00—Signal processing not specific to the method of recording or reproducing; Circuits therefor
- G11B20/10—Digital recording or reproducing
- G11B20/18—Error detection or correction; Testing, e.g. of drop-outs
- G11B20/1833—Error detection or correction; Testing, e.g. of drop-outs by adding special lists or symbols to the coded information
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/10—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
- H04N19/102—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the element, parameter or selection affected or controlled by the adaptive coding
- H04N19/132—Sampling, masking or truncation of coding units, e.g. adaptive resampling, frame skipping, frame interpolation or high-frequency transform coefficient masking
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/10—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
- H04N19/134—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the element, parameter or criterion affecting or controlling the adaptive coding
- H04N19/156—Availability of hardware or computational resources, e.g. encoding based on power-saving criteria
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/10—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
- H04N19/169—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding
- H04N19/17—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding the unit being an image region, e.g. an object
- H04N19/172—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding the unit being an image region, e.g. an object the region being a picture, frame or field
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/50—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding
- H04N19/587—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding involving temporal sub-sampling or interpolation, e.g. decimation or subsequent interpolation of pictures in a video sequence
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/60—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using transform coding
- H04N19/61—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using transform coding in combination with predictive coding
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11B—INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
- G11B20/00—Signal processing not specific to the method of recording or reproducing; Circuits therefor
- G11B20/10—Digital recording or reproducing
- G11B20/10527—Audio or video recording; Data buffering arrangements
- G11B2020/1062—Data buffering arrangements, e.g. recording or playback buffers
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11B—INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
- G11B2220/00—Record carriers by type
- G11B2220/20—Disc-shaped record carriers
- G11B2220/25—Disc-shaped record carriers characterised in that the disc is based on a specific recording technology
- G11B2220/2537—Optical discs
- G11B2220/2541—Blu-ray discs; Blue laser DVR discs
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11B—INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
- G11B7/00—Recording or reproducing by optical means, e.g. recording using a thermal beam of optical radiation by modifying optical properties or the physical structure, reproducing using an optical beam at lower power by sensing optical properties; Record carriers therefor
- G11B7/004—Recording, reproducing or erasing methods; Read, write or erase circuits therefor
- G11B7/0045—Recording
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11B—INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
- G11B7/00—Recording or reproducing by optical means, e.g. recording using a thermal beam of optical radiation by modifying optical properties or the physical structure, reproducing using an optical beam at lower power by sensing optical properties; Record carriers therefor
- G11B7/004—Recording, reproducing or erasing methods; Read, write or erase circuits therefor
- G11B7/005—Reproducing
Definitions
- the present invention relates to video processing, and in particular to a video processing system capable of error resilience and a video processing method for same.
- MPEG-4 is an ISO/IEC standard developed by the MPEG (Moving Picture Experts Group), the committee that also developed the Emmy Award winning standards MPEG-1 and MPEG-2.
- MPEG-4 is the successor to MPEG-1 and MPEG-2 video standards.
- the ISO standards committee implemented this standard in 1998. Instead of the current frame-based video technology, MPEG-4 has adopted the object-oriented concept.
- the object-oriented concept integrates existing multimedia technologies, such as 2D and 3D graphics, animation, video codecs, multimedia streaming, interactivity, and programmatic environments into a single architecture.
- Video processing based on MPEG-4 format images is described in the following.
- a video stream is a sequence of video frames. Each frame is a still image. A video player displays individual frames sequentially, typically at a rate close to 30 frames per second. Frames are divided into 16 ⁇ 16 pixel macroblocks (not shown). Each macroblock consists of four 8 ⁇ 8 luminance blocks and two 8 ⁇ 8 chrominance blocks (1 U and 1 V) (not shown). Macroblocks are units for measuring motion-compensated compression. Blocks are used for DCT compression.
- Video data complying with the MPEG-4 format is composed of three different types of frames: intra-frames (I-frames), forward predicted frames (P-frames), and bidirectional predicted frames (B-frames).
- an I-frame is encoded as a single image, with no reference to any past or future frames: the various lossless and lossy compression techniques are performed only on information contained within the current frame, and not relative to any other frame in the video sequence. In other words, no temporal processing is performed outside the current picture or frame.
- a P-frame is encoded relative to the past reference frame.
- a reference frame is a P- or an I-frame. The past reference frame is the closest preceding reference frame.
- Each macroblock in a P-frame can be encoded either as an Intra-macroblock or as an Inter-macroblock.
- intra-macroblocks are encoded in the same manner as I-frame macroblocks.
- a B-frame is encoded relative to the past reference frame, the future reference frame, or both frames.
- the future reference frame is the closest following reference frame (I or P).
- the encoding for B-frames is similar to P-frames, except that motion vectors may refer to areas in the future reference frames. For macroblocks that use both past and future reference frames, the two 16 ⁇ 16 areas are averaged.
- a video stream is a sequence of video frames.
- An input encoded sequence of video frames, for example, is represented as “I(0) P(3) B(1) B(2) P(6) B(4) B(5) I(9) B(7) B(8) P(12) B(10) B(11)” (the bracketed number is the display order of each frame).
- the output decoded sequence of the video frames is then represented as “I(0) B(1) B(2) P(3) B(4) B(5) P(6) B(7) B(8) I(9) B(10) B(11) P(12)”.
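The reordering between the two sequences above can be illustrated with a minimal Python sketch (the frame labels and the `to_display_order` helper are illustrative conventions of this example, not part of the disclosed system): each frame is sorted by the display index embedded in its label.

```python
# Encoded (decode-order) sequence from the example above; the digits in
# each label are that frame's display index.
decode_order = ["I0", "P3", "B1", "B2", "P6", "B4", "B5",
                "I9", "B7", "B8", "P12", "B10", "B11"]

def to_display_order(frames):
    """Recover the display-order sequence by sorting on display index."""
    return sorted(frames, key=lambda f: int(f[1:]))

print(to_display_order(decode_order))
# ['I0', 'B1', 'B2', 'P3', 'B4', 'B5', 'P6', 'B7', 'B8', 'I9', 'B10', 'B11', 'P12']
```

This matches the decoded output sequence: B-frames are transmitted after the later reference frame they depend on, but are shown before it.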
- Video data can be further composed of combined frames, each comprising both a P-frame and a B-frame or an I-frame and a B-frame, and empty frames (Pe).
- FIG. 1 shows a schematic diagram of a combined frame 100 (regarded as one frame to be introduced in the video data) comprising a P-frame and a B-frame; the encoding process for each component frame is identical to that of an independent frame.
- Pe frames indicate predicted frames not comprising video data.
- FIG. 2 is a schematic view of a video system.
- Video system 100 comprises storage medium 110, a file system 120, a file parser 130, a video decoder 140, a first Post-processing unit 145, an audio decoder 150, a second Post-processing unit 155, an AV synchronization unit 160, and a display system 170.
- Display system 170 comprises a multiple display function.
- AV files are first stored in storage medium 110 and file system 120 accesses AV files therefrom.
- file parser 130 parses the AV files to retrieve video and audio signals and passes the signals to video decoder 140 and audio decoder 150 respectively.
- Video decoder 140 and audio decoder 150 decode the video and audio signals, and Post-processing units 145 and 155 implement post-processing operations, such as deblocking or deringing, to adjust the quality of the video and audio signals respectively.
- the adjusted signals are transmitted to AV synchronization unit 160 for signal synchronization and are output to display system 170 to display the AV files using first display device 171 or both first display device 171 and second display device 173 .
- Playback quality is limited in that video files may not play smoothly in a video system with restricted memory resources, central processing unit (CPU) speed, or system architecture.
- Resource limitations are described in the following.
- High bit rate (bits per second, bps) files must be accessed by file systems or storage media of adequate operating speed.
- Higher frame rate or a fast forward function for video decoding requires relatively better hardware performance.
- Post-processing, such as deblocking and deringing methods, can improve video quality but employs more system resources. Additionally, greater memory bandwidth is required in liquid crystal displays and TVs if a larger frame size is requested. As described, higher frame rates, larger frame sizes, higher bit rates, increased post-processing, or more display elements require high-grade systems for high performance. Limitations of embedded systems include blurry video images and failure of audio signal playback.
- a video processing method for AV synchronization resilience is provided.
- a first frame is decoded and displayed while a second frame is decoded for a predetermined period of time. It is determined whether the decoding time for the second frame exceeds a defined duration. If so, the first frame is continuously displayed during the decoding time for the second frame, and the second frame is displayed while a third frame is decoded for a predetermined period of time.
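The rule above can be sketched in a few lines of Python (a hedged illustration only; the function name, the list-based interface, and the 30 fps figure are our assumptions, not claim language): each displayed frame stays on screen for the ideal duration, extended whenever decoding the next frame runs long.

```python
FRAME_MS = 33.33  # assumed ideal per-frame duration at 30 frames per second

def display_durations(next_decode_ms, frame_ms=FRAME_MS):
    """For each frame, how long its predecessor stays on screen: the
    ideal duration, extended when decoding the next frame exceeds it."""
    return [max(t, frame_ms) for t in next_decode_ms]

print(display_durations([43, 32, 20]))  # [43, 33.33, 33.33]
```

With decode times of 43, 32, and 20 ms, only the first frame's predecessor is held past its slot, matching the method's "continue displaying the first frame" step.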
- a video processing method for dynamic frame dropping is provided.
- a first frame is decoded and displayed while a second frame is decoded. It is determined whether the display time for the first frame exceeds a defined duration. If the display time for the first frame exceeds the defined duration, the second frame is dropped, and the first frame is continuously displayed while a third frame is decoded for a predetermined period of time.
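A minimal sketch of this dropping rule (our own helper, assuming the 33.33 ms defined duration; the patent does not specify an implementation): when frame i's actual display time exceeds the defined duration, frame i+1 is dropped.

```python
def frames_to_drop(display_ms, frame_ms=33.33):
    """Indices of frames to drop: whenever frame i's display time
    exceeds the defined duration, the following frame (i+1) is dropped."""
    return [i + 1 for i, t in enumerate(display_ms) if t > frame_ms]

print(frames_to_drop([60, 30, 30]))  # [1]
```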
- a video processing method for dynamic frame dropping is provided.
- a plurality of frames is provided. It is determined whether a portion of the frames is selectively ignored, and, if so, the frames are displayed at a predetermined number of frame intervals.
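Displaying frames "at a predetermined number of frame intervals" amounts to keeping every Nth frame; a one-line Python sketch (the helper name and slicing approach are ours):

```python
def keep_at_intervals(frames, interval):
    """Keep only every `interval`-th frame; the rest are ignored."""
    return frames[::interval]

print(keep_at_intervals(list(range(9)), 3))  # [0, 3, 6]
```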
- a video processing method for auto pause/resume, applied to a file system/storage medium is provided.
- a first frame is decoded and displayed for a predetermined period of time while a second frame is decoded. It is determined whether the access speed of the file system/storage medium is adequate for processing the second frame. If the access speed is not adequate to process the second frame, an auto pause function is enabled for a predetermined period of time and audio signals are disabled while the second frame is ignored and a third frame is decoded. The auto pause function is then disabled, and an auto resume function and the audio signals are enabled.
- a video processing method capable of reducing system resources utilized for fast forward is provided. It is determined whether AV files are played at N-multiple speed. If so, audio signals of the AV files are disabled, video signals of the AV files are played at the N-multiple speed, post-processing is disabled, and negligible frames of the AV files are not displayed by a display system.
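The fast-forward settings enumerated above can be summarized as a configuration sketch (our own dictionary keys, purely illustrative): at N > 1, audio and post-processing are off and negligible frames are skipped.

```python
def fast_forward_settings(n):
    """Playback settings when AV files are played at N multiples:
    for N > 1, disable audio and post-processing and skip negligible
    frames; for normal speed, keep everything enabled."""
    ff = n > 1
    return {"audio": not ff,
            "post_processing": not ff,
            "display_negligible_frames": not ff}

print(fast_forward_settings(4))
```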
- a video processing method for frame skipping for fast forward is provided.
- Post-processing for frames of AV files is disabled.
- a first frame is decoded and displayed while a second frame is decoded for a predetermined period of time. It is determined whether decoding time for the second frame exceeds a defined duration. If the decoding time for the second frame exceeds the defined duration, the first frame is continuously displayed during the decoding time for the second frame, and the second frame is not displayed while a third frame is decoded for a predetermined period of time.
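As a hedged sketch of the skipping criterion (helper name and the 16.66 ms fast-forward slot are our assumptions drawn from the FIG. 12 example): a frame whose decoding time exceeds its fast-forward slot is not displayed, and its predecessor stays on screen.

```python
def shown_frames(decode_ms, frame_ms=16.66):
    """Indices of frames actually displayed during fast forward:
    frames whose decode time fits the fast-forward slot are shown,
    the rest are skipped."""
    return [i for i, t in enumerate(decode_ms) if t <= frame_ms]

print(shown_frames([10, 20, 8, 10, 28]))  # [0, 2, 3]
```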
- FIG. 1 is a schematic view of a combined frame comprising a P-frame and a B-frame;
- FIG. 2 is a schematic view of a video system
- FIG. 3 is a workflow of an embodiment of a video processing method for AV synchronization resilience
- FIG. 4 is a flowchart of an embodiment of a video processing method for AV synchronization resilience
- FIG. 5 is a workflow of an embodiment of a video processing method for dynamic frame dropping for a single display device
- FIG. 6 is a workflow of an embodiment of a video processing method for dynamic frame dropping for at least one first display device and one second display device;
- FIG. 7 is a flowchart of an embodiment of a video processing method for dynamic frame dropping
- FIG. 8 is a workflow of an embodiment of a video processing method for an auto pause/resume function having insufficient file system/storage media access speed
- FIG. 9 is a flowchart of an embodiment of a video processing method for an auto pause/resume function having insufficient file system/storage media access speed;
- FIG. 10 is a workflow of an embodiment of a video processing method for an auto pause/resume function with insufficient decoding capability
- FIG. 11 is a flowchart of an embodiment of a video processing method for an auto pause/resume function with insufficient decoding capability
- FIG. 12 is a workflow of an embodiment of a video processing method for reducing system resources utilized for fast forward
- FIG. 13 is a flowchart of an embodiment of a video processing method for reducing system resources utilized for fast forward function.
- FIG. 14 is a flowchart of an embodiment of a video processing method for frame skipping based on the fast forward shown in FIG. 13 .
- FIGS. 3 through 14 generally relate to video processing. It is to be understood that the following disclosure provides many different embodiments as examples, for implementing different features of the invention. Specific examples of components and arrangements are described below to simplify the present disclosure. These are, of course, merely examples and are not intended to be limiting. In addition, the present disclosure may repeat reference numerals and/or letters in the various examples. This repetition is for the purpose of simplicity and clarity and does not in itself dictate a relationship between the various embodiments and/or configurations discussed.
- the invention discloses a video processing system capable of error resilience and a video processing method for same.
- the video processing method comprises features of AV synchronization resilience, adaptive Post-processing, dynamic frame dropping, auto pause/resume, and fast forward function. Individual examples are described in the following.
- FIG. 3 is a workflow of an embodiment of a video processing method for AV synchronization resilience.
- AV synchronization resilience is frequently required, especially for high frame rate display.
- Section 210 represents ideal video display, where the ideal duration for each frame equals 33.33 milliseconds (ms), video files are encoded at a rate of 30 frames per second, and the decoding time for each frame is less than 33.33 ms.
- the display duration for each frame is not considered.
- Section 220 represents an asynchronization solution for audio and video signals.
- Frame 1 is decoded while Frame 0 is decoded and displayed.
- the decoding of Frame 1 requires 43 ms, which is longer than the frame duration of 33.33 ms.
- the first asynchronization of audio and video signals is then detected during a predetermined time period (a first asynchronization time period) shown by the first star symbol.
- display of Frame 0 continues until Frame 1 is completely decoded while audio signals corresponding to Frame 1 are output.
- Frame 1 is displayed and Frame 2 is decoded in the following 32 ms.
- the second asynchronization of audio and video signals is detected because the accumulated decoding time still lags behind the ideal schedule, during a second asynchronization time period, as shown by the second star symbol.
- Frame 1 is displayed until Frame 2 is completely decoded while audio signals corresponding to Frame 2 are output.
- Frame 2 is displayed and Frame 3 is decoded in 20 ms.
- completely decoded Frame 3 cannot be immediately displayed since the ideal duration for a frame has not passed, such that display of Frame 2 continues.
- Frame 3 is displayed and Frame 4 is decoded in 25 ms.
- completely decoded Frame 4 cannot be displayed since the ideal duration for a frame has not passed, such that display of Frame 3 continues.
- Frame 4 is displayed and Frame 5 is decoded, and the process repeats for each frame until completion.
- AV asynchronization will thus be frequently detected, and the audio and video signals of AV files can eventually be synchronized despite discrepant decoding times.
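The FIG. 3 walkthrough above can be replayed numerically (a minimal sketch under one plausible reading of the figure: asynchronization is flagged whenever cumulative decode time falls behind the ideal 30 fps schedule; the function and the reading are our assumptions):

```python
def asynchronization_flags(decode_ms, frame_ms=33.33):
    """Flag each frame whose cumulative decoding time has slipped
    behind the ideal schedule (frame i should be ready by i*frame_ms)."""
    elapsed, flags = 0.0, []
    for i, t in enumerate(decode_ms, start=1):
        elapsed += t
        flags.append(elapsed > i * frame_ms)
    return flags

# Decode times for Frames 1-4 from the walkthrough: 43, 32, 20, 25 ms.
print(asynchronization_flags([43, 32, 20, 25]))  # [True, True, False, False]
```

This reproduces the two star symbols (Frames 1 and 2 behind schedule) followed by the catch-up at Frames 3 and 4, where decoding finishes early and display waits for the ideal duration.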
- FIG. 4 is a flowchart of an embodiment of a video processing method for AV synchronization resilience.
- It is determined whether the decoding time for frame N+i (i≧1) (for example, Frame 1) exceeds a defined duration (33.33 ms, for example) (step S12).
- Frames may be dropped when they cannot be timely displayed.
- the following situations may result in frame dropping, comprising higher frame rate, multiple display devices, excessive post processes, and larger display frame size.
- For a higher frame rate, some frames are dropped to reduce the frame rate.
- For multiple display devices, insignificant frames displayed on a secondary display device can be dropped to reduce the frame rate, or frames are simply not output to the secondary display device. Since frames, for example, are output to both a TV and an LCD, the frame rate of frames output to the LCD can be reduced to save system resources. Additionally, frame defects may appear if Post-processing is disabled, such that frame dropping is implemented to solve the problem.
- FIG. 5 is a workflow of an embodiment of a video processing method for dynamic frame dropping with a single display device.
- Section 310 represents ideal audio play along with ideal video display, where the ideal duration for displaying each frame equals 33.33 ms, video files are encoded at 30 frames per second, and the decoding time for each frame is less than 33.33 ms.
- a display duration for each frame is not considered.
- each frame can be timely decoded and displayed within each time interval of 33.33 ms.
- Section 320 represents the method of dynamic frame dropping with a single display device of the present invention. As shown in Section 320, between each of the time points '0', '1', '2', '3' and '4', Frame 1, Frame 2, Frame 3, Frame 4 and Frame 5 are sequentially decoded within each time interval of 33.33 ms.
- the time points 'A', 'B', 'C', 'D' and 'E' show the actual display durations of the frames. Some of the frames may take more than 33.33 ms to display.
- Frame 0 is displayed between time points A and B for 60 ms, and Frame 1 is decoded between time point '0' and time point '1' in less than 33.33 ms.
- display of frame 0 takes longer than the ideal duration (33.33 ms).
- Frame 1 will therefore be dropped and not displayed.
- Frame 2 will be displayed next. That is, in this embodiment, as long as the previous frame is still being displayed after one preset time interval (for example, 33.33 ms), at least one subsequent frame will be dropped.
- FIG. 6 is a workflow of an embodiment of a video processing method for dynamic frame dropping for multiple display devices.
- Section 410 represents ideal audio play along with ideal video display, where the ideal duration for displaying each frame equals 33.33 ms, video files are encoded at 30 frames per second, and the decoding time for each frame is less than 33.33 ms.
- a display duration of each frame is not considered.
- this embodiment provides a first display device and a second display device.
- the first display device is used for displaying important data source whereas the second display device is used for displaying less important data source.
- For the important data source, no frames are dropped even if the previous frame takes more than one time interval to display; for the less important data source, some frames may be dropped for efficiency.
- Section 420 represents AV file display on the first display device and section 430 represents AV file display on the second display device, which allows dynamic frame dropping.
- Frame 0 is displayed between time point '0' and time point '3' on the second display device, during which Frames 1 and 2 are decoded but not displayed. After Frame 0 finishes its display, the second display device continues with Frame 3. The process repeats for each frame until completion.
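The secondary-display behavior can be sketched as a small scheduler (an illustrative model only; the function, its `hold_slots` mapping, and the one-slot default are our assumptions): a held frame occupies several 33.33 ms slots, and frames whose slots elapse meanwhile are decoded but dropped.

```python
def secondary_display_schedule(n_frames, hold_slots):
    """Which frames the secondary display actually shows.  hold_slots
    maps a frame index to the number of 33.33 ms slots it occupies once
    shown (default 1); frames whose slots pass while another frame is
    held are decoded but dropped."""
    shown, slot = [], 0
    while slot < n_frames:
        shown.append(slot)
        slot += hold_slots.get(slot, 1)
    return shown

# Frame 0 held for three slots, as in FIG. 6: Frames 1 and 2 dropped.
print(secondary_display_schedule(6, {0: 3}))  # [0, 3, 4, 5]
```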
- FIG. 7 is a flowchart of an embodiment of a video processing method for dynamic frame dropping.
- Post-processing, such as deblocking and deringing methods, can further improve video quality but may utilize more system resources. Thus, for speed concerns, the present invention may turn off post-processing operations to save system resources. Cooperating with AV synchronization resilience, Post-processing, for example, may be disabled when video signals of a current frame are asynchronous with audio signals.
- an auto pause/resume function may be automatically implemented. If file system and CPU access speed, for example, is too slow to process high-bit-rate files, or large AV asynchronization is detected, the auto pause/resume function is automatically implemented to pause file processing for a period of time and then resume with video and audio signals synchronized.
- FIG. 8 is a workflow of an embodiment of a video processing method for an auto pause/resume function having insufficient file system/storage media access speed.
- Section 510 represents ideal video display (30 frames per second), where each frame is completely decoded in 33.33 ms.
- Section 520 represents an auto pause/resume process for insufficient access speed.
- Frame 1 is decoded while Frame 0 is decoded and displayed, and Frame 1 is displayed while Frame 2 is decoded.
- the file system/storage medium access speed for processing AV files with a high frame rate is inadequate; hence, Frames 0, 1 and 2 are decoded and displayed, but the file system/storage medium cannot prepare enough bitstream data (assuming the file system speed is low) to decode Frame 3.
- an auto pause mode is enabled, as shown by the star symbol, in which audio signals are stopped and bitstream data is re-prepared into the bitstream buffer, starting from Frame 3, through the file system/storage medium, until the buffer holds the bitstream data of many frames (assume this takes 50 ms). The video then resumes with Frame 3 being decoded and displayed.
- AV synchronization is enabled from Frame 3 (Frame 2 is displayed while the bitstream buffer is being prepared, about 50 ms in this example), then Frame 3 is displayed and Frame 4 is decoded, Frame 4 is displayed and Frame 5 is decoded, and the process repeats for each frame until completion.
- FIG. 9 is a flowchart of an embodiment of a video processing method for auto pause/resume having insufficient file system/storage media access speed.
- a first frame (Frame 1, for example) is decoded and displayed for a predetermined period of time while a second frame (Frame 2, for example) is decoded (step S31). It is determined whether the access speed of the file system/storage medium is adequate (i.e., whether bitstream data is in the buffer) to process a third frame (Frame 3, for example) (step S32).
- If not adequate, an auto pause function is enabled and audio is disabled (step S33), the bitstream data is re-prepared until the bitstream buffer is full and the third frame (Frame 3, for example) is decoded (step S34), the auto pause function is disabled (step S35), and an auto resume function and audio are enabled (step S36). Otherwise, the second frame (Frame 2, for example) is displayed and the third frame (Frame 3, for example) is decoded (step S37), and the process repeats for each frame until completion.
- FIG. 10 is a workflow of an embodiment of a video processing method for auto pause/resume with insufficient decoding capability.
- Section 610 represents ideal video display (30 frames per second), where each frame is completely decoded in 33.33 ms.
- Section 620 represents an auto pause/resume process for inadequate decoding capability, in which the decoding and display time for each frame is not considered and the acceptable asynchronization time is less than 30 ms.
- Frame 1 is decoded in 59 ms while Frame 0 is decoded and displayed, resulting in AV asynchronization. Compared to the duration of Frames 0 and 1 shown in Section 610, the asynchronization time (about 25.67 ms) is less than 30 ms and therefore acceptable, such that Frame 1 is then displayed and Frame 2 is decoded in 46 ms.
- Another AV asynchronization is then detected due to the decoding time exceeding the duration of Frames 0 and 1 shown in Section 610. Compared to the duration of Frames 1 and 2 shown in Section 610, the asynchronization time (about 38.34 ms) is greater than 30 ms and thus unacceptable, such that an auto pause mode is enabled, as shown by the star symbol, in which audio signals are stopped and the file system/storage medium reads and stores bitstream data from Frame 3 into a bitstream buffer. The pause mode continues until the bitstream buffer is full (50 ms, for example), whereupon the pause mode is disabled and an auto resume mode is enabled.
- AV synchronization is enabled from Frame 3 such that Frame 2 is ignored and Frame 3 is displayed and Frame 4 is decoded, and Frame 4 is displayed and Frame 5 is decoded, and the process repeats for each frame until completion.
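The 30 ms threshold test in this walkthrough can be sketched as follows (our own helper under the assumption that asynchronization accumulates as decode time beyond the ideal 33.33 ms duration, which reproduces the 25.67 ms and 38.34 ms figures above):

```python
def should_auto_pause(decode_ms, frame_ms=33.33, threshold_ms=30):
    """Accumulate asynchronization time (decode time beyond the ideal
    duration) and trigger auto pause once it exceeds the threshold."""
    drift = 0.0
    for t in decode_ms:
        drift += max(0.0, t - frame_ms)
        if drift > threshold_ms:
            return True
    return False

print(should_auto_pause([59]))       # False: drift ~25.67 ms, acceptable
print(should_auto_pause([59, 46]))   # True: drift ~38.34 ms, auto pause
```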
- FIG. 11 is a flowchart of an embodiment of a video processing method for auto pause/resume with insufficient decoding capability.
- a first frame (Frame 0 , for example) is decoded and displayed for a predetermined period of time while a second frame (Frame 1 , for example) is decoded (step S 41 ). It is determined whether AV asynchronization is detected and if an asynchronization time is greater than a threshold value (step S 42 ).
- If so, an auto pause function is enabled for a predetermined period of time and audio is disabled (step S43) while the second frame is ignored and a third frame (Frame 2, for example) is decoded (step S44), the auto pause function is disabled (step S45), and an auto resume function and audio are enabled (step S46). If not, the second frame (Frame 1, for example) is displayed and the third frame (Frame 2, for example) is decoded (step S47), and the process repeats for each frame until completion.
- FIG. 12 is a workflow of an embodiment of a video processing method for reducing system resources utilized by a fast forward function.
- audio signals are disabled while a fast forward process is implemented to prevent AV asynchronization. While fast forwarding a video file, it is assumed that users are not concerned about audio output. Therefore, audio signals are disabled to allow more video frames to be decoded and displayed. Additionally, selectively frame dropping for a display device is enabled to reduce bus load, access and post-processing functions are completely or partially disabled to accelerate processing speed. Deblocking or deranging for example, occupy more system resources and may be disabled.
- Section 710 represents quadruple processing speed (4 ⁇ ) for fast forward, where each frame is displayed in 16.66 ms, video files are encoded at 15 frames per second, a display duration for each frame is not considered, and a decoding time is for each frame less than 16 ms.
- Section 720 represents a fast forward process with Nx (N>1) quadruplicate processing speed of the invention, in which a display time for all frames equals 10 ms and Post-processing functions for all decoded frames are disabled. Frame 1 is decoded while Frame 0 is decoded and is displayed.
- Frame 1 is not displayed due to the decoding time of 20 ms over an ideal duration (16.66 ms) while display of Frame 0 should terminate with 16.66 ms passing through but continues for another duration time (16.66 ms) since Frame 2 is decoded in 8 ms.
- Frame 2 is displayed in 10 ms and Frame 3 is decoded in 10 ms, and Frame 3 is normally displayed and Frame 4 is decoded in 28 ms.
- Frame 4 is not displayed due to the decoding time of 28 ms over an ideal duration (16.66 ms) while display of Frame 3 should terminate with 16.66 ms passing through but continues for another two time intervals (16.66*2 ms) since Frame 5 is decoded in 26 ms.
- Frame 5 is displayed in 10 ms and Frame 6 is decoded in 10 ms, and Frame 6 is displayed and Frame 7 is decoded in 8 ms, and the process repeats for each frame until completion. Note that a recount duration frame starts from frame 5 .
- a frame is skipped when decoding is slow.
- a frame is skipped when a previous frame cannot be displayed in time. If a previous frame is not displayed, at least one of the subsequent frames must be displayed. Additionally, frames can be periodically dropped. A frame, for example, is dropped at every interval of three frames.
- FIG. 14 is a flowchart of an embodiment of a video processing method for frame skipping for fast forward shown in FIG. 13 .
- Post-processing for all frames is first disabled (step S 61 ).
- It is determined whether the decoding time for frame N+1 (N+i, i 1) (Frame 1 ) exceeds a defined duration (33.33 ms, for example) (step S 63 ).
- FIGS. 4 , 7 , 9 , 11 , 13 , and 14 can also be implemented in various storage media.
- a video processing method capable of error resilience of the invention is capable of achieving AV synchronization resilience, adaptive post-processing, dynamic frame dropping for display, and auto pause/resume for system resources utilized by a fast forward function.
Abstract
A video processing method for AV synchronization resilience is provided. A first frame is decoded and displayed while a second frame is decoded for a predetermined period of time. It is determined whether the decoding time for the second frame exceeds a defined duration. If so, the first frame is continuously displayed during the decoding time for the second frame. The second frame is displayed while a third frame is decoded for a predetermined period of time.
Description
- This application claims the benefit of and priority to provisional U.S. Patent Application Ser. No. 60/754,142, filed Dec. 27, 2005, entitled “Video Processing System Capable of Error Resilience and Video Processing Method for Same,” the entire contents of which are incorporated herein by reference.
- 1. Field of the Invention
- The present invention relates to video processing, and in particular to a video processing system capable of error resilience and a video processing method for same.
- 2. Description of the Related Art
- MPEG-4 is an ISO/IEC standard developed by the MPEG (Moving Picture Experts Group), the committee that also developed the Emmy Award-winning standards MPEG-1 and MPEG-2. MPEG-4 is the successor to the MPEG-1 and MPEG-2 video standards and was ratified by the ISO standards committee in 1998. Instead of purely frame-based video technology, MPEG-4 adopts an object-oriented concept that integrates existing multimedia technologies, such as 2D and 3D graphics, animation, video codecs, multimedia streaming, interactivity, and programmatic environments, into a single architecture.
- Video processing based on MPEG-4 format images is described in the following.
- A video stream is a sequence of video frames, each of which is a still image. A video player displays individual frames sequentially, typically at a rate close to 30 frames per second. Frames are divided into 16×16-pixel macroblocks (not shown). Each macroblock consists of four 8×8 luminance blocks and two 8×8 chrominance blocks (one U and one V) (not shown). Macroblocks are the units of motion-compensated compression; blocks are the units of DCT compression.
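As an illustration of the layout just described, the following sketch (not part of the patent; the frame dimensions are example values) counts the macroblocks and 8×8 blocks in one frame:

```python
def macroblock_layout(width: int, height: int):
    """Return (macroblocks, luma_blocks, chroma_blocks) for one frame."""
    mb_cols = (width + 15) // 16   # a macroblock covers 16x16 pixels
    mb_rows = (height + 15) // 16
    macroblocks = mb_cols * mb_rows
    luma_blocks = macroblocks * 4    # four 8x8 Y blocks per macroblock
    chroma_blocks = macroblocks * 2  # one 8x8 U and one 8x8 V block
    return macroblocks, luma_blocks, chroma_blocks

# A 320x240 (QVGA) frame: 20 x 15 = 300 macroblocks.
print(macroblock_layout(320, 240))  # (300, 1200, 600)
```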
- Video data complying with the MPEG-4 format is composed of three types of frames: intra-frames (I-frames), forward predicted frames (P-frames), and bidirectional predicted frames (B-frames). An I-frame is encoded as a single image, with no reference to any past or future frames: the various lossless and lossy compression techniques are performed only on information contained within the current frame, and not relative to any other frame in the video sequence. In other words, no temporal processing is performed outside the current picture or frame. A P-frame is encoded relative to the past reference frame, where a reference frame is a P- or an I-frame and the past reference frame is the closest preceding reference frame. Each macroblock in a P-frame can be encoded either as an intra-macroblock or as an inter-macroblock; an intra-macroblock is encoded in the same manner as a macroblock in an I-frame, with no reference to other frames. A B-frame is encoded relative to the past reference frame, the future reference frame, or both. The future reference frame is the closest following reference frame (I or P). The encoding for B-frames is similar to that for P-frames, except that motion vectors may refer to areas in the future reference frame. For macroblocks that use both past and future reference frames, the two 16×16 areas are averaged.
- As previously described, a video stream is a sequence of video frames. An input encoded sequence of video frames, for example, is represented as “I(0) P(3) B(1) B(2) P(6) B(4) B(5) I(9) B(7) B(8) P(12) B(10) B(11)” (the parenthesized number is the display order of each frame). The output decoded sequence of the video frames is then represented as “I(0) B(1) B(2) P(3) B(4) B(5) P(6) B(7) B(8) I(9) B(10) B(11) P(12)”.
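The reordering from coded order to display order can be sketched as follows; this is an illustrative model, not code from the patent. A decoded reference frame (I or P) is held back until the next reference frame arrives, while B-frames are output as soon as they are decoded:

```python
def decode_to_display_order(coded):
    """coded: list of (frame_type, display_index) in bitstream order.
    Returns display indices in output (display) order."""
    out = []
    held = None  # last reference (I or P) frame awaiting output
    for ftype, idx in coded:
        if ftype in ("I", "P"):
            if held is not None:
                out.append(held)
            held = idx           # hold the reference until the next one
        else:                    # B-frames are output as soon as decoded
            out.append(idx)
    if held is not None:
        out.append(held)
    return out

coded = [("I", 0), ("P", 3), ("B", 1), ("B", 2), ("P", 6), ("B", 4),
         ("B", 5), ("I", 9), ("B", 7), ("B", 8), ("P", 12), ("B", 10),
         ("B", 11)]
print(decode_to_display_order(coded))
# [0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12]
```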
- Video data can be further composed of combined frames, each comprising both a P-frame and a B-frame or an I-frame and a B-frame, and empty frames (Pe). Referring to
FIG. 1, which shows a schematic diagram of a combined frame 100 (regarded as one frame in the video data) comprising a P-frame and a B-frame, the encoding process of each constituent frame is identical to that of an independent frame. Pe frames are predicted frames that contain no video data. - The foregoing descriptions explain the types of frames of video data and the input and output sequences in a video stream. Because raw video data is far too large to be played directly, it must be encoded as MPEG-1/2/4 or another video format before playback in a video system. A video system for playing video and audio (AV) files is described in the following.
FIG. 2 is a schematic view of a video system. Video system 100 comprises a storage medium 110, a file system 120, a file parser 130, a video decoder 140, a first post-processing unit 145, an audio decoder 150, a second post-processing unit 155, an AV synchronization unit 160, and a display system 170. Display system 170 supports multiple display devices. AV files are first stored in storage medium 110, and file system 120 accesses the AV files therefrom. Next, file parser 130 parses the AV files to retrieve video and audio signals and passes the signals to video decoder 140 and audio decoder 150 respectively. Video decoder 140 and audio decoder 150 decode the video and audio signals, and post-processing units 145 and 155 implement post-processing operations, such as deblocking or deringing, to adjust the quality of the video and audio signals respectively. The adjusted signals are transmitted to AV synchronization unit 160 for signal synchronization and are output to display system 170 to display the AV files using first display device 171 or both first display device 171 and second display device 173. - Playback quality is limited in that video files may not play smoothly in a video system with restricted memory resources, central processing unit (CPU) speed, or system architecture. Resource limitations are described in the following. High-bit-rate files (bits per second (bps)) must be accessed by file systems or storage media of adequate operating speed. A higher frame rate or a fast forward function for video decoding requires relatively better hardware performance. Post-processing, such as deblocking and deringing methods, can improve video quality but employs more system resources. Additionally, greater memory bandwidth is required in liquid crystal displays and TVs if a larger frame size is requested. As described, a higher frame rate, larger frame size, higher bit rate, increased post-processing, or more display elements require high-grade systems for high performance. 
Limitations of embedded systems include blurry video images and audio playback failure.
- Thus, a video processing method for a video processing system capable of error resilience is desirable.
- A video processing method for AV synchronization resilience is provided. A first frame is decoded and displayed while a second frame is decoded for a predetermined period of time. It is determined whether the decoding time for the second frame exceeds a defined duration. If so, the first frame is continuously displayed during the decoding time for the second frame, and the second frame is displayed while a third frame is decoded for a predetermined period of time.
- A video processing method for dynamic frame dropping is provided. A first frame is decoded and displayed while a second frame is decoded. It is determined whether the display time for the first frame exceeds a defined duration. If the display time for the first frame exceeds the defined duration, the second frame is dropped, and the first frame is continuously displayed while a third frame is decoded for a predetermined period of time.
- A video processing method for dynamic frame dropping is provided. A plurality of frames is provided. It is determined whether a portion of the frames is selectively ignored, and, if so, the frames are displayed at a predetermined number of frame intervals.
- A video processing method for auto pause/resume, applied to a file system/storage medium, is provided. A first frame is decoded and displayed for a predetermined period of time while a second frame is decoded. It is determined whether the access speed of the file system/storage medium is adequate for processing the second frame. If the access speed of the file system/storage medium is not adequate to process the second frame, an auto pause function is enabled for a predetermined period of time and audio signals are disabled while the second frame is ignored and a third frame is decoded. The auto pause function is then disabled, and an auto resume function and the audio signals are enabled.
- A video processing method capable of reducing system resources utilized for fast forward is provided. It is determined whether AV files are played with N multiples. If the AV files are played with N multiples, audio signals of the AV files are disabled. Video signals of the AV files are played with the N multiples. Post-processing is disabled and negligible frames of the AV files are not displayed by a display system.
- A video processing method for frame skipping for fast forward is provided. Post-processing for frames of AV files is disabled. A first frame is decoded and displayed while a second frame is decoded for a predetermined period of time. It is determined whether decoding time for the second frame exceeds a defined duration. If the decoding time for the second frame exceeds the defined duration, the first frame is continuously displayed during the decoding time for the second frame, and the second frame is not displayed while a third frame is decoded for a predetermined period of time.
- A detailed description is given in the following embodiments with reference to the accompanying drawings.
- The invention can be more fully understood by reading the subsequent detailed description and examples with references made to the accompanying drawings, wherein:
-
FIG. 1 is a schematic view of a combined frame comprising a P-frame and a B-frame; -
FIG. 2 is a schematic view of a video system; -
FIG. 3 is a workflow of an embodiment of a video processing method for AV synchronization resilience; -
FIG. 4 is a flowchart of an embodiment of a video processing method for AV synchronization resilience; -
FIG. 5 is a workflow of an embodiment of a video processing method for dynamic frame dropping for a single display device; -
FIG. 6 is a workflow of an embodiment of a video processing method for dynamic frame dropping for at least one first display device and one second display device; -
FIG. 7 is a flowchart of an embodiment of a video processing method for dynamic frame dropping; -
FIG. 8 is a workflow of an embodiment of a video processing method for an auto pause/resume function having insufficient file system/storage media access speed; -
FIG. 9 is a flowchart of an embodiment of a video processing method for an auto pause/resume function having insufficient file system/storage media access speed; -
FIG. 10 is a workflow of an embodiment of a video processing method for an auto pause/resume function with insufficient decoding capability; -
FIG. 11 is a flowchart of an embodiment of a video processing method for an auto pause/resume function with insufficient decoding capability; -
FIG. 12 is a workflow of an embodiment of a video processing method for reducing system resources utilized for fast forward; -
FIG. 13 is a flowchart of an embodiment of a video processing method for reducing system resources utilized for fast forward function; and -
FIG. 14 is a flowchart of an embodiment of a video processing method for frame skipping based on the fast forward shown in FIG. 13. - Several exemplary embodiments of the invention are described with reference to
FIGS. 3 through 14 , which generally relate to video processing. It is to be understood that the following disclosure provides many different embodiments as examples, for implementing different features of the invention. Specific examples of components and arrangements are described below to simplify the present disclosure. These are, of course, merely examples and are not intended to be limiting. In addition, the present disclosure may repeat reference numerals and/or letters in the various examples. This repetition is for the purpose of simplicity and clarity and does not in itself dictate a relationship between the various embodiments and/or configurations discussed. - The invention discloses a video processing system capable of error resilience and a video processing method for same. The video processing method comprises features of AV synchronization resilience, adaptive Post-processing, dynamic frame dropping, auto pause/resume, and fast forward function. Individual examples are described in the following.
-
FIG. 3 is a workflow of an embodiment of a video processing method for AV synchronization resilience. AV synchronization resilience is frequently required, especially for high-frame-rate display. As shown in FIG. 3, Section 210 represents ideal video display, where the ideal duration for each frame equals 33.33 milliseconds (ms) because video files are encoded at a rate of 30 frames per second, and the decoding time for each frame is less than 33 ms. Here, the display duration for each frame is not considered. - In reality,
Section 220 represents an asynchronization-resilience solution for audio and video signals. As shown in FIG. 3, Frame 1 is decoded while Frame 0 is decoded and displayed. The decoding of Frame 1 requires 43 ms, which is longer than the frame duration of 33.33 ms. After the decoding, the first asynchronization of audio and video signals is detected during a predetermined time period (a first asynchronization time period), shown by the first star symbol. During the first asynchronization time period, display of Frame 0 continues until Frame 1 is completely decoded while audio signals corresponding to Frame 1 are output. Next, after the decoding of Frame 1 is completed, Frame 1 is displayed and Frame 2 is decoded in the following 32 ms. The second asynchronization of audio and video signals is detected due to the excessive decoding time of Frame 2 during a second asynchronization time period, as shown by the second star symbol. During the second asynchronization time period, Frame 1 is displayed until Frame 2 is completely decoded while audio signals corresponding to Frame 2 are output. - Next, after the decoding of
Frame 2 is completed, Frame 2 is displayed and Frame 3 is decoded in 20 ms. Herein, completely decoded Frame 3 cannot be immediately displayed since the ideal duration for a frame has not passed, such that display of Frame 2 continues. Next, when the ideal duration for a frame expires, Frame 3 is displayed and Frame 4 is decoded in 25 ms. Similarly, completely decoded Frame 4 cannot be displayed since the ideal duration for a frame has not passed, such that display of Frame 3 continues. Next, Frame 4 is displayed and Frame 5 is decoded, and the process repeats for each frame until completion. In this embodiment, AV asynchronization is frequently detected, and the audio and video signals of the AV files can eventually be synchronized despite discrepant decoding times. -
FIG. 4 is a flowchart of an embodiment of a video processing method for AV synchronization resilience. Frame N+i (i=0) (for example, Frame 0) is decoded and displayed while frame N+i (i=i+1) (for example, Frame 1) is decoded for a predetermined period of time (step S11), where N=0 and i=0˜n. It is determined whether the decoding time for frame N+i (i=i+1) (for example, Frame 1) exceeds a defined duration (33.33 ms, for example) (step S12). If so, the display of frame N+i (Frame 0) continues for the duration of the decoding of frame N+i (i=i+1) (for example, Frame 1), and frame N+i (i=i+1) (for example, Frame 1) is then displayed while frame N+i (i=i+2) (for example, Frame 2) is decoded for a predetermined period of time (step S13), i=i+1 (step S15), and the process proceeds to step S11. If not, frame N+i (i=i+1) (for example, Frame 1) is then displayed while frame N+i (i=i+2) (for example, Frame 2) is decoded for a predetermined period of time (step S14), i=i+1 (step S15), and the process proceeds to step S11.
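The loop of steps S11 through S15 can be sketched as follows. This is a simplified model, not the patent's implementation; the decode times are example values taken from the FIG. 3 walkthrough:

```python
FRAME_DURATION_MS = 1000.0 / 30  # ideal duration: 33.33 ms at 30 fps

def display_schedule(decode_times_ms):
    """decode_times_ms[i] is the decode time of frame i+1, which runs
    while frame i is on screen. Frame i is held for at least one ideal
    duration, and longer whenever decoding of the next frame overruns."""
    schedule = []
    for i, t in enumerate(decode_times_ms):
        schedule.append((i, max(t, FRAME_DURATION_MS)))
    return schedule

# Decode times for Frames 1-4 from the FIG. 3 example (43, 32, 20, 25 ms).
for frame, ms in display_schedule([43, 32, 20, 25]):
    print(f"Frame {frame} displayed for {ms:.2f} ms")
```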
-
FIG. 5 is a workflow of an embodiment of a video processing method for dynamic frame dropping with a single display device. As shown in FIG. 5, Section 310 represents ideal audio playback along with ideal video display, where the ideal duration for displaying each frame equals 33.33 ms, video files are encoded at 30 frames per second, and the decoding time for each frame is less than 33.33 ms. Here, the display duration for each frame is not considered. In Section 310, ideally, each frame can be timely decoded and displayed within each 33.33 ms time interval. - In reality, however, frames may not be completely displayed within each preset time interval.
Section 320 represents the method of dynamic frame dropping with a single display device of the present invention. As shown in Section 320, between the time points ‘0’, ‘1’, ‘2’, ‘3’, and ‘4’, Frame 1, Frame 2, Frame 3, Frame 4, and Frame 5 are sequentially decoded within each 33.33 ms time interval. The time points ‘A’, ‘B’, ‘C’, ‘D’, and ‘E’, on the other hand, show the actual display durations of the frames. Some frames may take more than 33.33 ms to display. - As shown in
FIG. 5, Frame 0 is displayed between time points ‘A’ and ‘B’ for 60 ms, and Frame 1 is decoded between time points ‘0’ and ‘1’ in less than 33.33 ms. Herein, the display of Frame 0 takes longer than the ideal duration (33.33 ms). In the second time interval, between time points ‘1’ and ‘2’, since Frame 0 has not finished its display, Frame 1 of the present invention is dropped and is not displayed. Between time points ‘2’ and ‘3’, where the display of Frame 0 is complete, Frame 2 is displayed in sequence. That is, in this embodiment, as long as the previous frame is still being displayed after one preset time interval (for example, 33.33 ms), at least one subsequent frame is dropped. -
FIG. 6 is a workflow of an embodiment of a video processing method for dynamic frame dropping for multiple display devices. As shown in FIG. 6, Section 410 represents ideal audio playback along with ideal video display, where the ideal duration for displaying each frame equals 33.33 ms, video files are encoded at 30 frames per second, and the decoding time for each frame is less than 33.33 ms. Here, the display duration of each frame is not considered. Different from the embodiment shown in FIG. 5, this embodiment provides a first display device and a second display device. The first display device displays an important data source, whereas the second display device displays a less important data source. In this embodiment, for the important data source, no frames are dropped even if a previous frame takes more than one time interval to display; for the less important data source, some frames may be dropped for efficiency. - In this embodiment,
Section 420 represents AV file display on the first display device, and Section 430 represents AV file display on the second display device, which allows dynamic frame dropping. As shown in FIG. 6, Frame 0 is displayed between time points ‘0’ and ‘3’ on the second display device, during which Frames 1 and 2 are decoded but not displayed. After Frame 0 finishes its display, the second display device continues by displaying Frame 3. The process repeats for each frame until completion. -
FIG. 7 is a flowchart of an embodiment of a video processing method for dynamic frame dropping. Frame N+i (i=0) (Frame 0) is decoded and displayed for a predetermined period of time while frame N+i (i=i+1) (Frame 1) is decoded (step S21), where N=0 and i=0˜n. It is determined whether the display time for frame N+i (Frame 0) exceeds a defined duration (33.33 ms, for example) (step S22). If so, frame N+i (i=i+1) (Frame 1) is dropped, and the display of frame N+i (Frame 0) continues while frame N+i (i=i+2) (Frame 2) is decoded for a predetermined period of time (step S23), i=i+2 (step S24), and the process proceeds to step S21. If not, frame N+i (i=i+1) (Frame 1) is then displayed while frame N+i (i=i+2) (Frame 2) is decoded for a predetermined period of time (step S25), i=i+1 (step S26), and the process proceeds to step S21.
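The dropping rule of steps S21 through S26 can be sketched as below. This is a simplified model with example display times, not the patent's code: whenever the frame on screen overruns the ideal duration, the next decoded frame is dropped.

```python
IDEAL_MS = 1000.0 / 30  # 33.33 ms per frame at 30 fps

def frames_to_display(display_times_ms):
    """display_times_ms[i] = how long frame i actually stays on screen.
    Returns the indices of the frames that get displayed."""
    shown = []
    i = 0
    n = len(display_times_ms)
    while i < n:
        shown.append(i)
        if display_times_ms[i] > IDEAL_MS:
            i += 2   # previous frame overran: drop the next frame
        else:
            i += 1   # on time: display the next frame normally
    return shown

# Frame 0 takes 60 ms to display, so Frame 1 is dropped (as in FIG. 5).
print(frames_to_display([60, 33, 33, 33]))  # [0, 2, 3]
```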
- In some embodiment, when system resources are insufficient, an auto/pause function may be automatically implemented. If file system and CPU access speed, for example, is too slow to process high-bit-rate files or large AV asynchronization is detected, the auto pause/resume function is automatically implemented to pause file processing for a period of time and resume video and audio signal synchronization.
-
FIG. 8 is a workflow of an embodiment of a video processing method for an auto pause/resume function having insufficient file system/storage media access speed. As shown in FIG. 8, Section 510 represents ideal video display (30 frames per second), where each frame is completely decoded in 33.33 ms. Section 520 represents an auto pause/resume process for insufficient access speed. Here, Frame 1 is decoded while Frame 0 is decoded and displayed, and Frame 1 is displayed while Frame 2 is decoded. In this embodiment, the file system/storage media access speed is inadequate for processing AV files with a high frame rate; hence, Frames 0, 1, and 2 are decoded and displayed, but the file system/storage medium cannot prepare enough bitstream data to decode Frame 3. Thus, an auto pause mode is enabled, as shown by the star symbol, in which audio signals are stopped and the file system/storage medium re-prepares bitstream data from Frame 3 onward into a bitstream buffer until the buffer is full of bitstream data for many frames (assume this takes 50 ms). Playback then resumes, with Frame 3 being decoded and displayed. Thus, AV synchronization is enabled from Frame 3 (Frame 2 is displayed while the bitstream buffer is being prepared, about 50 ms in this example), Frame 3 is displayed and Frame 4 is decoded, Frame 4 is displayed and Frame 5 is decoded, and the process repeats for each frame until completion. -
FIG. 9 is a flowchart of an embodiment of a video processing method for auto pause/resume having insufficient file system/storage media access speed. A first frame (Frame 1, for example) is decoded and displayed for a predetermined period of time while a second frame (Frame 2, for example) is decoded (step S31). It is determined whether the access speed of the file system/storage medium is adequate (that is, whether bitstream data is in the buffer) for processing a third frame (Frame 3, for example) (step S32). If not, an auto pause function is enabled and audio is disabled (step S33), the bitstream data is re-prepared until the bitstream buffer is full, and the third frame (Frame 3, for example) is decoded (step S34); the auto pause function is then disabled (step S35), and an auto resume function and audio are enabled (step S36). Otherwise, the second frame (Frame 2, for example) is displayed and the third frame (Frame 3, for example) is decoded (step S37), and the process repeats for each frame until completion. -
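Steps S31 through S37 can be modeled roughly as follows. The buffer-readiness callback and the event log are illustrative assumptions, not elements of the patent:

```python
def play(frames, bitstream_ready, refill):
    """frames: iterable of frame indices. bitstream_ready(i) reports
    whether the file system has buffered enough data to decode frame i;
    refill() blocks (playback paused, audio muted) until the buffer is
    full again. Returns the sequence of playback events."""
    events = []
    for i in frames:
        if not bitstream_ready(i):
            events.append(("pause", i))    # auto pause, audio disabled
            refill()                       # re-prepare the bitstream buffer
            events.append(("resume", i))   # auto resume, audio re-enabled
        events.append(("display", i))
    return events

# Example: the buffer runs dry at Frame 3, as in the FIG. 8 workflow.
ready = {0: True, 1: True, 2: True, 3: False, 4: True, 5: True}
print(play(range(6), lambda i: ready[i], lambda: None))
```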
FIG. 10 is a workflow of an embodiment of a video processing method for auto pause/resume with insufficient decoding capability. As shown in FIG. 10, Section 610 represents ideal video display (30 frames per second), where each frame is completely decoded in 33.33 ms. Section 620 represents an auto pause/resume process for inadequate decoding capability, in which the decoding and display time for each frame is not considered and an acceptable asynchronization time is less than 30 ms. Frame 1 is decoded in 59 ms while Frame 0 is decoded and displayed, resulting in AV asynchronization; compared to the duration of Frames 0 and 1 shown in Section 610, the asynchronization time (about 25.67 ms) is less than 30 ms and is acceptable, such that Frame 1 is then displayed and Frame 2 is decoded in 46 ms. Another AV asynchronization is detected because the decoding time exceeds the duration of Frames 0 and 1 shown in Section 610; compared to the duration of Frames 1 and 2 shown in Section 610, the asynchronization time (about 38.34 ms) is greater than 30 ms and is unacceptable, such that an auto pause mode is enabled, as shown by the star symbol, in which audio signals are stopped and the file system/storage medium reads and stores bitstream data from Frame 3 onward into a bitstream buffer. The pause mode continues until the bitstream buffer is full (50 ms, for example), whereupon the pause mode is disabled and an auto resume mode is enabled. Thus, AV synchronization is enabled from Frame 3, such that Frame 2 is ignored, Frame 3 is displayed and Frame 4 is decoded, Frame 4 is displayed and Frame 5 is decoded, and the process repeats for each frame until completion. -
FIG. 11 is a flowchart of an embodiment of a video processing method for auto pause/resume with insufficient decoding capability. A first frame (Frame 0, for example) is decoded and displayed for a predetermined period of time while a second frame (Frame 1, for example) is decoded (step S41). It is determined whether AV asynchronization is detected and whether the asynchronization time is greater than a threshold value (step S42). If so, an auto pause function is enabled for a predetermined period of time and audio is disabled (step S43) while the second frame is ignored and a third frame (Frame 2, for example) is decoded (step S44); the auto pause function is then disabled (step S45), and an auto resume function and audio are enabled (step S46). If not, the second frame (Frame 1, for example) is displayed and the third frame (Frame 2, for example) is decoded (step S47), and the process repeats for each frame until completion. -
FIG. 12 is a workflow of an embodiment of a video processing method for reducing the system resources utilized by a fast forward function. In this embodiment, audio signals are disabled while a fast forward process is implemented to prevent AV asynchronization. While fast forwarding a video file, it is assumed that users are not concerned about audio output; audio signals are therefore disabled to allow more video frames to be decoded and displayed. Additionally, selective frame dropping for a display device is enabled to reduce bus load and access, and post-processing functions are completely or partially disabled to accelerate processing speed. Deblocking and deringing, for example, occupy more system resources and may be disabled. - As shown in
FIG. 12, compared to standard processing speed, Section 710 represents quadruple processing speed (4×) for fast forward, where each frame is displayed in 16.66 ms, video files are encoded at 15 frames per second, the display duration for each frame is not considered, and the decoding time for each frame is less than 16 ms. Section 720 represents a fast forward process of the invention with N× (N>1) processing speed, in which the display time for all frames equals 10 ms and the post-processing functions for all decoded frames are disabled. Frame 1 is decoded while Frame 0 is decoded and displayed. Frame 1 is not displayed because its decoding time of 20 ms exceeds the ideal duration (16.66 ms); display of Frame 0 should terminate after 16.66 ms but continues for another duration (16.66 ms), since Frame 2 is decoded in 8 ms. Next, Frame 2 is displayed in 10 ms and Frame 3 is decoded in 10 ms, and Frame 3 is normally displayed and Frame 4 is decoded in 28 ms. Frame 4 is not displayed because its decoding time of 28 ms exceeds the ideal duration (16.66 ms); display of Frame 3 should terminate after 16.66 ms but continues for another two time intervals (16.66*2 ms), since Frame 5 is decoded in 26 ms. Next, Frame 5 is displayed in 10 ms and Frame 6 is decoded in 10 ms, and Frame 6 is displayed and Frame 7 is decoded in 8 ms, and the process repeats for each frame until completion. Note that the duration count restarts from Frame 5.
-
FIG. 13 is a flowchart of an embodiment of a video processing method for reducing system resources utilized by a fast forward function. It is determined whether AV files are played at N multiples of normal speed (Nx speed, N!=1) (step S51). If so, audio is disabled and video files of the AV files are played at Nx speed (step S52), post-processing is disabled (step S53), and negligible frames are skipped (step S54) before display by a display system (step S58). If not, audio and video signals are respectively enabled (step S55), decoded, and processed by post-processing operations (step S56). The processed audio and video signals are synchronized (step S57) and displayed by a display system (step S58). -
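The decision in steps S51–S58 amounts to toggling a small set of playback features based on the speed multiple. A minimal sketch, assuming hypothetical configuration field names; the patent only names the operations (disable audio, disable post-processing, skip negligible frames, synchronize AV), not any particular data structure.

```python
def configure_playback(speed_multiple):
    """Return a playback configuration per the FIG. 13 decision (step S51).

    speed_multiple != 1 corresponds to Nx fast forward: audio and
    post-processing are disabled and negligible frames are skipped
    (steps S52-S54); otherwise normal synchronized AV playback is
    configured (steps S55-S57). Field names here are illustrative.
    """
    fast_forward = speed_multiple != 1
    return {
        "audio_enabled": not fast_forward,          # step S52 vs. S55
        "post_processing_enabled": not fast_forward, # step S53 vs. S56
        "skip_negligible_frames": fast_forward,      # step S54
        "av_sync_enabled": not fast_forward,         # step S57
    }
```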
FIG. 14 is a flowchart of an embodiment of a video processing method for frame skipping for the fast forward shown in FIG. 13. Post-processing for all frames is first disabled (step S61). Frame N+i (i=0, Frame 0) is decoded and displayed while frame N+i+1 (Frame 1) is decoded for a predetermined period of time (step S62), where N=0 and i=0˜n. It is determined whether the decoding time for frame N+i+1 exceeds a defined duration (33.33 ms, for example) (step S63). If so, the display of frame N+i continues for the duration of the decoding of frame N+i+1, and frame N+i+1 is not displayed while frame N+i+2 (Frame 2) is decoded for a predetermined period of time (step S64); then i=i+2 (step S65) and the process returns to step S62. If not, frame N+i+1 is displayed while frame N+i+2 is decoded for a predetermined period of time (step S66); then i=i+1 (step S67) and the process returns to step S62. - Note that the described methods shown in
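The index-stepping loop of steps S62–S67 can be sketched as follows. This is a simplified Python model under stated assumptions: decode times are given up front as a list, the 33.33 ms threshold is the example value from step S63, and the function name is illustrative.

```python
DEFINED_MS = 1000 / 30  # defined duration: 33.33 ms, the example threshold of step S63

def fast_forward_skip(decode_ms):
    """Model the FIG. 14 loop: display frame N+i while the next frame
    decodes; if that decode exceeds the defined duration, keep showing
    frame N+i, drop the slow frame, and advance the index by two.
    Returns the indices of the frames that are displayed.
    """
    displayed = []
    i = 0
    n = len(decode_ms)
    while i < n:
        displayed.append(i)              # frame N+i reaches the screen (step S62)
        if i + 1 >= n:
            break
        if decode_ms[i + 1] > DEFINED_MS:
            i += 2                       # frame N+i+1 skipped (steps S64, S65)
        else:
            i += 1                       # frame N+i+1 displays next (steps S66, S67)
    return displayed
```

For example, with decode times of [10, 40, 10, 10] ms the 40 ms frame overruns the threshold, so frames 0, 2, and 3 are displayed and frame 1 is dropped.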
FIGS. 4, 7, 9, 11, 13, and 14 can also be implemented in various storage media. - A video processing method capable of error resilience of the invention achieves AV synchronization resilience, adaptive post-processing, dynamic frame dropping for display, and auto pause/resume, reducing the system resources utilized by a fast forward function.
- While the invention has been described by way of example and in terms of the preferred embodiments, it is to be understood that the invention is not limited to the disclosed embodiments. To the contrary, it is intended to cover various modifications and similar arrangements (as would be apparent to those skilled in the art). Therefore, the scope of the appended claims should be accorded the broadest interpretation so as to encompass all such modifications and similar arrangements.
Claims (30)
1. A video processing method for AV synchronization resilience, comprising:
decoding and displaying a first frame while a second frame is decoded for a predetermined period of time;
determining whether the decoding time for the second frame exceeds a defined duration;
if the decoding time for the second frame exceeds the defined duration, continuously displaying the first frame for the decoding time of the second frame; and
displaying the second frame while a third frame is decoded for a predetermined period of time.
2. The video processing method as claimed in claim 1 , further comprising, if the decoding time for the second frame is equal to the defined duration, displaying the second frame while the third frame is decoded for the predetermined time when the display of the first frame is complete.
3. The video processing method as claimed in claim 2 , further comprising, if the decoding time for the second frame is less than the defined duration, continuously displaying the first frame and, when an ideal duration for a frame expires, displaying the second frame while the third frame is decoded for the predetermined time.
4. A video processing method for dynamic frame dropping, comprising:
decoding and displaying a first frame while a second frame is decoded;
determining whether the display time for the first frame exceeds a defined duration;
if the display time for the first frame exceeds the defined duration, dropping the second frame; and
continuously displaying the first frame while a third frame is decoded for a predetermined period of time.
5. The video processing method as claimed in claim 4 , further comprising, if the display duration for the first frame is equal to the defined duration, displaying the second frame while the third frame is decoded for the predetermined time.
6. The video processing method as claimed in claim 5, further comprising, if the display time for the first frame is less than the defined duration, continuously displaying the first frame for the defined duration.
7. A video processing method for dynamic frame dropping, comprising:
providing a plurality of frames;
determining whether a portion of the frames is selectively ignored; and
if so, displaying the frames at intervals of a predetermined number of frames.
8. A video processing method for auto pause/resume, applied to a file system/storage medium, comprising:
decoding and displaying a first frame for a predetermined period of time while a second frame is decoded;
determining whether access speed of the file system/storage medium is not adequate enough to process the second frame;
if the access speed of the file system/storage medium is not adequate enough to process the second frame, enabling an auto pause function for a predetermined period of time and disabling audio signals while the second frame is ignored and a third frame is decoded; and
disabling the auto pause function; and
enabling an auto resume function and the audio signals.
9. The video processing method as claimed in claim 8 , further comprising, if the access speed of the file system/storage medium is adequate enough to process the second frame, displaying the second frame and decoding the third frame.
10. The video processing method as claimed in claim 8 , further comprising:
determining whether decoding ability of the file system/storage medium is not adequate enough to process the second frame to result in AV asynchronization;
if the decoding ability of the file system/storage medium is not adequate enough to process the second frame to result in AV asynchronization, determining whether asynchronization time is acceptable;
if the asynchronization time is unacceptable, enabling an auto pause mode and disabling the audio signals;
decoding and storing bitstream data relating to the third frame in a bitstream buffer;
disabling the pause mode and enabling an auto resume mode, thereby ignoring the second frame and displaying the third frame while a fourth frame is decoded.
11. The video processing method as claimed in claim 10 , further comprising, if the asynchronization time is acceptable, displaying the second frame and decoding the third frame.
12. A video processing method for reducing system resources utilized by a fast forward function, comprising:
determining whether AV files are played with N multiples;
if the AV files are played with N multiples, disabling audio signals of the AV files;
playing video signals of the AV files with the N multiples;
disabling Post-processing; and
skipping negligible frames of the AV files to be displayed by a display system.
13. The video processing method as claimed in claim 12 , further comprising:
if the AV files are not played with the N multiples, enabling and decoding the audio and video signals;
implementing Post-processing to the audio and video signals respectively;
synchronizing the processed audio and video signals to be displayed by the display system.
14. A video processing method for frame skipping for fast forward, comprising:
disabling Post-processing for frames of AV files;
decoding and displaying a first frame while a second frame is decoded for a predetermined period of time;
determining whether decoding time for the second frame exceeds a defined duration;
if the decoding time for the second frame exceeds the defined duration, continuously displaying the first frame for the decoding time of the second frame; and
not displaying the second frame while a third frame is decoded for a predetermined period of time.
15. The video processing method as claimed in claim 14 , further comprising, if the decoding time for the second frame does not exceed the defined duration, displaying the second frame while the third frame is decoded for the predetermined time.
16. A computer-readable storage medium storing a computer program providing a video processing method for AV synchronization resilience, comprising using a computer to perform the steps of:
decoding and displaying a first frame while a second frame is decoded for a predetermined period of time;
determining whether the decoding time for the second frame exceeds a defined duration;
if the decoding time for the second frame exceeds the defined duration, continuously displaying the first frame for the decoding time of the second frame until the second frame is completely decoded; and
displaying the second frame while a third frame is decoded for a predetermined period of time.
17. The computer-readable storage medium as claimed in claim 16 , further comprising, if the decoding time for the second frame is equal to the defined duration, displaying the second frame while the third frame is decoded for the predetermined time when the display of the first frame is complete.
18. The computer-readable storage medium as claimed in claim 17 , further comprising, if the decoding time for the second frame is less than the defined duration, continuously displaying the first frame and, when an ideal duration for a frame expires, displaying the second frame while the third frame is decoded for the predetermined time.
19. A computer-readable storage medium storing a computer program providing a video processing method for dynamic frame dropping, comprising using a computer to perform the steps of:
decoding and displaying a first frame while a second frame is decoded;
determining whether the display time for the first frame exceeds a defined duration;
if the display time for the first frame exceeds the defined duration, dropping the second frame; and
continuously displaying the first frame while a third frame is decoded for a predetermined period of time.
20. The computer-readable storage medium as claimed in claim 19 , further comprising, if the display duration for the first frame is equal to the defined duration, displaying the second frame while the third frame is decoded for the predetermined time.
21. The computer-readable storage medium as claimed in claim 20, further comprising, if the display time for the first frame is less than the defined duration, continuously displaying the first frame for the defined duration.
22. A computer-readable storage medium storing a computer program providing a video processing method for dynamic frame dropping, comprising using a computer to perform the steps of:
providing a plurality of frames;
determining whether a portion of the frames is selectively ignored; and
if so, displaying the frames at a predetermined number of frame intervals.
23. A computer-readable storage medium storing a computer program providing a video processing method for auto pause/resume, comprising using a computer to perform the steps of:
decoding and displaying a first frame for a predetermined period of time while a second frame is decoded;
determining whether access speed of a file system/storage medium is not adequate enough to process the second frame;
if the access speed of the file system/storage medium is not adequate enough to process the second frame, enabling an auto pause function for a predetermined period of time and disabling audio signals while the second frame is ignored and a third frame is decoded; and
disabling the auto pause function; and
enabling an auto resume function and the audio signals.
24. The computer-readable storage medium as claimed in claim 23 , further comprising, if the access speed of the file system/storage medium is adequate enough to process the second frame, displaying the second frame and decoding the third frame.
25. The computer-readable storage medium as claimed in claim 23 , further comprising:
determining whether decoding ability of the file system/storage medium is not adequate enough to process the second frame to result in AV asynchronization;
if the decoding ability of the file system/storage medium is not adequate enough to process the second frame to result in AV asynchronization, determining whether asynchronization time is acceptable;
if the asynchronization time is unacceptable, enabling an auto pause mode and disabling the audio signals;
decoding and storing bitstream data relating to the third frame in a bitstream buffer;
disabling the pause mode and enabling an auto resume mode, thereby ignoring the second frame and displaying the third frame while a fourth frame is decoded.
26. The computer-readable storage medium as claimed in claim 25 , further comprising, if the asynchronization time is acceptable, displaying the second frame and decoding the third frame.
27. A computer-readable storage medium storing a computer program providing a video processing method for reducing system resources utilized by a fast forward function, comprising using a computer to perform the steps of:
determining whether AV files are played with N multiples;
if the AV files are played with N multiples, disabling audio signals of the AV files;
playing video signals of the AV files with the N multiples;
disabling Post-processing; and
skipping negligible frames of the AV files to be displayed by a display system.
28. The computer-readable storage medium as claimed in claim 27 , further comprising:
if the AV files are not played with the N multiples, enabling and decoding the audio and video signals;
implementing Post-processing to the audio and video signals respectively;
synchronizing the processed audio and video signals to be displayed by the display system.
29. A computer-readable storage medium storing a computer program providing a video processing method for frame skipping for fast forward, comprising using a computer to perform the steps of:
disabling Post-processing for frames of AV files;
decoding and displaying a first frame while a second frame is decoded for a predetermined period of time;
determining whether decoding time for the second frame exceeds a defined duration;
if the decoding time for the second frame exceeds the defined duration, continuously displaying the first frame for the decoding time of the second frame; and
not displaying the second frame while a third frame is decoded for a predetermined period of time.
30. The computer-readable storage medium as claimed in claim 29 , further comprising, if the decoding time for the second frame does not exceed the defined duration, displaying the second frame while the third frame is decoded for the predetermined time.
Priority Applications (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US11/464,845 US20070147517A1 (en) | 2005-12-27 | 2006-08-16 | Video processing system capable of error resilience and video processing method for same |
| TW095145648A TWI342162B (en) | 2005-12-27 | 2006-12-07 | Video processing method and computer-readable storage medium therefor |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US75414205P | 2005-12-27 | 2005-12-27 | |
| US11/464,845 US20070147517A1 (en) | 2005-12-27 | 2006-08-16 | Video processing system capable of error resilience and video processing method for same |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| US20070147517A1 true US20070147517A1 (en) | 2007-06-28 |
Family
ID=38214776
Family Applications (2)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US11/464,845 Abandoned US20070147517A1 (en) | 2005-12-27 | 2006-08-16 | Video processing system capable of error resilience and video processing method for same |
| US11/464,850 Active 2029-04-04 US7738325B2 (en) | 2005-12-27 | 2006-08-16 | Reading and writing methods and apparatus for Blu-Rays discs |
Family Applications After (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US11/464,850 Active 2029-04-04 US7738325B2 (en) | 2005-12-27 | 2006-08-16 | Reading and writing methods and apparatus for Blu-Rays discs |
Country Status (3)
| Country | Link |
|---|---|
| US (2) | US20070147517A1 (en) |
| CN (1) | CN100539697C (en) |
| TW (1) | TWI342162B (en) |
Cited By (9)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20080316217A1 (en) * | 2007-06-25 | 2008-12-25 | Microsoft Corporation | Hard/Soft Frame Latency Reduction |
| US20100080166A1 (en) * | 2008-09-30 | 2010-04-01 | Qualcomm Incorporated | Techniques for supporting relay operation in wireless communication systems |
| WO2010038197A1 (en) * | 2008-09-30 | 2010-04-08 | Nxp B.V. | Profile for frame rate conversion |
| US20100097978A1 (en) * | 2008-10-20 | 2010-04-22 | Qualcomm Incorporated | Data transmission via a relay station in a wireless communication system |
| US20120194487A1 (en) * | 2011-01-27 | 2012-08-02 | Wolfgang Roethig | Master Synchronization for Multiple Displays |
| CN103177001A (en) * | 2011-12-21 | 2013-06-26 | 腾讯科技(深圳)有限公司 | Method and device for achieving smooth switching of pictures |
| US20140341307A1 (en) * | 2013-05-20 | 2014-11-20 | Playcast Media Systems, Ltd. | Overcoming lost ip packets in streaming video in ip networks |
| US10291936B2 (en) | 2017-08-15 | 2019-05-14 | Electronic Arts Inc. | Overcoming lost or corrupted slices in video streaming |
| US10937379B2 (en) * | 2017-05-12 | 2021-03-02 | Beijing Boe Optoelectronics Technology Co., Ltd. | Display device and driving method thereof |
Families Citing this family (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US7539926B1 (en) * | 2006-02-14 | 2009-05-26 | Xilinx, Inc. | Method of correcting errors stored in a memory array |
| WO2007096834A1 (en) * | 2006-02-24 | 2007-08-30 | Koninklijke Philips Electronics N.V. | Method and device for reading data |
| TWI424371B (en) * | 2009-12-30 | 2014-01-21 | Altek Corp | Video processing device and processing method thereof |
| KR20140039504A (en) * | 2012-09-24 | 2014-04-02 | 삼성전자주식회사 | Blu-ray disc playback device and method thereof |
| TWI483229B (en) * | 2012-12-22 | 2015-05-01 | Chunghwa Picture Tubes Ltd | Display apparatus and method for processing frame thereof |
| CN112601127B (en) * | 2020-11-30 | 2023-03-24 | Oppo(重庆)智能科技有限公司 | Video display method and device, electronic equipment and computer readable storage medium |
Citations (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US5539663A (en) * | 1993-11-24 | 1996-07-23 | Intel Corporation | Process, apparatus and system for encoding and decoding video signals using temporal filtering |
| US5689313A (en) * | 1994-03-24 | 1997-11-18 | Discovision Associates | Buffer management in an image formatter |
| US6295321B1 (en) * | 1997-12-29 | 2001-09-25 | Lg Electronics Inc. | Video decoding method, video decoder and digital TV system using the video decoding method and video decoder |
| US6806880B1 (en) * | 2000-10-17 | 2004-10-19 | Microsoft Corporation | System and method for efficiently controlling a graphics rendering pipeline |
Family Cites Families (8)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPH02105730A (en) * | 1988-10-14 | 1990-04-18 | Sony Corp | Data recording method |
| KR0185932B1 (en) * | 1995-12-11 | 1999-04-15 | 김광호 | Video data decoding method and apparatus |
| KR100200617B1 (en) * | 1996-09-05 | 1999-06-15 | 윤종용 | A picture synchronizing circuit and method |
| CN1762101A (en) * | 2003-03-20 | 2006-04-19 | 皇家飞利浦电子股份有限公司 | Method of storing information on an optical disc |
| KR100499586B1 (en) * | 2003-05-20 | 2005-07-07 | 엘지전자 주식회사 | Method for managing a copy protection information of high density optical disc and high density optical disc therof and apparatus for detecting a copy protection information |
| TWI266181B (en) * | 2004-04-09 | 2006-11-11 | Mediatek Inc | Apparatus for accessing and outputting optical data |
| JP4091962B2 (en) * | 2004-04-28 | 2008-05-28 | 松下電器産業株式会社 | STREAM GENERATION DEVICE, METHOD, AND RECORDING MEDIUM |
| US7861141B2 (en) * | 2005-10-21 | 2010-12-28 | Mediatek Inc. | Method and device for error analysis of optical disc |
-
2006
- 2006-08-16 US US11/464,845 patent/US20070147517A1/en not_active Abandoned
- 2006-08-16 US US11/464,850 patent/US7738325B2/en active Active
- 2006-12-07 TW TW095145648A patent/TWI342162B/en not_active IP Right Cessation
- 2006-12-25 CN CNB2006101699324A patent/CN100539697C/en not_active Expired - Fee Related
Patent Citations (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US5539663A (en) * | 1993-11-24 | 1996-07-23 | Intel Corporation | Process, apparatus and system for encoding and decoding video signals using temporal filtering |
| US5689313A (en) * | 1994-03-24 | 1997-11-18 | Discovision Associates | Buffer management in an image formatter |
| US6295321B1 (en) * | 1997-12-29 | 2001-09-25 | Lg Electronics Inc. | Video decoding method, video decoder and digital TV system using the video decoding method and video decoder |
| US6806880B1 (en) * | 2000-10-17 | 2004-10-19 | Microsoft Corporation | System and method for efficiently controlling a graphics rendering pipeline |
Cited By (21)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20080316217A1 (en) * | 2007-06-25 | 2008-12-25 | Microsoft Corporation | Hard/Soft Frame Latency Reduction |
| US8421842B2 (en) * | 2007-06-25 | 2013-04-16 | Microsoft Corporation | Hard/soft frame latency reduction |
| US8971241B2 | 2008-09-30 | 2015-03-03 | Qualcomm Incorporated | Techniques for supporting relay operation in wireless communication systems |
| US20100080166A1 (en) * | 2008-09-30 | 2010-04-01 | Qualcomm Incorporated | Techniques for supporting relay operation in wireless communication systems |
| WO2010038197A1 (en) * | 2008-09-30 | 2010-04-08 | Nxp B.V. | Profile for frame rate conversion |
| US20110234894A1 (en) * | 2008-09-30 | 2011-09-29 | Trident Microsystems, Inc. | Profile for frame rate conversion |
| US10362264B2 (en) | 2008-09-30 | 2019-07-23 | Entropic Communications, Llc | Profile for frame rate conversion |
| US10075670B2 (en) | 2008-09-30 | 2018-09-11 | Entropic Communications, Llc | Profile for frame rate conversion |
| US9294219B2 (en) * | 2008-09-30 | 2016-03-22 | Qualcomm Incorporated | Techniques for supporting relay operation in wireless communication systems |
| US20100097978A1 (en) * | 2008-10-20 | 2010-04-22 | Qualcomm Incorporated | Data transmission via a relay station in a wireless communication system |
| US9203564B2 (en) | 2008-10-20 | 2015-12-01 | Qualcomm Incorporated | Data transmission via a relay station in a wireless communication system |
| US20120194487A1 (en) * | 2011-01-27 | 2012-08-02 | Wolfgang Roethig | Master Synchronization for Multiple Displays |
| US8754828B2 (en) | 2011-01-27 | 2014-06-17 | Apple Inc. | Master synchronization for multiple displays |
| US8669970B2 (en) * | 2011-01-27 | 2014-03-11 | Apple Inc. | Master synchronization for multiple displays |
| CN103177001A (en) * | 2011-12-21 | 2013-06-26 | 腾讯科技(深圳)有限公司 | Method and device for achieving smooth switching of pictures |
| US20140341307A1 (en) * | 2013-05-20 | 2014-11-20 | Playcast Media Systems, Ltd. | Overcoming lost ip packets in streaming video in ip networks |
| US9407923B2 * | 2013-05-20 | 2016-08-02 | Gamefly Israel Ltd. | Overcoming lost IP packets in streaming video in IP networks |
| US10771821B2 (en) | 2013-05-20 | 2020-09-08 | Electronic Arts Inc. | Overcoming lost IP packets in streaming video in IP networks |
| US10937379B2 (en) * | 2017-05-12 | 2021-03-02 | Beijing Boe Optoelectronics Technology Co., Ltd. | Display device and driving method thereof |
| US10291936B2 (en) | 2017-08-15 | 2019-05-14 | Electronic Arts Inc. | Overcoming lost or corrupted slices in video streaming |
| US10694213B1 (en) | 2017-08-15 | 2020-06-23 | Electronic Arts Inc. | Overcoming lost or corrupted slices in video streaming |
Also Published As
| Publication number | Publication date |
|---|---|
| TW200729965A (en) | 2007-08-01 |
| TWI342162B (en) | 2011-05-11 |
| CN100539697C (en) | 2009-09-09 |
| US7738325B2 (en) | 2010-06-15 |
| CN1992897A (en) | 2007-07-04 |
| US20070150650A1 (en) | 2007-06-28 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US20070147517A1 (en) | Video processing system capable of error resilience and video processing method for same | |
| US7342967B2 (en) | System and method for enhancing performance of personal video recording (PVR) functions on hits digital video streams | |
| US8275247B2 (en) | Method and apparatus for normal reverse playback | |
| US10382830B2 (en) | Trick play in digital video streaming | |
| US20100118941A1 (en) | Frame accurate switching | |
| US20080155586A1 (en) | Method and device for processing video stream in digital video broadcasting systems | |
| US8170375B2 (en) | Image processing apparatus and method for controlling the same | |
| US8811483B2 (en) | Video processing apparatus and method | |
| US9531983B2 (en) | Decoding interdependent frames of a video for display | |
| US20070098072A1 (en) | Command packet system and method supporting improved trick mode performance in video decoding systems | |
| US7848410B2 (en) | Video decoding methods and devices | |
| US20100086280A1 (en) | Method for smoothly playing a video stream in reverse | |
| JP2008532452A (en) | Buffering video stream data | |
| CN103139641A (en) | Method and device for achieving audio/video seamless switching in real-time digital television time shifting playing | |
| KR20080076079A (en) | Digital broadcast reproduction method and apparatus, digital broadcast recording method | |
| US20100061697A1 (en) | Motion picture decoding method, motion picture decoding device, and electronic apparatus | |
| CN102065320A (en) | Method and equipment for processing trick playing command related to transport stream (TS) code stream | |
| US20100008642A1 (en) | Video apparatus and method thereof | |
| JP2008311784A (en) | Moving picture decoding apparatus and moving picture decoding method | |
| US20090304089A1 (en) | Reproduction processing apparatus, reproduction processing method, and computer program | |
| US7512325B2 (en) | Method and apparatus for MPEG video processing | |
| JP4769268B2 (en) | MPEG video decoder and MPEG video decoding method | |
| US20110299591A1 (en) | Video processing apparatus and method | |
| US8122476B2 (en) | System and method for reducing interlace artifacts during trick mode playback | |
| KR101161604B1 (en) | Method for controlling lip synchronization of video streams and apparatus therefor |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| AS | Assignment |
Owner name: MEDIATEK INC., TAIWAN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:HU, SHIH-CHANG;REEL/FRAME:018115/0931 Effective date: 20060720 |
|
| STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |