
WO2004057535A1 - Enhancing video images depending on prior image enhancements - Google Patents

Enhancing video images depending on prior image enhancements

Info

Publication number
WO2004057535A1
Authority
WO
WIPO (PCT)
Prior art keywords
frame
regions
video
mapping
motion vectors
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/IB2003/005966
Other languages
English (en)
Inventor
Richard Chi-Te Shen
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Koninklijke Philips NV
Original Assignee
Koninklijke Philips Electronics NV
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Family has litigation
First worldwide family litigation filed litigation Critical https://patents.darts-ip.com/?family=32682192&utm_source=google_patent&utm_medium=platform_link&utm_campaign=public_patent_search&patent=WO2004057535(A1) "Global patent litigation dataset” by Darts-ip is licensed under a Creative Commons Attribution 4.0 International License.
Application filed by Koninklijke Philips Electronics NV filed Critical Koninklijke Philips Electronics NV
Priority to JP2004561846A priority Critical patent/JP2006511160A/ja
Priority to EP03813681A priority patent/EP1579387A1/fr
Priority to AU2003303269A priority patent/AU2003303269A1/en
Priority to US10/537,889 priority patent/US8135073B2/en
Publication of WO2004057535A1 publication Critical patent/WO2004057535A1/fr
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/50Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding
    • H04N19/503Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding involving temporal prediction
    • H04N19/51Motion estimation or motion compensation
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/10Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
    • H04N19/134Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the element, parameter or criterion affecting or controlling the adaptive coding
    • H04N19/136Incoming video signal characteristics or properties
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/10Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
    • H04N19/102Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the element, parameter or selection affected or controlled by the adaptive coding
    • H04N19/117Filters, e.g. for pre-processing or post-processing
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/10Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
    • H04N19/134Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the element, parameter or criterion affecting or controlling the adaptive coding
    • H04N19/157Assigned coding mode, i.e. the coding mode being predefined or preselected to be further used for selection of another element or parameter
    • H04N19/159Prediction type, e.g. intra-frame, inter-frame or bidirectional frame prediction
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/10Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
    • H04N19/169Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding
    • H04N19/17Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding the unit being an image region, e.g. an object
    • H04N19/172Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding the unit being an image region, e.g. an object the region being a picture, frame or field
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/60Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using transform coding

Definitions

  • the invention relates to the field of video image processing and more specifically to enhancing subsequent images of a video stream in which frames are encoded based on previous frames using prediction and motion estimation.
  • a video stream containing encoded frame based video information is received.
  • the video stream includes an encoded first frame and an encoded second frame.
  • the encoding of the second frame depends on the encoding of the first frame. More specifically, the encoding of the second frame includes motion vectors indicating differences in positions between regions of the second frame and corresponding regions of the first frame; the motion vectors define the correspondence between regions of the second frame and regions of the first frame.
  • the first frame is decoded and a re-mapping strategy for video enhancement of the decoded first frame is determined using a region-based analysis. Regions of the decoded first frame are re-mapped according to the determined video enhancement re-mapping strategy for the first frame so as to enhance the first frame.
  • the motion vectors for the second frame are recovered from the video stream and the second frame is decoded. Then regions of the second frame, that correspond to regions of the first frame, are re-mapped using the video enhancing, region-based, re-mapping strategy for the regions of the first frame so as to enhance the second frame.
  • one or more regions of the second frame are selected depending on whether a similarity criterion is met for a similarity between the regions of the second frame and corresponding regions of the first frame. Then the re-mapping of the regions of the second frame based on the video enhancing region-based re-mapping strategy for the first frame is only performed for the selected regions of the second frame. Limiting the reuse of the video enhancing re-mapping strategy of previous frames to only regions of the subsequent frames that are sufficiently similar to the previous frame increases the likelihood that the subsequent frame will be enhanced.
  • a set top box using the decoder of the invention provides enhanced video pictures with minimal additional hardware costs.
  • Using the decoder of the invention for a video disc player allows higher compression of groups of pictures on the video disc with the same perceived quality.
  • a television that uses the decoder of the invention can display higher quality video pictures or utilize a more highly compressed video signal while providing the same quality as a less compressed signal. Additional aspects and advantages of the invention will become readily apparent to those skilled in the art from the detailed description below with reference to the following drawings.
  • Figure 1 illustrates an example method of the invention for region-based enhancing of subsequent video images.
  • Figure 2 shows portions of an example decoder of the invention for providing region-based enhanced subsequent video images.
  • Figure 3 shows portions of an example set top box using the decoder of figure 2.
  • Figure 4 illustrates portions of an example DVD player using the decoder of figure 2.
  • Figure 5 shows portions of an example television using the decoder of figure 2.
  • Figure 1 shows a specific embodiment 100 of the method of the invention.
  • a video stream is received.
  • the stream contains encoded information for groups of pictures (GOPs).
  • the first picture in the GOP is an intra-coded frame (I-frame) and subsequent pictures in the GOP are non-I-frames.
  • the decoding of the subsequent non-I-frames depends on the coding of the I-frame.
  • the video stream may be, for example, an MPEG II stream of packets, in which case, the non-I-frames may be, for example, predicted frames (P-frames), and/or bi-directional frames (B-frames).
  • any other type of GOP based video stream may be used as long as it contains subsequent frames that are encoded based on previous frames.
  • the I-frame is decoded. Decoding of I-frames is well known in the art.
  • a re-mapping strategy for re-mapping the intensity values to adjust the contrast is determined so as to enhance the decoded I-frame.
  • the re-mapping strategy may use a region-based intensity analysis. Methods of determining re-mapping strategies for regions of decoded frames using such analysis are well known, and those skilled in the art are directed to US6259472 and US5862254 which disclose such re-mapping of intensity values.
  • the intensity values of the decoded I-frame are re-mapped according to the determined re-mapping strategy (see the first sketch at the end of this list).
  • motion vectors for the subsequent non-I-frame are recovered from the video stream as is well known in the art.
  • motion vectors are differences in the positions between regions in an I-frame and corresponding regions in a non-I-frame that is coded dependent on the I-frame.
  • the regions may be regions of similar intensity or regions of similar texture or any other predefined similarity between frames may be used to define regions.
  • DC coefficients for the subsequent non-I-frame are recovered from the video stream as is well known in the art.
  • the DC coefficients are the differences between the values of the image blocks of the I-frame and the predicted values of corresponding image blocks of the non-I-frame, after motion estimation.
  • Motion estimation is generally the re-mapping of the regions depending on the motion vectors during decoding.
  • the intensity values in the regions of the non-I-frame are re-mapped depending on the re-mapping strategy of corresponding regions of the I-frame so as to adjust the contrast to enhance the non-I-frame (see the second sketch at the end of this list).
  • the correspondence between the regions is determined from the motion vectors.
  • If a region of the subsequent non-I-frame is more similar to the corresponding region of the I-frame (on which the decoding of the non-I-frame depends), then it is more likely that using the re-mapping strategy, developed for re-mapping the intensity values of the corresponding I-frame region, for re-mapping the intensity values for the non-I-frame region, will enhance the non-I-frame.
  • If the region of the non-I-frame is substantially different from the corresponding region of the I-frame, then using the intensity value re-mapping strategy for the corresponding region of the I-frame, for re-mapping the intensity values of the region of the non-I-frame, is not likely to enhance the non-I-frame, and in fact may even reduce the quality of the non-I-frame.
  • any region-based video processing for improving the quality of an I-frame can be applied to corresponding regions of subsequent non-I-frames in a similar manner.
  • the re-mapping of the intensity values of the non-I-frame may also depend on the DC coefficients of the blocks of the regions of the non-I-frame on which the decoding of the non-I-frame depends.
  • small values of the DC coefficients for a region indicate that the region is likely to be similar to the corresponding region in the I-frame after motion compensation.
  • Otherwise, the re-mapping strategy for the intensity values of the I-frame is not used to re-map the intensity values of the non-I-frame.
  • A threshold for the DC coefficients can be a constant predetermined value or a variable value calculated for each region.
  • The I-frame re-mapping strategy is then used to re-map the intensity values for a region only when the values of the DC coefficients are below the threshold.
  • Those skilled in the art can easily determine either a standard predetermined DC coefficient threshold for regions in a frame, or a method to calculate a DC coefficient threshold for each region in a frame, that can be used to enhance the frames.
  • a useful DC coefficient threshold can be determined, for example, by a simple trial and error process of comparing frames in which different thresholds or threshold calculation methods have been applied.
  • the re-mapping of the intensity values of the non-I-frame may also depend on the properties of the motion vectors.
  • the motion vectors are used to identify regions of the subsequent non-I-frame, that correspond to regions of the I-frame, in a process called motion compensation.
  • the properties of the motion vectors can also be used to determine the likelihood that the regions of the non-I-frame are similar to the corresponding regions of the I-frame.
  • Each motion vector has a value and a direction. Relationships between the motion vectors of neighboring regions include differences in values and differences in direction called orthogonality.
  • small values for the motion vector for a region, small differences between motion vector values of a region and its neighboring regions, and small differences between motion vector directions of a region and its neighboring regions each indicate that the region is more likely to be similar to the corresponding I-frame region.
  • a small value of the motion vector for a region of the non-I-frame indicates that the region is more likely to be similar to the corresponding region in the I-frame on which its decoding depends.
  • a threshold for motion vector values can be a constant predetermined value or a variable value calculated for each region.
  • the I-frame re-mapping strategy is used to re-map the intensity values for regions only if the values of the respective motion vectors for those regions are below the threshold.
  • Those skilled in the art can easily determine either a standard predetermined motion vector value threshold for regions in the non-I-frame, or a method to calculate a motion vector value threshold for regions in a non-I-frame, that can be used to enhance the non-I-frames.
  • a useful motion vector value threshold can be determined, for example, by a simple trial and error process of comparing non-I-frames in which different thresholds or threshold calculation methods have been applied.
  • Consistency in the values of the motion vectors between a region of the non-I-frame and its neighboring regions in the non-I-frame indicates that the region is more likely to be similar to the corresponding region in the I-frame on which the decoding of the non-I-frame depends.
  • Otherwise, the re-mapping strategy for the intensity values of the I-frame is not used to re-map the intensity values of the non-I-frame. This determination can be made, for example, by determining the average difference between the values of motion vectors for regions and the values of the motion vectors of neighboring regions, and then comparing the average differences in value to a value consistency threshold.
  • The value consistency threshold can be a constant predetermined value or a variable value calculated for each region. Then the I-frame re-mapping strategy is used to re-map the intensity values for regions only if the average differences in the values of the motion vectors are below the value consistency threshold.
  • Squares of the value differences or other combinations of the value differences or other well-known statistical approaches could be used to determine value consistency.
  • those skilled in the art can easily determine either a standard predetermined value consistency threshold for regions in a frame, or a method to calculate value consistency thresholds for regions in a non-I-frame, that can be used to enhance the non-I-frames.
  • a useful value consistency threshold may be determined, for example, by a simple trial and error process of comparing different non-I-frames in which different respective value consistency thresholds or threshold calculation methods have been applied.
  • Consistency of motion vector direction between a region and neighboring regions in the non-I-frame indicates that the non-I-frame region is more likely to be similar to the corresponding region in the I-frame on which its decoding depends.
  • Otherwise, the re-mapping strategy for the intensity values of the I-frame for the region is not used to re-map the intensity values of the non-I-frame region. This can be determined, for example, by determining the average difference between the directions of motion vectors for regions and the directions of the motion vectors of neighboring regions, and then comparing the average differences in direction to a direction consistency threshold.
  • The direction consistency threshold can be a constant predetermined value or a variable value calculated for each region. Then the I-frame re-mapping strategy is used to re-map the intensity values for regions only if the average differences in the directions of the motion vectors are below the direction consistency threshold. Similarly, squares of the direction differences or other combinations of the direction differences or other well-known statistical approaches could be used. Again, those skilled in the art can easily determine either a predetermined value for the direction consistency threshold or a method to calculate such a threshold for each region in a frame, that can be used to enhance the non-I-frames. A useful direction consistency threshold or method to calculate such a threshold can easily be determined, for example, by a simple trial and error process of comparing frames in which different thresholds or threshold calculation methods are applied.
  • Multiple indications of similarity may be applied to determine whether to apply the re-mapping strategy of an I-frame to a subsequent non-I-frame whose decoding depends on the I-frame.
  • Those skilled in the art will know how to develop functions that combine multiple indications of similarity to determine whether to apply the I-frame re-mapping strategy to the non-I-frame. For example, they can use the I-frame contrast re-mapping strategy only when all the indications of similarity meet respective threshold requirements.
  • Alternatively, they can determine the differences or relative differences between the indications of similarity and their respective thresholds, and only apply the I-frame contrast re-mapping strategy to the non-I-frame when the total of the differences or relative differences (or of the squares of the differences or relative differences) is below a further threshold.
  • Those skilled in the art will know how to apply this process to more complex dependencies between frames such as subsequent non-I-frames, whose decoding is dependent on previous non-I-frames, whose decoding is dependent on I-frames. They can, for example, just apply the contrast enhancing re-mapping strategy of the I-frame to such subsequent non-I-frames. Alternatively, they can, for example, develop a second contrast enhancing re-mapping strategy for the previous non-I-frame, which can be applied to the subsequent non-I-frame.
  • FIG. 2 illustrates the basic components of a video decoder 120 of the invention.
  • a video stream of packets containing a group of pictures (GOP) is received at an input 122, the first picture in the GOP is an I-frame and a subsequent picture in the GOP is a non-I-frame.
  • the video stream may be an MPEG stream as described above.
  • a decoding unit 124 decodes the frames of the GOP.
  • the decoding unit provides the decoded I-frame to a buffer 126 and to a processing unit 128.
  • Processor 128 uses a region-based intensity analysis to determine a strategy to remap intensity values to change contrast to enhance the I-frame image, and re-maps the intensity values of the I-frame in buffer 126 using the re-mapping strategy.
  • the buffer then passes the contrast enhanced I-frame to output 132 through summation unit 130.
  • the decoding unit recovers the DC coefficients and the motion vectors for the subsequent non-I-frames of the GOP and supplies them to the buffer 126 and processor 128.
  • Processor 128 re-maps the original I-frame and the contrast enhanced I-frame according to the motion vectors.
  • the decoding unit provides the decoded differences between the I-frame and the subsequent non-I-frames to the summation unit 130.
  • the buffer 126 supplies either the motion vector re-mapped I-frame or motion vector re-mapped contrast enhanced I-frame to summation unit 130.
  • the summation unit combines the decoded differences and the re-mapped enhanced I-frame together to produce the decoded subsequent non-I-frame (see the fourth sketch at the end of this list).
  • The selection criterion in this specific example for a region is as follows (see the third sketch at the end of this list): DC < T1; and MVV < T2; and MVS < T3; and MVO < T4; and a1(DC-T1)² + a2(MVV-T2)² + a3(MVS-T3)² + a4(MVO-T4)² < T5
  • DC is the value of the DC coefficients for the region
  • MVV is the motion vector value for the region
  • MVS is the average difference between the value of the motion vector and the value of the motion vectors of the regions above, below, and to each side of the region
  • MVO is the orthogonality of the motion vector for the region with respect to the motion vectors of the regions that border the region
  • T1-T5 are predetermined thresholds
  • a1-a4 are constants.
  • Figure 3 shows a set top box 140 of the invention.
  • Tuner 142 selects a video stream for a video program from among multiple streams for several different video programs provided at input 144.
  • the video decoder 120 of figure 2 decodes the video program and provides the decoded program to output 146 which can be directed to a video display e.g. a television set.
  • Figure 4 illustrates a DVD player 150 of the invention.
  • the video player has a motor 152 for rotating a video disc 154.
  • a laser 156 produces a radiation beam 158.
  • a servo 160 controls the position of an optical system 162 to scan an information layer of the video disc with a focused spot of the radiation beam.
  • the information layer affects the beam and reflects or transmits the beam to a radiation detector 164 for detecting the beam after it has been affected by the information layer.
  • Processor 166 controls the servo and motor and produces a video stream containing encoded information for a group of pictures (GOP) depending on the detection. Then the video decoder of figure 2 decodes the video stream and supplies the decoded video stream to an output 168 for connection to a display.
  • the processor 166 can be the same processor 128 as in the decoder of figure 2 or an additional processor can be provided as shown.
  • Figure 5 shows a television 200 of the invention.
  • a tuner 142 selects a video stream of a video program to be played from a plurality of video streams for respective video programs provided to input 144.
  • the decoder 120 of figure 2 decodes the selected video program and provides it to display 206.
  • the television may have components of the DVD player of figure 4 for playing stored video programs (or recording programs) using the DVD components.
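
The following illustrative sketches are not part of the patent disclosure; they simply restate the steps described above in runnable form. This first sketch shows, in Python with NumPy, one possible region-based re-mapping strategy for a decoded I-frame: each block-shaped region receives its own intensity lookup table derived by plain histogram equalization. The block size, the choice of histogram equalization (the cited patents describe more elaborate histogram-based methods) and all function names are assumptions made for illustration. The per-region tables are returned so that subsequent frames can reuse them.

    import numpy as np

    def region_remap_strategy(region):
        """Derive a 256-entry intensity lookup table for one region by
        histogram equalization (one illustrative region-based analysis;
        the patent leaves the exact strategy open)."""
        hist, _ = np.histogram(region, bins=256, range=(0, 256))
        cdf = np.cumsum(hist).astype(np.float64)
        cdf /= cdf[-1]                              # normalize to [0, 1]
        return np.round(cdf * 255.0).astype(np.uint8)

    def enhance_i_frame(frame, block=16):
        """Split a decoded 8-bit grayscale I-frame into block x block regions,
        determine a re-mapping table per region, apply it, and return both the
        contrast-enhanced frame and the per-region tables for later reuse."""
        enhanced = frame.copy()
        tables = {}
        height, width = frame.shape
        for y in range(0, height, block):
            for x in range(0, width, block):
                region = frame[y:y + block, x:x + block]
                lut = region_remap_strategy(region)
                tables[(y // block, x // block)] = lut
                enhanced[y:y + block, x:x + block] = lut[region]
        return enhanced, tables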
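
This second sketch reuses the I-frame tables on a decoded non-I-frame, as described above: a region is re-mapped with the table of the I-frame region that its motion vector points back to, and only when a supplied similarity test accepts the region; all other regions pass through unchanged. The dictionary layout of the motion vectors, the pixel-unit vector convention and the should_reuse callback are assumptions made for illustration.

    import numpy as np

    def enhance_non_i_frame(decoded, motion_vectors, i_frame_tables,
                            should_reuse, block=16):
        """Re-map only those regions of a decoded non-I-frame that a
        similarity test accepts; other regions are left unchanged.

        decoded        : 8-bit grayscale non-I-frame (H x W array)
        motion_vectors : {(block_row, block_col): (dx, dy)} in pixels, pointing
                         from the non-I-frame region back to the I-frame
        i_frame_tables : per-region lookup tables built for the I-frame
        should_reuse   : callable taking (block_row, block_col), returning True
                         when the region is judged similar enough
        """
        out = decoded.copy()
        for (row, col), (dx, dy) in motion_vectors.items():
            if not should_reuse((row, col)):
                continue                            # region too different: skip
            # find the I-frame region the motion vector refers to
            src = (row + int(round(dy)) // block, col + int(round(dx)) // block)
            lut = i_frame_tables.get(src)
            if lut is None:
                continue
            y, x = row * block, col * block
            out[y:y + block, x:x + block] = lut[decoded[y:y + block, x:x + block]]
        return out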
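
This third sketch is one possible reading of the example selection criterion given for the decoder of Figure 2: a region reuses the I-frame re-mapping only if its DC measure and its motion vector value (MVV), value spread (MVS) and orthogonality (MVO) all stay below thresholds T1-T4, and the weighted sum of squared margins stays below T5. Treating DC as a single scalar per region and measuring orthogonality as an average angular difference are assumptions; the thresholds and weights are "predetermined" in the patent and are therefore left as parameters.

    import math

    def mv_value(mv):
        """MVV: magnitude of a motion vector given as (dx, dy)."""
        return math.hypot(mv[0], mv[1])

    def mv_value_spread(mv, neighbours):
        """MVS: average difference between this region's motion vector
        magnitude and the magnitudes of its neighbouring regions."""
        v = mv_value(mv)
        return sum(abs(v - mv_value(n)) for n in neighbours) / len(neighbours)

    def mv_orthogonality(mv, neighbours):
        """MVO: average angular difference (radians) between this region's
        motion vector direction and its neighbours' directions."""
        angle = math.atan2(mv[1], mv[0])
        total = 0.0
        for n in neighbours:
            diff = abs(angle - math.atan2(n[1], n[0]))
            total += min(diff, 2.0 * math.pi - diff)    # wrap into [0, pi]
        return total / len(neighbours)

    def reuse_i_frame_strategy(dc, mv, neighbours, thresholds, weights):
        """Return True when the region should reuse the I-frame re-mapping.
        thresholds = (T1, T2, T3, T4, T5); weights = (a1, a2, a3, a4)."""
        t1, t2, t3, t4, t5 = thresholds
        a1, a2, a3, a4 = weights
        mvv = mv_value(mv)
        mvs = mv_value_spread(mv, neighbours)
        mvo = mv_orthogonality(mv, neighbours)
        if not (dc < t1 and mvv < t2 and mvs < t3 and mvo < t4):
            return False
        score = (a1 * (dc - t1) ** 2 + a2 * (mvv - t2) ** 2 +
                 a3 * (mvs - t3) ** 2 + a4 * (mvo - t4) ** 2)
        return score < t5

A should_reuse callback for the previous sketch can be obtained by binding this predicate to the per-region DC values and motion vectors recovered from the stream.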
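
This last sketch mirrors, under the same assumptions, the data flow around summation unit 130 of Figure 2 when a subsequent non-I-frame is reconstructed: for each region, the motion-compensated block is taken from the contrast-enhanced I-frame when the region passed the similarity test and from the original I-frame otherwise, and the decoded residual differences are then added back.

    import numpy as np

    def reconstruct_non_i_frame(reference, enhanced_reference, residual,
                                motion_vectors, reuse_mask, block=16):
        """Summation-unit step of the example decoder: predict each region
        from the (enhanced or original) buffered I-frame using its motion
        vector, then add the decoded residual differences.  Motion vectors
        are assumed to stay inside the frame."""
        out = np.zeros(residual.shape, dtype=np.uint8)
        for (row, col), (dx, dy) in motion_vectors.items():
            y, x = row * block, col * block
            sy, sx = y + int(round(dy)), x + int(round(dx))
            # buffer 126 supplies either the enhanced or the original I-frame
            source = enhanced_reference if reuse_mask.get((row, col)) else reference
            predicted = source[sy:sy + block, sx:sx + block]
            out[y:y + block, x:x + block] = np.clip(
                predicted.astype(np.int16) + residual[y:y + block, x:x + block],
                0, 255).astype(np.uint8)
        return out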

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Compression Or Coding Systems Of Tv Signals (AREA)
  • Television Signal Processing For Recording (AREA)

Abstract

A video stream contains encoded frame-based video information including a first frame and a second frame. The encoding of the second frame is based on the encoding of the first frame. The encoding includes motion vectors indicating differences in positions between regions of the second frame and corresponding regions of the first frame, the motion vectors defining the correspondence between regions of the second frame and regions of the first frame. The first frame is decoded and a re-mapping strategy for video enhancement of the decoded first frame is determined using a region-based analysis. Regions of the decoded first frame are re-mapped according to the determined video-enhancement re-mapping strategy so as to enhance the first frame. The motion vectors for the second frame are recovered from the video stream and the second frame is decoded.
PCT/IB2003/005966 2002-12-19 2003-12-12 Amelioration d'images video basee sur des ameliorations d'images prealables Ceased WO2004057535A1 (fr)

Priority Applications (4)

Application Number Priority Date Filing Date Title
JP2004561846A JP2006511160A (ja) 2002-12-20 2003-12-12 先行する画像強化に依存したビデオ画像強化
EP03813681A EP1579387A1 (fr) 2002-12-20 2003-12-12 Amelioration d'images video basee sur des ameliorations d'images prealables
AU2003303269A AU2003303269A1 (en) 2002-12-20 2003-12-12 Enhancing video images depending on prior image enhancements
US10/537,889 US8135073B2 (en) 2002-12-19 2003-12-12 Enhancing video images depending on prior image enhancements

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US43523702P 2002-12-20 2002-12-20
US60/435,237 2002-12-20

Publications (1)

Publication Number Publication Date
WO2004057535A1 true WO2004057535A1 (fr) 2004-07-08

Family

ID=32682192

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2003/005966 Ceased WO2004057535A1 (fr) 2002-12-19 2003-12-12 Amelioration d'images video basee sur des ameliorations d'images prealables

Country Status (6)

Country Link
EP (1) EP1579387A1 (fr)
JP (1) JP2006511160A (fr)
KR (1) KR20050084311A (fr)
CN (1) CN1333373C (fr)
AU (1) AU2003303269A1 (fr)
WO (1) WO2004057535A1 (fr)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8135073B2 (en) 2002-12-19 2012-03-13 Trident Microsystems (Far East) Ltd Enhancing video images depending on prior image enhancements
US10150778B2 (en) 2012-09-28 2018-12-11 Takeda Pharmaceutical Company Limited Production method of thienopyrimidine derivative

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5673032B2 (ja) * 2010-11-29 2015-02-18 ソニー株式会社 画像処理装置、表示装置、画像処理方法及びプログラム
US8768069B2 (en) * 2011-02-24 2014-07-01 Sony Corporation Image enhancement apparatus and method
CN104683798B (zh) * 2013-11-26 2018-04-27 扬智科技股份有限公司 镜射影像编码方法及其装置、镜射影像解码方法及其装置
CN106954055B (zh) * 2016-01-14 2018-10-16 掌赢信息科技(上海)有限公司 一种视频亮度调节方法和电子设备

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5862254A (en) * 1996-04-10 1999-01-19 Samsung Electronics Co., Ltd. Image enhancing method using mean-matching histogram equalization and a circuit therefor
US6157396A (en) * 1999-02-16 2000-12-05 Pixonics Llc System and method for using bitstream information to process images for use in digital display systems
US6259472B1 (en) * 1996-06-20 2001-07-10 Samsung Electronics Co., Ltd. Histogram equalization apparatus for contrast enhancement of moving image and method therefor
US6385248B1 (en) * 1998-05-12 2002-05-07 Hitachi America Ltd. Methods and apparatus for processing luminance and chrominance image data
US20030206591A1 (en) * 2002-05-06 2003-11-06 Koninklijke Philips Electronics N.V. System for and method of sharpness enhancement for coded digital video

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3630590B2 (ja) * 1999-08-25 2005-03-16 沖電気工業株式会社 復号化装置及び伝送システム
JP2003348488A (ja) * 2002-05-30 2003-12-05 Canon Inc 画像表示システム及び画像表示方法

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5862254A (en) * 1996-04-10 1999-01-19 Samsung Electronics Co., Ltd. Image enhancing method using mean-matching histogram equalization and a circuit therefor
US6259472B1 (en) * 1996-06-20 2001-07-10 Samsung Electronics Co., Ltd. Histogram equalization apparatus for contrast enhancement of moving image and method therefor
US6385248B1 (en) * 1998-05-12 2002-05-07 Hitachi America Ltd. Methods and apparatus for processing luminance and chrominance image data
US6157396A (en) * 1999-02-16 2000-12-05 Pixonics Llc System and method for using bitstream information to process images for use in digital display systems
US20030206591A1 (en) * 2002-05-06 2003-11-06 Koninklijke Philips Electronics N.V. System for and method of sharpness enhancement for coded digital video

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
ATZORI L ET AL: "Post-processing for real-time quality enhancement of MPEG-coded video sequences", MULTIMEDIA AND EXPO, 2000. ICME 2000. 2000 IEEE INTERNATIONAL CONFERENCE ON NEW YORK, NY, USA 30 JULY-2 AUG. 2000, PISCATAWAY, NJ, USA,IEEE, US, 30 July 2000 (2000-07-30), pages 975 - 978, XP010513172, ISBN: 0-7803-6536-4 *
BOYCE J ET AL: "LOW-COST ALL FORMAT ATV DECODING WITH IMPROVED QUALITY", SMPTE ADVANCED MOTION IMAGING CONFERENCE, XX, XX, 2 February 1996 (1996-02-02), pages 45 - 51, XP001006067 *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8135073B2 (en) 2002-12-19 2012-03-13 Trident Microsystems (Far East) Ltd Enhancing video images depending on prior image enhancements
US10150778B2 (en) 2012-09-28 2018-12-11 Takeda Pharmaceutical Company Limited Production method of thienopyrimidine derivative

Also Published As

Publication number Publication date
CN1333373C (zh) 2007-08-22
EP1579387A1 (fr) 2005-09-28
AU2003303269A1 (en) 2004-07-14
KR20050084311A (ko) 2005-08-26
JP2006511160A (ja) 2006-03-30
CN1729482A (zh) 2006-02-01

Similar Documents

Publication Publication Date Title
US8135073B2 (en) Enhancing video images depending on prior image enhancements
US20050243920A1 (en) Image encoding/decoding device, image encoding/decoding program and image encoding/decoding method
US5883674A (en) Method and apparatus for setting a search range for detecting motion vectors utilized for encoding picture data
JP3633159B2 (ja) 動画像信号符号化方法及び装置、並びに動画像信号伝送方法
US8045611B2 (en) Video processing and recording apparatuses and methods
US20050141613A1 (en) Editing of encoded a/v sequences
JP2009535881A (ja) エンコード/トランスコード及びデコードのための方法及び装置
EP1579387A1 (fr) Amelioration d'images video basee sur des ameliorations d'images prealables
US7124298B2 (en) Method and device for detecting a watermark
JP3888533B2 (ja) 画像特徴に応じた画像符号化装置
JP2007525920A (ja) ビデオ信号エンコーダ、ビデオ信号プロセッサ、ビデオ信号配信システム及びビデオ信号配信システムの動作方法
JP4295861B2 (ja) トランスコーダ装置
JPH08336135A (ja) 画像圧縮装置及び方法
US7062102B2 (en) Apparatus for re-coding an image signal
JPH10174094A (ja) 映像復号化装置
JP2000261809A (ja) 画像特徴に応じた画像符号化装置
JP4288897B2 (ja) 符号化装置及び方法、プログラム、記録媒体
US6754270B1 (en) Encoding high-definition video using overlapping panels
US20080025408A1 (en) Video encoding
JPH0537786A (ja) 画像データエンコーダおよびデコーダ
US20040101051A1 (en) Image processing apparatus and method for processing motion-picture data and still-image data
WO2005036886A1 (fr) Codage video a deux passes
CN1798346A (zh) 用于再隐藏包括在解码的图像中的错误的方法和设备
JP3407872B2 (ja) 付加情報検出方法及び装置
JPH05344491A (ja) フレーム間予測符号化方式

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): BW GH GM KE LS MW MZ SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LU MC NL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
WWE Wipo information: entry into national phase

Ref document number: 2003813681

Country of ref document: EP

ENP Entry into the national phase

Ref document number: 2006153287

Country of ref document: US

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 10537889

Country of ref document: US

WWE Wipo information: entry into national phase

Ref document number: 1020057010957

Country of ref document: KR

Ref document number: 2004561846

Country of ref document: JP

WWE Wipo information: entry into national phase

Ref document number: 20038A71150

Country of ref document: CN

WWP Wipo information: published in national office

Ref document number: 1020057010957

Country of ref document: KR

WWP Wipo information: published in national office

Ref document number: 2003813681

Country of ref document: EP

WWP Wipo information: published in national office

Ref document number: 10537889

Country of ref document: US