WO2009034419A1 - System and method for characterizing data - Google Patents
System and method for characterizing data
- Publication number
- WO2009034419A1 (PCT/IB2007/053660)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- color
- asset
- frame
- fragment
- mass center
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/60—Analysis of geometric attributes
- G06T7/66—Analysis of geometric attributes of image moments or centre of gravity
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/70—Information retrieval; Database structures therefor; File system structures therefor of video data
- G06F16/78—Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
- G06F16/783—Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using metadata automatically derived from the content
- G06F16/7847—Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using metadata automatically derived from the content using low-level visual features of the video content
- G06F16/785—Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using metadata automatically derived from the content using low-level visual features of the video content using colour or luminescence
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/40—Extraction of image or video features
- G06V10/50—Extraction of image or video features by performing operations within image blocks; by using histograms, e.g. histogram of oriented gradients [HoG]; by summing image-intensity values; Projection analysis
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/40—Scenes; Scene-specific elements in video content
- G06V20/48—Matching video sequences
Definitions
- the present invention relates to data characterization systems, and particularly to data characterization systems using color mass.
- Content referred to herein may comprise video, still photographs, or other appropriate content.
- the present invention seeks to provide an improved method for comparing and characterizing data.
- a method for matching assets including providing a first asset, the first asset including at least one frame, providing a second asset, the second asset including at least one frame, locating a first at least one color mass center, the first at least one color mass center being a color mass center in the first asset for at least one color, locating a second at least one color mass center, the second at least one color mass center being a color mass center in the second asset for said at least one color, comparing the first at least one color mass center and the second at least one color mass center, and determining, based, at least in part, on a result of the comparing, whether the first asset and the second asset represent different instantiations of a single asset and, if so, producing an indication suitable for post-processing, based, at least in part on the determining, that the second asset matches the first asset.
- the comparing includes calculating a second moments matrix for the first at least one color mass center in the first asset, calculating a second moments matrix for the second at least one color mass center in the second asset, and comparing the second moments matrix for the first asset and the second moments matrix for the second asset.
- the comparing further includes calculating a first eigenvalue, denoted λ1, and a second eigenvalue, denoted λ2, for the second moments matrix of the first asset, calculating a third eigenvalue, denoted λ3, and a fourth eigenvalue, denoted λ4, for the second moments matrix of the second asset, comparing the eigenvalues, λ1 and λ2, of the second moments matrix of the first asset with the eigenvalues, λ3 and λ4, of the second moments matrix of the second asset and producing a result.
- the providing the second asset includes capturing the second asset.
- λ1 and λ2 are stored.
- λ1 and λ2 are stored in a database.
- the comparing includes comparing at least λ1/λ2 with λ3/λ4.
- the first asset includes a still picture.
- the first asset includes a video.
- the second asset includes a still picture.
- the second asset includes a video.
- locating the first at least one color mass center for at least one color is performed by performing the following steps: for at least one frame in the first asset, summing all color values for the at least one color for each pixel in the at least one frame, the sum denoted S; for at least one frame in the first asset, weighting each color value for each pixel in the frame by a pixel-associated x-coordinate, such that for each pixel in the at least one frame and for the at least one color, an associated value R(x,y) * x, representing a weighted value, is determined, the value denoted J(x,y); determining an x-coordinate center of mass, denoted X, by dividing J(x,y) by S; for the at least one frame, weighting each color value for each pixel in the at least one frame by a pixel-associated y-coordinate, such that for each pixel in the at least one frame and for the at least one color, an associated value R(x,y) * y, representing a weighted value, is determined, the value denoted K(x,y); and determining a y-coordinate center of mass, denoted Y, by dividing K(x,y) by S.
- the comparing the second moments matrix of the first asset with the second moments matrix of the second asset includes comparing λ1 and λ2 for each at least one color.
- the at least one color includes a Red-Green-Blue color element.
- the at least one color includes a chrominance / luminance color element.
- the chrominance / luminance color element includes a YCbCr chrominance / luminance color element. Additionally in accordance with a preferred embodiment of the present invention the chrominance / luminance color element includes a YPbPr chrominance / luminance color element.
- the chrominance / luminance color element includes a YDbDr chrominance / luminance color element.
- the chrominance / luminance color element includes a xvYCC chrominance / luminance color element. Still further in accordance with a preferred embodiment of the present invention the at least one color includes a gray scale value.
- the post-processing includes identifying a source of the second asset. Moreover in accordance with a preferred embodiment of the present invention and further including billing the source of the second asset.
- a method for matching assets including providing a reference database, the reference database including a plurality of video clips, providing a video asset, the video asset including a plurality of video frames, subdividing the video asset into a plurality of fragments, for at least one fragment among the plurality of fragments, identifying a plurality of candidate fragments included in the reference database, a candidate fragment including at least a portion of one of the plurality of video clips, the plurality of candidate fragments being potential matches to the at least one fragment, comparing each one of the plurality of candidate fragments to the at least one fragment, and identifying, as a result of the comparing, matching assets, and producing an indication suitable for post-processing, based, at least in part on the identifying, that matching assets have been identified.
- the providing a video asset includes capturing a video asset.
- the identifying a plurality of candidate fragments includes performing Discrete Wave Analysis (DWA).
- DWA: Discrete Wave Analysis
- the DWA is performed for each at least one fragment among the plurality of fragments, and each wavelet in the reference database is analyzed.
- each fragment is represented by a discrete signal denoted D(t)
- each candidate fragment is a wavelet of a mother wavelet, denoted Φ(t)
- τ denotes the beginning of an interval
- c and A denote constants
- t ranges from 0.5 seconds to 3 seconds.
- the comparing includes comparing using a correlation function, denoted C.
- C is invariant to affine transformations with positive coefficients.
- candidate fragment as X(i)
- representing candidate fragment as X(i) includes determining a vector including average Luma (Y) for each frame, i, in candidate fragment X.
- the representing fragment as Y(i) includes determining a vector including average Luma (Y) for each frame, i, in fragment Y. Additionally in accordance with a preferred embodiment of the present invention the representing candidate fragment as X(i) includes determining a vector including an average chrominance component for each frame, i, in candidate fragment X. Moreover in accordance with a preferred embodiment of the present invention the representing fragment as Y(i) includes determining a vector including an average chrominance component for each frame, i, in fragment Y.
- the chrominance component includes a CbCr chrominance color element.
- the chrominance component includes a PbPr chrominance color element.
- the chrominance component includes a DbDr chrominance color element.
- the chrominance component includes a xvYCC chrominance color element. Further in accordance with a preferred embodiment of the present invention and including calculating a color mass center for one of a plurality of color elements for each frame, i, in candidate fragment X, deriving, based on the color mass center of each frame, i, in candidate fragment X, a second moments matrix for each frame, i, in candidate fragment X, evaluating each second moments matrix for each frame, i, in candidate fragment X, the evaluating including determining eigenvalues λ1 and λ2, wherein the representing candidate fragment as X(i) includes determining a vector including a plurality of eigenvalues λ1 and λ2.
- calculating a color mass center for one of a plurality of color elements for each frame, i, in candidate fragment X, deriving, based on the color mass center of each frame, i, in candidate fragment X, a second moments matrix for each frame, i, in candidate fragment X, evaluating each second moments matrix for each frame, i, in candidate fragment X, the evaluating including determining eigenvalues λ1 and λ2, wherein the representing candidate fragment as X(i) includes choosing an angle φ between a horizontal axis, Ox, and one eigenvector corresponding to λ1, and determining a vector including a plurality of angles φ.
- the post-processing includes identifying a source of at least one of the matching assets.
- a system for matching assets including a first asset including at least one frame, a second asset including at least one frame, a color mass center locator operative to locate a first at least one color mass center, the first at least one color mass center being a color mass center in the first asset for at least one color, and a second at least one color mass center, the second at least one color mass center being a color mass center in the second asset for said at least one color, a comparator, the comparator comparing the first at least one color mass center and the second at least one color mass center, and a processor operative to determine, based, at least in part, on a result of the comparing, whether the first asset and the second asset represent different instantiations of a single asset and, if so, to produce an indication suitable for post-processing, based, at least in part on the determining, that the second asset matches the first asset.
- a system for matching assets including a reference database, the reference database including a plurality of video clips, a video asset, the video asset including a plurality of video frames, a video fragmenter operative to subdivide the video asset into a plurality of fragments, a first processor operative to identify, for at least one fragment among the plurality of fragments, a plurality of candidate fragments included in the reference database, a candidate fragment including at least a portion of one of the plurality of video clips, the plurality of candidate fragments being potential matches to the at least one fragment, a comparator comparing each one of the plurality of candidate fragments to the at least one fragment, and a second processor operative to identify, as a result of the comparing, matching assets, and to produce an indication suitable for post-processing, based, at least in part on the identifying, that matching assets have been identified.
- FIG. 1 is a simplified illustration of a graphical data characterization and detection system constructed and operative in accordance with a preferred embodiment of the present invention
- FIG. 2 is a simplified illustration of one preferred embodiment of the system of Fig. 1;
- Fig. 3 is a simplified drawing of a typical graphical frame comprising data to be characterized, within the system of Fig. 1;
- Fig. 4 is a depiction of a preferred method of determining a color mass center for one of a plurality of color elements, according to the system of Fig. 1; and Figs. 5 - 8 are simplified flowcharts of preferred methods of operation of the system of Fig. 1.
- Fig. 1 is a simplified pictorial illustration of a graphical data characterization and detection system constructed and operative in accordance with a preferred embodiment of the present invention.
- the system of Fig. 1 comprises a data characterizer 10, a database 20, and a comparison module 30.
- the data characterizer 10 receives an input comprising graphical data 40a.
- the graphical data 40a comprises at least one frame of graphical data.
- the graphical data 40a may comprise video data, and thus comprise a series of frames, depicted as F1, F2, ..., Fn, ....
- the graphical data 40a may comprise a still digital photograph, and thus, the graphical data 40a comprises only a single frame, F1.
- it is noted that although Fig. 1 depicts graphical data 40a (as well as the as yet unintroduced graphical data 40b and graphical data 40c) as comprising a plurality of frames, said depiction is not meant to be limiting, and the graphical data (in all of its various instances: 40a, 40b, and 40c) may, as mentioned previously, comprise only a single frame.
- the graphical data 40a, 40b is also referred to as an "asset".
- the data characterizer 10 performs several operations on the graphical data 40a, as described below in detail, with reference to Fig. 4. As a result of the operations performed by the data characterizer 10 on the graphical data 40a, a second moments matrix characterization 50a is output by the data characterizer 10 to an appropriate storage unit, such as the database 20.
- each graphical frame F1, F2, ..., Fn, ... is associated with a characteristic second moments matrix.
- Each characteristic second moments matrix will have two real and non-negative eigenvalues, hereinafter denoted λ1 and λ2.
- the second moments matrix characterization 50a for the graphical data 40a will be stored in the database 20 with a unique identifier 60a.
- a plurality of second moments matrix characterizations 50 associated with the graphical data 40a are stored in the database 20, with each individual characterization associated with one of a plurality of unique identifiers 60.
- the graphical data 40a remains unchanged as a result of the operations performed by the data characterizer 10 on the graphical data 40a. Nevertheless, for ease of depiction, the graphical data 40a, after the operations performed by the data characterizer 10 is depicted as graphical data 40b.
- a suspect graphical data 40c is captured from some content sharing network 70, a website (not depicted), or via some other appropriate method of distributing data.
- the suspect graphical data 40c is input into the data characterizer 10, in order to derive a second moments matrix characterization 50b associated with the suspect graphical data 40c.
- the comparison module 30 performs a comparison routine on second moments matrix characterization 50a and second moments matrix characterization 50b, in order to determine if the second moments matrix characterization 50a and second moments matrix characterization 50b are the same.
- the comparison module 30 produces an appropriate indication, suitable for post-processing, that the graphical data 40a and the graphical data 40c are different instantiations of a single unit of graphical data 40a, 40c. It is appreciated that typical post-processing actions include, but are not limited to, identifying a source of the suspect graphical data 40c, and possibly thereafter billing and enforcing copyright on the suspect graphical data 40c.
- if the comparison module 30 determines that the graphical data 40a and the graphical data 40c are not different instantiations of a single unit of graphical data 40a, 40c (depicted in Fig. 1 as "Does Not Match" 90), no further action occurs.
- Fig. 2 is a simplified illustration of one preferred embodiment of the system of Fig. 1.
- the second moments matrix characterization 50, 50a, 50b as depicted in Fig. 1 comprises a characterization performed as an eigenvalue characterization 250, 250a, 250b.
- each characteristic second moments matrix will have two real and non-negative eigenvalues, hereinafter denoted λ1 and λ2.
- the eigenvalue characterization 250a for the graphical data 40a will be stored in the database 20 with a unique identifier 60a.
- a plurality of eigenvalue characterizations 250 associated with the graphical data 40a are stored in the database 20, with each individual eigenvalue characterization associated with one of a plurality of unique identifiers 60.
- the suspect graphical data 40c is input into the data characterizer 10, in order to derive an eigenvalue characterization 250b associated with the suspect graphical data 40c.
- the comparison module 30 performs a comparison routine on eigenvalue characterization 250a and eigenvalue characterization 250b, in order to determine if the eigenvalue characterization 250a and eigenvalue characterization 250b are the same.
- the comparison by the comparison module 30 may be done by any appropriate means, including, but not limited to, analyzing a 2-norm distance (also known as "Euclidean distance"), considering Λx and Λy as vectors, where Λx is defined from a series Λ1 and Λ2, Λ1 and Λ2 being the series of eigenvalue characterization 250a and eigenvalue characterization 250b for a plurality of frames F1, F2, ..., Fn.
- the system of Fig. 1 may notify a human operator that the human operator should examine the suspect graphical data 40c with one or more of the closest-matching records in the database. The human operator may then perform a visual comparison to choose the matching database record, or to reject the match.
- Fig. 3 is a simplified drawing of a typical graphical frame comprising data to be characterized, within the system of Fig. 1.
- every frame comprising data which is to be characterized comprises a plurality of pixels.
- Each of the plurality of pixels may be represented as comprising a tuple, the tuple representing a group of color elements in the pixel.
- R,G,B where R stands for red, G stands for green, and B stands for blue.
- each color element of each of the plurality of pixels may be represented as a value between 0 and 255.
- pixel color may alternatively be expressed in any appropriate color space, such as any of the well known Chrominance / Luminance systems (for instance, YCbCr; YPbPr; YDbDr), or according to the xvYCC standard, IEC 61966-2-4.
- Chrominance / Luminance systems for instance, YCbCr; YPbPr; YDbDr
- pixel color is expressed herein, in a non-limiting manner, as an RGB triplet.
- the frame may comprise a black and white picture.
- a pixel may be depicted as comprising only one color element in a gray scale, or any other appropriate color space used to characterize black and white pictures.
- H: frame height in pixels; p = (x, y): a pixel's position relative to the frame center, e.g. the top-left pixel is (-W/2, -H/2).
- Fig. 4 is a depiction of a preferred method of determining a color mass center for one of a plurality of color elements, according to the system of Fig. 1.
- a preferred method of calculating the color mass center is to determine values of X and Y, X and Y being coordinates of the color mass center for one of a plurality of color elements in the typical graphical frame of Fig. 3: X = (Σ m·x) / (Σ m) and Y = (Σ m·y) / (Σ m) (equation 1)
- the second moments matrix for the typical graphical frame of Fig. 3 is A = [[a, b], [c, d]], where a = Σ m(x − X)² / S, b = c = Σ m(x − X)(y − Y) / S, and d = Σ m(y − Y)² / S (equation 2)
- Three such matrices may be determined, one for each color element, R, G, and B.
- the second-moments of color mass matrix is symmetric and positive-definite.
- there will be two real and non-negative eigenvalues of matrix A, denoted λ1 and λ2.
- a ratio of λ1 and λ2 may be used for detection purposes. Since both λ1 and λ2 are divided by the same quantity, in the above example, e: λ'1 / λ'2 = (λ1/e) / (λ2/e) = λ1 / λ2 (equation 5)
- Computing λ1 and λ2 for each color component and each frame in the video produces a series of 6-tuples of real values. To identify any video as a known asset, a similar series must be stored in the database for every known asset.
- Comparison of 6-tuple series can be performed by the comparison module 30, using any appropriate method, for example and without limiting the generality of the foregoing, using accepted vector-distance metrics such as 2-norm distance. Those skilled in the art will appreciate that any best match found can be verified manually by an operator.
- the 6-tuple series is unsynchronized (offset in time by an unknown shift), and thus, direct comparison will not work.
- string matching it is possible to use string matching to find both the corresponding known asset and the time offset.
- actual values comprising the 6-tuple series and a corresponding reference series must be rounded to a pre-determined scale, for example and without limiting the generality of the foregoing, to the nearest integer, or alternatively, to the nearest hundredth.
- the corresponding reference series comprising the records 50 and 60 (Fig. 1), and 250 and 60 (Fig. 2) in the database
- Table 1 below is an exemplary 5 pixel x 5 pixel graphical frame. For each pixel, a red (R), green (G), and blue (B) color value is indicated.
- the pixel located at (x,y) coordinate (2,3) has RGB values of: 236, 100, and 128.
- the sum of each red color value, where the red color value is weighted by the y-coordinate of the pixel (i.e. R(x,y) * y), therefore, is:
- the coordinates (X, Y) are the coordinates of the color mass center.
- (1.88, 2.06) is the red color mass center of the five pixel by five pixel frame described in Table 1.
- Table 5 tabulates the red color value, as given in Table 1, weighted by (x − X) * (y − Y).
- Table 7 tabulates the red color value, as given in Table 1, weighted by (x − X)².
- the values of (x − X) are those given across the top row of Table 5.
- Table 9 tabulates the red color value, as given in Table 1, weighted by (y − Y)².
- the values of (y − Y) are those given down the leftmost column of Table 5.
- the values of (y − Y) and (y − Y)² are tabulated in Table 8.
- Video aspect ratios are standard, and it is easy to restore the original aspect before 6-tuple extraction.
- content characterized in the database comprises content in standard aspect ratios, that is, either an aspect ratio of 4:3 or an aspect ratio of 16:9, as is well known in the art.
- the captured content item can be forced into a standard aspect ratio using techniques well known in the art, in order to increase chances of finding a match.
- Rotation - The characterization technique described herein is resilient to rotation, because applying rotation to the picture does not change the color of the pixels in the picture; therefore, matrix A's eigenvalues remain unchanged.
- the unchanging nature of the eigenvalues can be mathematically proven using the properties of bilinear forms, to which the matrix belongs. Specifically, rotating a graphical frame is equivalent to conjugating the second moments matrix by a rotation matrix, that is to say, replacing the matrix A by R A Rᵀ.
- such a similarity transformation by a rotation matrix does not alter the eigenvalues of the subject matrix.
- Cropping - Cropping the source graphic may cause a loss of color information, resulting in a change of the values of λ1 and λ2 in the encoded data.
- the loss of information will be proportional to the drop in perceived video quality and a loss of similarity to the original.
- a video clip undergoing analysis is subdivided into a plurality of small fragments. For each fragment, a number of candidate fragments in the database are identified as possibly matching any one of the plurality of small fragments. In a second stage, every identified candidate fragment is compared to every fragment. All fragments found to be matching are considered to be substantially identical to a matching fragment.
- Identification of candidate fragments is performed as follows.
- DWA: Discrete Wave Analysis
- Allowing for an offset, denoted a, a step, denoted s, and a duration, denoted T, the step s comprising an integer, determining a correlation step, such that when s = 1, every frame is checked; when s > 1, only 1 out of s frames is checked, then:
- An average value of Luma is determined over an entire video frame.
- Y in the YUV model
- changes in brightness or contrast affect Y as an affine transformation.
- Y also withstands screen rotations.
- vectors of Y can be compared using the correlation function C, of equation 8.
- vectors of either chrominance component, U or V can also be compared using the correlation function C, of equation 8.
- any other appropriate well known Chrominance / Luminance systems mentioned above are also subject to comparison using the correlation function C, of equation 8.
- Figs. 5 - 8 are simplified flowcharts of preferred methods of operation of the system of Fig. 1.
- Figs. 5 - 8 are believed to be self-explanatory in light of the above discussion. It is appreciated that software components of the present invention may, if desired, be implemented in ROM (read only memory) form. The software components may, generally, be implemented in hardware, if desired, using conventional techniques.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Multimedia (AREA)
- Geometry (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Library & Information Science (AREA)
- Data Mining & Analysis (AREA)
- Databases & Information Systems (AREA)
- General Engineering & Computer Science (AREA)
- Image Analysis (AREA)
Abstract
A method and system for matching video or image assets, the method providing a first asset, including at least one video frame, providing a second asset, including at least one video frame, locating a first at least one color mass center (i.e. a first order moment of a color or intensity component), being a color mass center in the first asset for at least one color, locating a second at least one color mass center in the second asset, comparing the first color mass center and the second one, and determining, based on a result of the comparing, whether the first asset and the second asset represent different instantiations of a single asset and, if so, producing an indication suitable for post-processing activities such as enforcing copyright or billing, based, at least in part, on the determining, that the second asset matches the first asset.
Description
SYSTEM AND METHOD FOR CHARACTERIZING DATA
FIELD OF THE INVENTION
The present invention relates to data characterization systems, and particularly to data characterization systems using color mass.
BACKGROUND OF THE INVENTION
With the recent advances in Internet content distribution, including peer-to-peer networks and real-time video streaming systems, it becomes important to ease detection of copyrighted material in order to prevent unauthorized distribution of content. Methods and systems for identifying content items available on websites, content sharing networks, or other distribution systems are therefore becoming more and more important.
Content referred to herein may comprise video, still photographs, or other appropriate content.
The disclosures of all references mentioned throughout the present specification, as well as the disclosures of all references mentioned in those references, are hereby incorporated herein by reference.
SUMMARY OF THE INVENTION
The present invention seeks to provide an improved method for comparing and characterizing data.
There is thus provided in accordance with a preferred embodiment of the present invention a method for matching assets, the method including providing a first asset, the first asset including at least one frame, providing a second asset, the second asset including at least one frame, locating a first at least one color mass center, the first at least one color mass center being a color mass center in the first asset for at least one color, locating a second at least one color mass center, the second at least one color mass center being a color mass center in the second asset for said at least one color, comparing the first at least one color mass center and the second at least one color mass center, and determining, based, at least in part, on a result of the comparing, whether the first asset and the second asset represent different instantiations of a single asset and, if so, producing an indication suitable for post-processing, based, at least in part on the determining, that the second asset matches the first asset.
Further in accordance with a preferred embodiment of the present invention the comparing includes calculating a second moments matrix for the first at least one color mass center in the first asset, calculating a second moments matrix for the second at least one color mass center in the second asset, and comparing the second moments matrix for the first asset and the second moments matrix for the second asset.
Still further in accordance with a preferred embodiment of the present invention the comparing further includes calculating a first eigenvalue, denoted λ1, and a second eigenvalue, denoted λ2, for the second moments matrix of the first asset, calculating a third eigenvalue, denoted λ3, and a fourth eigenvalue, denoted λ4, for the second moments matrix of the second asset, comparing the eigenvalues, λ1 and λ2, of the second moments matrix of the first asset with the eigenvalues, λ3 and λ4, of the second moments matrix of the second asset and producing a result.
Additionally in accordance with a preferred embodiment of the present invention the providing the second asset includes capturing the second asset.
Moreover in accordance with a preferred embodiment of the present invention, for each color mass center for each color in each frame of a plurality of frames in the first asset, λ1 and λ2 are stored.
Further in accordance with a preferred embodiment of the present invention λ1 and λ2 are stored in a database.
Still further in accordance with a preferred embodiment of the present invention the comparing includes comparing at least λ1/λ2 with λ3/λ4.
Additionally in accordance with a preferred embodiment of the present invention the first asset includes a still picture.
Moreover in accordance with a preferred embodiment of the present invention the first asset includes a video. Further in accordance with a preferred embodiment of the present invention the second asset includes a still picture.
Still further in accordance with a preferred embodiment of the present invention the second asset includes a video.
Additionally in accordance with a preferred embodiment of the present invention locating the first at least one color mass center for at least one color is performed by performing the following steps for at least one frame in the first asset, summing all color values for the at least one color for each pixel in the at least one frame, the sum denoted S, for at least one frame in the first asset, weighting each color value for each pixel in the frame by a pixel-associated x- coordinate, such that for each pixel in the at least one frame and for the at least one color, an associated value R(x,y) * x, representing a weighted value is determined, the value denoted J(x,y), determining an x-coordinate center of mass, denoted X, by dividing J(x,y) by S, for the at least one frame, weighting each color value for each pixel in the at least one frame by a pixel-associated y-coordinate, such that for each pixel in the at least one frame and for the at least one color, an associated value R(x,y) * y, representing a weighted value is determined, the value denoted
K(x,y), and determining a y-coordinate center of mass, denoted Y, by dividing K(x,y) by S.
Moreover in accordance with a preferred embodiment of the present invention and including, for the at least one frame and for the at least one color, tabulating, for each color value of each pixel, a table of weighted pixel values, such that the sums of resultant weighted color values divided by S are denoted a, b, c, and d.
Further in accordance with a preferred embodiment of the present invention and including determining a matrix A, such that A = [[a, b], [c, d]].
Still further in accordance with a preferred embodiment of the present invention and including determining the eigenvalues λ1 and λ2 of the matrix A for each at least one color.
Additionally in accordance with a preferred embodiment of the present invention the comparing the second moments matrix of the first asset with the second moments matrix of the second asset includes comparing λ1 and λ2 for each at least one color.
Moreover in accordance with a preferred embodiment of the present invention the at least one color includes a Red-Green-Blue color element.
Further in accordance with a preferred embodiment of the present invention the at least one color includes a chrominance / luminance color element.
Still further in accordance with a preferred embodiment of the present invention the chrominance / luminance color element includes a YCbCr chrominance / luminance color element.
Additionally in accordance with a preferred embodiment of the present invention the chrominance / luminance color element includes a YPbPr chrominance / luminance color element.
Moreover in accordance with a preferred embodiment of the present invention the chrominance / luminance color element includes a YDbDr chrominance / luminance color element.
Further in accordance with a preferred embodiment of the present invention the chrominance / luminance color element includes a xvYCC chrominance / luminance color element. Still further in accordance with a preferred embodiment of the present invention the at least one color includes a gray scale value.
Additionally in accordance with a preferred embodiment of the present invention the post-processing includes identifying a source of the second asset. Moreover in accordance with a preferred embodiment of the present invention and further including billing the source of the second asset.
Further in accordance with a preferred embodiment of the present invention and further including enforcing copyright on the second asset.
There is also provided in accordance with another preferred embodiment of the present invention a method for matching assets, the method including providing a reference database, the reference database including a plurality of video clips, providing a video asset, the video asset including a plurality of video frames, subdividing the video asset into a plurality of fragments, for at least one fragment among the plurality of fragments, identifying a plurality of candidate fragments included in the reference database, a candidate fragment including at least a portion of one of the plurality of video clips, the plurality of candidate fragments being potential matches to the at least one fragment, comparing each one of the plurality of candidate fragments to the at least one fragment, and identifying, as a result of the comparing, matching assets, and producing an indication suitable for post-processing, based, at least in part on the identifying, that matching assets have been identified.
Further in accordance with a preferred embodiment of the present invention the providing a video asset includes capturing a video asset.
Still further in accordance with a preferred embodiment of the present invention the identifying a plurality of candidate fragments includes performing Discrete Wave Analysis (DWA).
Additionally in accordance with a preferred embodiment of the present invention the DWA is performed for each at least one fragment among the plurality of fragments, and each wavelet in the reference database is analyzed.
Moreover in accordance with a preferred embodiment of the present invention each fragment is represented by a discrete signal denoted D(t), each candidate fragment is a wavelet of a mother wavelet, denoted Φ(t), τ denotes the beginning of an interval, and c and A denote constants; then, for a time interval, denoted t, if D(τ + t) = AΦ(ct), a candidate fragment is identified.
Further in accordance with a preferred embodiment of the present invention t ranges from 0.5 seconds to 3 seconds.
Still further in accordance with a preferred embodiment of the present invention A is large.
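To make the preceding test concrete, the following is a hedged Python sketch, not the patented implementation: for each offset τ and each trial scale c it fits the best amplitude A by least squares against the sampled mother wavelet and keeps close fits with large A. The mother wavelet, the scale grid, the frame rate, and both thresholds are assumptions, not taken from the text.

```python
import numpy as np

def find_candidates(d, phi, scales, fps, min_amp, max_err, dur_s=1.0):
    """Scan signal d for intervals resembling A * Phi(c * t).

    d:      1-D per-frame signal of a fragment (e.g. average Luma)
    phi:    vectorized mother wavelet, callable on an array of times
    scales: iterable of trial values for the constant c (assumed grid)
    Returns (tau, c, A) triples for intervals that fit well with large A.
    """
    n = int(dur_s * fps)                      # interval length in frames
    t = np.arange(n) / fps                    # the time interval t
    hits = []
    for c in scales:
        w = phi(c * t)                        # scaled mother wavelet, sampled
        ww = float(w @ w)
        if ww == 0.0:
            continue
        for tau in range(len(d) - n + 1):
            seg = d[tau:tau + n].astype(float)
            amp = float(seg @ w) / ww         # least-squares estimate of A
            err = float(np.linalg.norm(seg - amp * w)) / np.sqrt(n)
            if amp >= min_amp and err <= max_err:   # "A is large", fit is close
                hits.append((tau, c, amp))
    return hits
```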
Additionally in accordance with a preferred embodiment of the present invention the comparing includes comparing using a correlation function, denoted C.
Moreover in accordance with a preferred embodiment of the present invention further including representing candidate fragment as X(i), and a fragment as Y(i), X(i) and Y(i) each including a one-dimensional discrete video signal, providing an offset, denoted a, a step, denoted s, the step s including an integer, determining a correlation step, such that when s = 1, every frame is checked and when s > 1, only 1 out of s frames is checked, and a duration, denoted T, such that:

N = T / s

X̄ = (1/N) Σ_{i=0..N−1} X(a + s·i)

Ȳ = (1/N) Σ_{i=0..N−1} Y(a + s·i)

C_{a,s,T}(X, Y) = [ Σ_{i=0..N−1} (X(a + s·i) − X̄) · (Y(a + s·i) − Ȳ) ] / (N · σx · σy) (equation 8)

where σx and σy denote the standard deviations of the sampled X and Y values.
Further in accordance with a preferred embodiment of the present invention, C_{a,s,T}(X, Y) ≤ 1.
Still further in accordance with a preferred embodiment of the present invention, C is invariant to affine transformations with positive coefficients.
Additionally in accordance with a preferred embodiment of the present invention the candidate fragment X(i) and the fragment Y(i) do not match if C = 0.
Moreover in accordance with a preferred embodiment of the present invention the candidate fragment X(i) and the fragment Y(i) do match if C = 1.
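For illustration only, a minimal Python sketch of the correlation function C of equation 8, assuming X and Y are already available as per-frame value arrays; the default handling of T and of constant signals is an assumption:

```python
import numpy as np

def correlation(x, y, a=0, s=1, t=None):
    """C_{a,s,T}(X, Y): normalized correlation over N = T/s sampled frames."""
    if t is None:
        t = min(len(x), len(y)) - a           # default: all frames past the offset
    idx = np.arange(a, a + t, s)              # s == 1 checks every frame
    xs = np.asarray(x, dtype=float)[idx]
    ys = np.asarray(y, dtype=float)[idx]
    n = len(idx)
    num = ((xs - xs.mean()) * (ys - ys.mean())).sum()
    denom = n * xs.std() * ys.std()
    if denom == 0.0:
        return 0.0                            # constant signal: treat as no match
    return float(num / denom)
```

Being a normalized (Pearson-type) correlation, this value never exceeds 1 and is unchanged by affine transformations with positive coefficients, matching the properties stated above.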
Further in accordance with a preferred embodiment of the present invention representing candidate fragment as X(i) includes determining a vector including average Luma (Y) for each frame, i, in candidate fragment X.
Still further in accordance with a preferred embodiment of the present invention the representing fragment as Y(i) includes determining a vector including average Luma (Y) for each frame, i, in fragment Y.
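A brief sketch of building such a per-frame average-Luma vector; the BT.601 weights are an assumption, since the text only requires some Luma / chrominance decomposition (the chrominance variants described next can be built the same way):

```python
import numpy as np

# BT.601 Luma weights for RGB input (an assumption, not taken from the text).
LUMA_WEIGHTS = np.array([0.299, 0.587, 0.114])

def average_luma_vector(frames):
    """Represent a fragment as X(i): the average Luma of each frame i,
    given frames as H x W x 3 RGB arrays."""
    return np.array([float((f.astype(float) @ LUMA_WEIGHTS).mean())
                     for f in frames])
```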
Additionally in accordance with a preferred embodiment of the present invention the representing candidate fragment as X(i), includes determining a vector including an average chrominance component for each frame, i, in candidate fragment X. Moreover in accordance with a preferred embodiment of the present invention the representing fragment as Y(i), includes determining a vector including average chrominance component for each frame, i, in fragment Y.
Further in accordance with a preferred embodiment of the present invention the chrominance component includes a CbCr chrominance color element.
Still further in accordance with a preferred embodiment of the present invention the chrominance component includes a PbPr chrominance color element.
Additionally in accordance with a preferred embodiment of the present invention the chrominance component includes a DbDr chrominance color element.
Moreover in accordance with a preferred embodiment of the present invention the chrominance component includes a xvYCC chrominance color element. Further in accordance with a preferred embodiment of the present invention and including calculating a color mass center for one of a plurality of color elements for each frame, i, in candidate fragment X, deriving, based on the color mass center of each frame, i, in candidate fragment X, a second moments matrix for each frame, i, in candidate fragment X, evaluating each second moments matrix for each frame, i, in candidate fragment X, the evaluating including determining eigenvalues λ1 and λ2, wherein the representing candidate fragment as X(i) includes determining a vector including a plurality of eigenvalues λ1 and λ2.
Still further in accordance with a preferred embodiment of the present invention and calculating a color mass center for one of a plurality of color elements for each frame, i, in fragment Y, deriving, based on the color mass center of each frame, i, in fragment Y, a second moments matrix for each frame, i, in fragment Y, evaluating each second moments matrix for each frame, i, in fragment Y, the evaluating including determining eigenvalues λ3 and λ4, wherein the representing fragment as Y(i) includes determining a vector including a plurality of eigenvalues λ3 and λ4.
Additionally in accordance with a preferred embodiment of the present invention and including calculating a color mass center for one of a plurality of color elements for each frame, i, in candidate fragment X, deriving, based on the color mass center of each frame, i, in candidate fragment X, a second moments matrix for each frame, i, in candidate fragment X, evaluating each second moments matrix for each frame, i, in candidate fragment X, the evaluating including determining eigenvalues λ1 and λ2, wherein the representing candidate fragment as X(i) includes determining a vector including a plurality of ratios of eigenvalues, λ1/λ2.
Moreover in accordance with a preferred embodiment of the present invention and calculating a color mass center for one of a plurality of color elements for each frame, i, in fragment Y, deriving, based on the color mass center of each frame, i, in fragment Y, a second moments matrix for each frame, i, in fragment Y, evaluating each second moments matrix for each frame, i, in fragment Y, the evaluating including determining eigenvalues λ3 and λ4, wherein the representing fragment as Y(i) includes determining a vector including a plurality of ratios of eigenvalues, λ3/λ4.
Further in accordance with a preferred embodiment of the present invention and including calculating a color mass center for one of a plurality of color elements for each frame, i, in candidate fragment X, deriving, based on the color mass center of each frame, i, in candidate fragment X, a second moments matrix for each frame, i, in candidate fragment X, evaluating each second moments matrix for each frame, i, in candidate fragment X, the evaluating including determining eigenvalues λ1 and λ2, wherein the representing candidate fragment as X(i) includes choosing an angle φ between a horizontal axis, Ox, and one eigenvector corresponding to λ1, and determining a vector including a plurality of angles φ.
Still further in accordance with a preferred embodiment of the present invention and calculating a color mass center for one of a plurality of color elements for each frame, i, in fragment Y, deriving, based on the color mass center of each frame, i, in fragment Y, a second moments matrix for each frame, i, in fragment Y, evaluating each second moments matrix for each frame, i, in fragment Y, the evaluating including determining eigenvalues λ3 and λ4, wherein the representing fragment as Y(i) includes choosing an angle φ between a horizontal axis, Ox, and one eigenvector corresponding to λ3, and determining a vector including a plurality of angles φ.
Additionally in accordance with a preferred embodiment of the present invention the post-processing includes identifying a source of at least one of the matching assets.
Moreover in accordance with a preferred embodiment of the present invention, and further including billing the source of one of the matching assets.
Further in accordance with a preferred embodiment of the present invention and further including enforcing copyright on at least one of the matching assets.
There is also provided in accordance with still another preferred embodiment of the present invention a system for matching assets, the system including a first asset including at least one frame, a second asset including at least one frame, a color mass center locator operative to locate a first at least one color mass center, the first at least one color mass center being a color mass center in the first asset for at least one color, and a second at least one color mass center, the second at least one color mass center being a color mass center in the second asset for said at least one color, a comparator, the comparator comparing the first at least one color mass center and the second at least one color mass center, and a processor operative to determine, based, at least in part, on a result of the comparing, whether the first asset and the second asset represent different
instantiations of a single asset and, if so, to produce an indication suitable for post-processing, based, at least in part on the determining, that the second asset matches the first asset.
There is also provided in accordance with still another preferred embodiment of the present invention a system for matching assets, the system including a reference database, the reference database including a plurality of video clips, a video asset, the video asset including a plurality of video frames, a video fragmenter operative to subdivide the video asset into a plurality of fragments, a first processor operative to identify, for at least one fragment among the plurality of fragments, a plurality of candidate fragments included in the reference database, a candidate fragment including at least a portion of one of the plurality of video clips, the plurality of candidate fragments being potential matches to the at least one fragment, a comparator comparing each one of the plurality of candidate fragments to the at least one fragment, and a second processor operative to identify, as a result of the comparing, matching assets, and to produce an indication suitable for post-processing, based, at least in part on the identifying, that matching assets have been identified.
BRIEF DESCRIPTION OF THE DRAWINGS
The present invention will be understood and appreciated more fully from the following detailed description, taken in conjunction with the drawings in which: Fig. 1 is a simplified illustration of a graphical data characterization and detection system constructed and operative in accordance with a preferred embodiment of the present invention;
Fig. 2 is a simplified illustration of one preferred embodiment of the system of Fig. 1; Fig. 3 is a simplified drawing of a typical graphical frame comprising data to be characterized, within the system of Fig. 1;
Fig. 4 is a depiction of a preferred method of determining a color mass center for one of a plurality of color elements, according to the system of Fig. 1; and Figs. 5 - 8 are simplified flowcharts of preferred methods of operation of the system of Fig. 1.
DETAILED DESCRIPTION OF A PREFERRED EMBODIMENT
Reference is now made to Fig. 1, which is a simplified pictorial illustration of a graphical data characterization and detection system constructed and operative in accordance with a preferred embodiment of the present invention. The system of Fig. 1 comprises a data characterizer 10, a database 20, and a comparison module 30.
The operation of the system of Fig. 1 is now described. The data characterizer 10 receives an input comprising graphical data 40a. The graphical data 40a comprises at least one frame of graphical data. Those skilled in the art will appreciate that any data which may be expressed graphically may serve as input for the data characterizer 10. For example and without limiting the generality of the foregoing, the graphical data 40a may comprise video data, and thus comprise a series of frames, depicted as F1, F2, ..., Fn, .... Alternatively, the graphical data 40a may comprise a still digital photograph, and thus, the graphical data 40a comprises only a single frame, F1. It is noted that although Fig. 1 depicts graphical data 40a (as well as the as yet unintroduced graphical data 40b and graphical data 40c) as comprising a plurality of frames, said depiction is not meant to be limiting, and the graphical data (in all of its various instances: 40a, 40b, and 40c) may, as mentioned previously, comprise only a single frame. Throughout the present disclosure and claims, the graphical data 40a, 40b is also referred to as an "asset".
The data characterizer 10 performs several operations on the graphical data 40a, as described below in detail, with reference to Fig. 4. As a result of the operations performed by the data characterizer 10 on the graphical data 40a, a second moments matrix characterization 50a is output by the data characterizer 10 to an appropriate storage unit, such as the database 20.
As described below, each graphical frame F1, F2, ..., Fn, ... is associated with a characteristic second moments matrix. Each characteristic second moments matrix will have two real and non-negative eigenvalues, hereinafter denoted λ1 and λ2. Typically, the second moments matrix characterization 50a for the graphical data 40a will be stored in the database 20 with a unique identifier 60a. Typically, a plurality of second moments matrix characterizations 50 associated with the graphical data 40a are stored in the database 20, with each individual characterization associated with one of a plurality of unique identifiers 60. The graphical data 40a remains unchanged as a result of the operations performed by the data characterizer 10 on the graphical data 40a. Nevertheless, for ease of depiction, the graphical data 40a, after the operations performed by the data characterizer 10, is depicted as graphical data 40b.
At some time, not necessarily the same time as the second moments matrix characterization 50a associated with the graphical data 40a is stored in the database 20, a suspect graphical data 40c is captured from some content sharing network 70, a website (not depicted), or via some other appropriate method of distributing data.
The suspect graphical data 40c is input into the data characterizer 10, in order to derive a second moments matrix characterization 50b associated with the suspect graphical data 40c. As described below with reference to Fig. 4, the comparison module 30 performs a comparison routine on second moments matrix characterization 50a and second moments matrix characterization 50b, in order to determine if the second moments matrix characterization 50a and second moments matrix characterization 50b are the same.
If the second moments matrix characterization 50a and second moments matrix characterization 50b are the same, then it is concluded that associated graphical data 40a and associated graphical data 40c are in fact different instantiations of a single unit of graphical data 40a, 40c. If, in fact, it is determined that graphical data 40a and graphical data 40c are different instantiations of a single unit of graphical data 40a, 40c (depicted in Fig. 1 as "Matches" 80), then the comparison module 30 produces an appropriate indication, suitable for post-processing, that the graphical data 40a and the graphical data 40c are different instantiations of a single unit of graphical data 40a, 40c.
It is appreciated that typical post-processing actions include, but are not limited to, identifying a source of the suspect graphical data 40c, and possibly thereafter billing and enforcing copyright on the suspect graphical data 40c.
If, on the other hand, the comparison module 30 determines that the graphical data 40a and the graphical data 40c are not different instantiations of a single unit of graphical data 40a, 40c (depicted in Fig. 1 as "Does Not Match" 90), then no further action occurs.
Reference is now made to Fig. 2, which is a simplified illustration of one preferred embodiment of the system of Fig. 1. In the preferred embodiment depicted in Fig. 2, the second moments matrix characterization 50, 50a, 50b as depicted in Fig. 1 comprises a characterization performed as an eigenvalue characterization 250, 250a, 250b. Specifically, each graphical frame F1, F2, ..., Fn, ... is associated with a characteristic second moments matrix. Each characteristic second moments matrix will have two real and non-negative eigenvalues, hereinafter denoted λ1 and λ2. Typically, the eigenvalue characterization 250a for the graphical data 40a will be stored in the database 20 with a unique identifier 60a. Typically, a plurality of eigenvalue characterizations 250 associated with the graphical data 40a are stored in the database 20, with each individual eigenvalue characterization associated with one of a plurality of unique identifiers 60.
Likewise, the suspect graphical data 40c is input into the data characterizer 10, in order to derive an eigenvalue characterization 250b associated with the suspect graphical data 40c. As described below with reference to Fig. 4, the comparison module 30 performs a comparison routine on eigenvalue characterization 250a and eigenvalue characterization 250b, in order to determine if the eigenvalue characterization 250a and eigenvalue characterization 250b are the same.
For example and without limiting the generality of the foregoing, if the input suspect graphical data 40c comprises multiple frames, depicted as F1, F2, ..., Fn, ..., then the comparison by the comparison module 30 may be done by any appropriate means, including, but not limited to, analyzing a 2-norm distance (also known as "Euclidean distance"), that is:

d(Λx) = ||Λx||₂ = sqrt( Σ_i Λx(i)² )

Specifically, Λx and Λy are considered as vectors; Λx is defined from a series Λ1 and Λ2, where Λ1 and Λ2 are the series of eigenvalue characterization 250a and eigenvalue characterization 250b for a plurality of frames F1, F2, ..., Fn, ..., in both the graphical data 40a and the suspect graphical data 40c. Similarly for Λy with a series B1 and B2. It is appreciated that in a case where an exact match is not found by the comparison module 30 between the second moments matrix characterization 50a and the second moments matrix characterization 50b, the system of Fig. 1 may notify a human operator to examine the suspect graphical data 40c against one or more of the closest-matching records in the database. The human operator may then perform a visual comparison to choose the matching database record, or to reject the match.
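For illustration, a minimal Python sketch of such a 2-norm comparison against stored characterizations; the dictionary-shaped database, the acceptance threshold, and the deferral to a human operator on weak matches are assumptions about surrounding machinery:

```python
import numpy as np

def euclidean_distance(series_a, series_b):
    """2-norm ("Euclidean") distance between two aligned per-frame series."""
    n = min(len(series_a), len(series_b))         # compare the overlap only
    diff = np.asarray(series_a[:n], float) - np.asarray(series_b[:n], float)
    return float(np.linalg.norm(diff))            # sqrt of the sum of squares

def closest_record(suspect, reference_db, threshold):
    """Return (identifier, distance) of the closest stored characterization,
    or None so that a human operator can examine the near misses."""
    best = min(((ident, euclidean_distance(suspect, ref))
                for ident, ref in reference_db.items()),
               key=lambda pair: pair[1],
               default=None)
    if best is not None and best[1] <= threshold:
        return best
    return None
```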
Reference is now made to Fig. 3, which is a simplified drawing of a typical graphical frame comprising data to be characterized, within the system of Fig. 1. Those skilled in the art will appreciate that every frame comprising data which is to be characterized comprises a plurality of pixels. Each of the plurality of pixels may be represented as comprising a tuple, the tuple representing a group of color elements in the pixel. For example and without limiting the generality of the foregoing, in a red, green, blue color system (hereinafter R,G,B, where R stands for red, G stands for green, and B stands for blue, whether taken collectively or individually), each color element of each of the plurality of pixels may be represented as a value between 0 and 255.
Those skilled in the art will appreciate that pixel color may alternatively be expressed in any appropriate color space, such as any of the well known Chrominance / Luminance systems (for instance, YCbCr; YPbPr; YDbDr),
or according to the xvYCC standard, IEC 61966-2-4. For simplicity of discussion, pixel color is expressed herein, in a non-limiting manner, as an RGB triplet.
Similarly, in some cases, the frame may comprise a black and white picture. In such a case, a pixel may be depicted as comprising only one color element in a gray scale, or any other appropriate color space used to characterize black and white pictures.
The following notation, of which certain portions are depicted, for illustrative purposes, in Fig. 3, is used in the discussion below, as well as in the claims:

    W: frame width in pixels
    H: frame height in pixels
    p = (x, y): a pixel's position relative to the frame center; e.g. the top-left pixel is (-W/2, -H/2)
    R(p), G(p), B(p): pixel p's original red, green, and blue components
Reference is now made to Fig. 4, which is a depiction of a preferred method of determining a color mass center for one of a plurality of color elements, according to the system of Fig. 1. A preferred method of calculating the color mass center is to determine values of X and Y, X and Y being the coordinates of the color mass center for one of a plurality of color elements in the typical graphical frame of Fig. 3:

    X = ( Σ_p m(p) * x ) / ( Σ_p m(p) )
    Y = ( Σ_p m(p) * y ) / ( Σ_p m(p) )        (equation 1)

Summations are taken over all pixels p in a frame; x and y denote the coordinates of the pixels, relative to center; and m denotes color mass, i.e. the value of the color component at a pixel; e.g. for R, it denotes the value of the pixel's red component R(p).
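The following is a minimal sketch, assuming the channel is supplied as a 2-D NumPy array of color values; the function name and the array convention are illustrative assumptions, not part of the disclosure:

```python
import numpy as np

def color_mass_center(channel: np.ndarray) -> tuple:
    """Color mass center (X, Y) of one color channel (equation 1).

    channel: 2-D array of color values, shape (H, W).
    Coordinates are taken relative to the frame center, per the notation above.
    """
    H, W = channel.shape
    xs = np.arange(W) - W / 2          # x-coordinate of each column
    ys = np.arange(H) - H / 2          # y-coordinate of each row
    total = channel.sum()              # total color mass, Σ_p m(p)
    X = (channel * xs[np.newaxis, :]).sum() / total
    Y = (channel * ys[:, np.newaxis]).sum() / total
    return X, Y
```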
A second moments matrix of color mass, denoted matrix A, may then be determined about the color mass center (X, Y), with S = Σ_p m(p):

        | a  b |          a = Σ_p m(p) * (x - X)^2 / S
    A = |      |   where  b = c = Σ_p m(p) * (x - X) * (y - Y) / S        (equation 2)
        | c  d |          d = Σ_p m(p) * (y - Y)^2 / S

Three such matrices may be determined, one for each color element, R, G, and B.
Eigenvalues may then be determined for matrix A, the eigenvalues being roots of matrix A's characteristic polynomial. Specifically, the eigenvalues of matrix A comprise solutions to the following quadratic equation in t:

    t^2 - (a + d) * t + (ad - bc) = 0        (equation 3)
The second moments of color mass matrix, according to the above definition, is symmetric and positive-definite. Thus, there will be two real and non-negative eigenvalues of matrix A, denoted λ1 and λ2.
Those skilled in the art will appreciate that dividing a matrix by a fixed value results in the matrix's eigenvalues being divided by the same fixed value. For example, let e = a + b + c + d. Then, if every element in matrix A is divided by e, thereby producing a new matrix A':

    A' = (1/e) * A        (equation 4)

Then, the eigenvalues λ'1 and λ'2 of A' are, in fact, equal to the eigenvalues λ1 and λ2 of A, divided by e:

    λ'1 = λ1/e; and
    λ'2 = λ2/e.
Furthermore, as will be discussed below, in order to ensure that resulting values remain the same, regardless of any divisions which occur during processing, a ratio of λ1 and λ2 may be used for detection purposes. Since both λ1 and λ2 are divided by the same quantity, in the above example, e:

    λ'1 / λ'2 = (λ1/e) / (λ2/e) = λ1 / λ2        (equation 5)
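Equations 2 through 5 might be realized, for one channel, as in the following sketch, which reuses color_mass_center() from the previous sketch; the ordering of the two eigenvalues returned here (ascending) is an assumption, since the disclosure does not fix an ordering:

```python
import numpy as np

def second_moments_matrix(channel: np.ndarray) -> np.ndarray:
    """Second moments of color mass matrix A (equation 2) for one color channel."""
    H, W = channel.shape
    X, Y = color_mass_center(channel)
    dx = (np.arange(W) - W / 2)[np.newaxis, :] - X   # (x - X) per column
    dy = (np.arange(H) - H / 2)[:, np.newaxis] - Y   # (y - Y) per row
    S = channel.sum()
    a = (channel * dx ** 2).sum() / S
    d = (channel * dy ** 2).sum() / S
    b = (channel * dx * dy).sum() / S                # b == c; A is symmetric
    return np.array([[a, b], [b, d]])

def eigenvalue_ratio(channel: np.ndarray) -> float:
    """Ratio of the two eigenvalues (equation 5), the quantity suggested for storage."""
    lam1, lam2 = np.linalg.eigvalsh(second_moments_matrix(channel))  # ascending
    return lam1 / lam2
```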
Computing λ1 and λ2 for each color component and each frame in the video produces a series of 6-tuples of real values. To identify any video as a known asset, a similar series must be stored in the database for every known asset.
If an asset is captured, and it is desired to determine whether the asset is identical to another asset, a series of 6-tuples from the captured asset can be directly compared to the plurality of second moments matrix characterizations 50 (Fig. 1) stored in the database 20. Comparison of 6-tuple series can be performed by the comparison module 30, using any appropriate method, for example and without limiting the generality of the foregoing, using accepted vector-distance metrics such as the 2-norm distance. Those skilled in the art will appreciate that any best match found can be verified manually by an operator.
In some cases, the 6-tuple series is unsynchronized (offset in time by an unknown shift), and thus, direct comparison will not work. However, it is possible to use string matching to find both the corresponding known asset and the time offset. Those skilled in the art will appreciate that for string matching to work, the actual values comprising the 6-tuple series and a corresponding reference series must be rounded to a pre-determined scale, for example and without limiting the generality of the foregoing, to the nearest integer, or alternatively, to the nearest hundredth. It is appreciated that the corresponding reference series comprises the records 50 and 60 (Fig. 1), and 250 and 60 (Fig. 2), in the database 20.
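A minimal sketch of the rounding and offset search follows; the naive linear scan, the default scale (nearest hundredth), and the function names are assumptions of this sketch, and a production system might instead use a dedicated string-matching algorithm:

```python
import numpy as np

def round_series(series: np.ndarray, scale: float = 0.01) -> tuple:
    """Round a per-frame 6-tuple series to a fixed scale (here the nearest
    hundredth) so that approximately equal values compare as equal."""
    return tuple(map(tuple, np.round(series / scale).astype(int)))

def find_offset(suspect, reference) -> int:
    """Return the time offset at which the rounded suspect series occurs
    in the rounded reference series, or -1 if it does not occur."""
    n, m = len(reference), len(suspect)
    for offset in range(n - m + 1):
        if reference[offset:offset + m] == suspect[:m]:
            return offset
    return -1
```

Usage would be, for example, find_offset(round_series(captured), round_series(reference)).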
A non-limiting example of determination of the second moments matrix, and the eigenvalues of the second moments matrix is now given.
Table 1 below is an exemplary 5 pixel x 5 pixel graphical frame. For each pixel, a red (R), green (G), and blue (B) color value is indicated.
TABLE 1
Accordingly, the pixel located at (x,y) coordinate (2,3) has RGB values of: 236, 100, and 128.
In the present example, only the red color component is considered. It should be apparent that only considering the red color component is a matter of convenience, and in reality, in the method of Fig. 1, all color components are considered. Table 2 is a repetition of the above table; however, only the red color component is repeated:
TABLE 2
The sum of all of the red components of all of the pixels, in Table 2, above, Sum(R), is equal to 3838.
The weighted value of each red color value, where the red color value is weighted by the x-coordinate of the pixel (i.e. R(x,y) * x), therefore, is:
TABLE 3
The sum of all of the weighted red components, in Table 3, above, divided by sum(R) is equal to 7228 / 3838 = 1.88. In the explanation of Fig. 4, above, this quotient, 1.88, is denoted X.
Likewise, the weighted values of each red color value, where the red color value is weighted by the y-coordinate of the pixel (i.e. R(x,y) * y), therefore, are:
TABLE 4
The sum of all of the weighted red components, in Table 4, above, divided by sum(R) is equal to 7904 / 3838 = 2.06. In the explanation of Fig. 4, above, this quotient, 2.06, is denoted Y.
As mentioned above, the coordinates (X, Y) are the coordinates of the color mass center. In the case at hand, (1.88, 2.06) is the red color mass center of the five pixel by five pixel frame described in Table 1. Table 5, below, tabulates the red color value, as given in Table 1, weighted by (x - X) * (y - Y):
TABLE 5
The sum of all of the red color values weighted by (x - X) * (y - Y), i.e., the sum of all values in Table 5, is 442.36. 442.36 divided by sum(R) is equal to 0.115258. This value, 0.115258, is equal to matrix elements b and c, in matrix A, for the present example.
Table 7, below, tabulates the red color value, as given in Table 1, weighted by (x - X)^2. The values of (x - X) are those given across the top row of Table 5. For convenience, the values of (x - X) and (x - X)^2 are tabulated in Table 6.
TABLE 6
TABLE 7
The sum of all of the red color values weighted by (x - X)^2, i.e., the sum of all values in Table 7, is 7952.89. 7952.89 divided by sum(R) is equal to 2.07. This value, 2.07, is equal to matrix element a in matrix A, for the present example.
Table 9, below, tabulates the red color value, as given in Table 1, weighted by (y - Y)^2. The values of (y - Y) are those given down the leftmost column of Table 5. For convenience, the values of (y - Y) and (y - Y)^2 are tabulated in Table 8.
TABLE 8
TABLE 9
The sum of all of the red color values weighted by (y - Y)^2, i.e., the sum of all values in Table 9, is 7363.08. 7363.08 divided by sum(R) is equal to 1.92. This value, 1.92, is equal to matrix element d in matrix A, for the present example.
Compiling the values determined for a, b, c, and d into a single matrix A gives:

    A = | 2.07  0.12 |
        | 0.12  1.92 |        (equation 6)
Thus, the matrix trace of matrix A = a + d = 3.99.
The matrix determinant of matrix A = (a * d) - (b * c) = 3.96.
Inserting the values of the matrix trace and the matrix determinant into quadratic equation 3:

    t^2 - (a + d) * t + (ad - bc) = 0        (equation 3)

gives:

    t^2 - 3.99 * t + 3.96 = 0        (equation 7)

Solving for the roots of equation 7 gives:

    λ1 = 1.857
    λ2 = 2.1332
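As an informal numerical check, the eigenvalues above can be reproduced with a few lines of NumPy; the small differences in the final digits stem from the rounding of a, b, c, and d to two decimal places in the example:

```python
import numpy as np

A = np.array([[2.07, 0.12],
              [0.12, 1.92]])

trace = np.trace(A)          # a + d = 3.99
det = np.linalg.det(A)       # ad - bc = 3.96
lam = np.linalg.eigvalsh(A)  # roots of t^2 - 3.99*t + 3.96 = 0
print(trace, det, lam)       # 3.99, 3.96, approx. [1.8535, 2.1365]
```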
It is the opinion of the inventors of the present invention that the data characterization system described herein is highly resistant to known attacks, and is particularly robust to source video transformations. Specifically:
Color Balance and Similar Transformations - A change in the saturation of different color components in the source graphic, for instance increasing the red component and decreasing the blue, will effectively multiply both the λ1 and λ2 values for the modified color components by some factor. It is appreciated that the factor will remain the same for each component. It is therefore preferable to store the ratio λ1/λ2.

Resizing (stretching) - Stretching the source graphic will cause a linear change in the values of λ1 and λ2. As with a color balance or similar transformation attack, it is therefore preferable to store the ratio λ1/λ2 rather than λ1 and λ2.
Change of video aspect - A non-isomorphic stretch of the video, comprising stretching the horizontal and vertical directions differently, would cause a distortion in the values of λ1 and λ2, defeating the characterization technique described herein. However, video aspect ratios are standard, and it is easy to restore the original aspect before 6-tuple extraction. Specifically, it is assumed that content characterized in the database comprises content in standard aspect ratios, that is, either an aspect ratio of 4:3 or an aspect ratio of 16:9, as is well known in the art. In the event that a cropped content item or a stretched content item is captured, the captured content item can be forced into a standard aspect ratio using techniques well known in the art, in order to increase the chances of finding a match.

Rotation - The characterization technique described herein is resilient to rotation, because applying rotation to the picture does not change the color of the pixels in the picture; therefore, matrix A's eigenvalues remain unchanged. The unchanging nature of the eigenvalues can be proven mathematically using the properties of bilinear forms, to which the matrix belongs. Specifically, rotating a graphical frame is equivalent to conjugating the second moments matrix by a rotation matrix, that is to say, replacing matrix A with R * A * R^T, where R is a rotation matrix. As is well known, such a similarity transformation does not alter the eigenvalues of the subject matrix.
Cropping - Cropping the source graphic may cause a loss of color information, resulting in a change of the values of λ1 and λ2 in the encoded data. The loss of information will be proportional to the drop in perceived video quality and a loss of similarity to the original.
Collusion attacks - are not applicable to the data characterization system and method described herein.
In an alternative preferred embodiment of the present invention, for a given sequence of video frames, it is possible to find fragments of the given sequence in a reference database of video materials.
A video clip undergoing analysis is subdivided into a plurality of small fragments. For each fragment, a number of candidate fragments in the database are identified as possibly matching that fragment. In a second stage, every identified candidate fragment is compared to every fragment. Any fragment found to match is considered to be substantially identical to its matching fragment.
Identification of candidate fragments is performed as follows. If D(t) is a discrete signal and Φ(t) is a wavelet of a mother wavelet, then Discrete Wavelet Analysis (DWA) can be used to find occurrences of Φ(t) in D(t). The result of DWA is a triplet {τ, c, A}, where τ is the beginning of an interval, and c and A are constants, so that D(τ + t) = A * Φ(ct) for small values of t. It is therefore appreciated that DWA is utilized to determine whether and where a particular wavelet occurs in a given signal. For every video clip analyzed, DWA is performed with each wavelet in the video database. It is appreciated that the constant A will be large if and only if a particular occurrence of Φ(t) occurs in a particular interval of D(t). All triplets where A is not large are discarded during analysis. The remaining triplets {τj, cj, Aj} comprise a fingerprint of the video clip analyzed; a sketch of this scan appears below.
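The disclosure does not prescribe an implementation of this scan; the following Python sketch is therefore only an illustration, in which the scale set, the threshold, and the least-squares amplitude fit are assumptions made for the example:

```python
import numpy as np

def dwa_candidates(signal: np.ndarray, wavelet: np.ndarray,
                   scales=(0.5, 1.0, 2.0), threshold: float = 10.0):
    """Scan a 1-D signal for intervals resembling a time-scaled wavelet.

    For each offset tau and time-scale c, fit the amplitude A that best
    explains signal[tau : tau + L] ~ A * wavelet(c * t), and keep the
    triplet (tau, c, A) when |A| exceeds the threshold.
    """
    candidates = []
    for c in scales:
        # Resample the wavelet to time-scale c (L samples of wavelet(c*t)).
        L = max(2, int(round(len(wavelet) / c)))
        t = np.linspace(0, len(wavelet) - 1, L)
        template = np.interp(t, np.arange(len(wavelet)), wavelet)
        energy = float(np.dot(template, template))  # assumed non-zero
        for tau in range(len(signal) - L + 1):
            window = signal[tau:tau + L]
            A = float(np.dot(window, template)) / energy  # least-squares fit
            if abs(A) >= threshold:
                candidates.append((tau, c, A))
    return candidates
```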
Candidate fragments are then compared to every fragment as follows. Let X(i) and Y(i) be one-dimensional discrete video signals. Allow for an offset, denoted a, a step, denoted s, and a duration, denoted T, the step s comprising an integer determining a correlation step, such that when s = 1, every frame is checked, and when s > 1, only 1 out of every s frames is checked. Letting N = T/s, then:

    C_{a,s,T}(X, Y) = ( (1/N) * Σ_{i=0}^{N-1} X_{is} * Y_{a+is} - mean(X) * mean(Y) ) / (σ_X * σ_Y)        (equation 8)

where:

    mean(X) = (1/N) * Σ_{i=0}^{N-1} X_{is}
    mean(Y) = (1/N) * Σ_{i=0}^{N-1} Y_{a+is}
    σ_X^2 = (1/N) * Σ_{i=0}^{N-1} (X_{is})^2 - mean(X)^2
    σ_Y^2 = (1/N) * Σ_{i=0}^{N-1} (Y_{a+is})^2 - mean(Y)^2

It is appreciated that C_{a,s,T}(X, Y) ≤ 1.
Furthermore, the correlation function, C, is invariant to affine transformations with positive coefficients. Specifically, if X' = α * X + β and Y' = γ * Y + δ, where α, γ > 0, then C(X', Y') = C(X, Y).
Additionally, if X and Y are independent random variables, then C = 0. Thus, if there is no relation between X and Y, C is expected to be close to 0.
Moreover, if X is derived from Y using a shift by some constant time, a, that is, X_t = Y_{a+t}, then C_{a,s,T}(X, Y) = 1. As noted above, the correlation function, C, withstands affine transformations. Similarly, the opposite is also true: if C_{a,s,T}(X, Y) = 1, then X can be derived from Y by an affine transformation or a shift.
Still furthermore, small changes in a and s have little or no effect on C.
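A minimal Python rendering of equation 8 follows; it assumes the signals are NumPy arrays long enough for the requested offset and duration, and omits guards (e.g. for zero variance) that a production implementation would need:

```python
import numpy as np

def correlation(X, Y, a: int, s: int, T: int) -> float:
    """Correlation C_{a,s,T}(X, Y) of equation 8: Pearson correlation between
    X sampled every s frames and Y sampled every s frames from offset a."""
    N = T // s
    x = np.asarray(X, dtype=float)[:N * s:s]          # X_{is}, i = 0 .. N-1
    y = np.asarray(Y, dtype=float)[a:a + N * s:s]     # Y_{a+is}, i = 0 .. N-1
    x_mean, y_mean = x.mean(), y.mean()
    sigma_x = np.sqrt((x * x).mean() - x_mean ** 2)
    sigma_y = np.sqrt((y * y).mean() - y_mean ** 2)
    return ((x * y).mean() - x_mean * y_mean) / (sigma_x * sigma_y)
```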
An average value of Luma, denoted Y in the YUV model, is determined over an entire video frame. Those skilled in the art will appreciate that changes in brightness or contrast affect Y as an affine transformation. Y also withstands screen rotations. Thus, vectors of Y can be compared using the correlation function C, of equation 8. Those skilled in the art will appreciate that alternatively, vectors of either chrominance component, U or V, can also be compared using the correlation function C, of equation 8. Alternatively, any other appropriate well known Chrominance / Luminance systems mentioned above are also subject to comparison using the correlation function C, of equation 8.
Similarly, the values of λ1 and λ2 can also be compared using the correlation function C, of equation 8.
Likewise, denoting the ratio λ1/λ2 as Z, vectors of the values Z1, Z2, ... can also be compared using the correlation function C, of equation 8.
Additionally, one eigenvector may be chosen corresponding to the value of λ1, such that the angle φ formed between the horizontal axis Ox and the chosen vector is less than π. φ is then invariant to shifts and to brightness and color tuning, but not to rotations. It is appreciated that since φ is invariant to shifts and to brightness and color tuning, but not to rotations, (φ/π)*256 is likewise invariant to shifts and to brightness and color tuning, but not to rotations. Vectors of φ are likewise comparable to one another using the correlation function C, of equation 8.
In certain embodiments of the present invention, for convenience, it may be preferable to use (φ/π)*256 as input to the correlation function C, of equation 8, rather than using φ as input to the correlation function C.
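By way of illustration, and assuming (as in claim 52 below) that the chosen eigenvector is the one corresponding to λ1, taken here as the smaller eigenvalue, the angle φ and its scaled form might be computed as follows:

```python
import numpy as np

def eigenvector_angle(A: np.ndarray) -> float:
    """Angle phi in [0, pi) between the horizontal axis Ox and the eigenvector
    of the 2x2 symmetric matrix A corresponding to lambda_1."""
    eigenvalues, eigenvectors = np.linalg.eigh(A)  # eigenvalues ascending
    v = eigenvectors[:, 0]                         # eigenvector for lambda_1
    return float(np.arctan2(v[1], v[0]) % np.pi)   # fold the angle into [0, pi)

def quantized_angle(A: np.ndarray) -> float:
    """(phi / pi) * 256, the scaled form suggested for use with equation 8."""
    return eigenvector_angle(A) / np.pi * 256.0
```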
Reference is now made to Figs. 5 - 8, which are simplified flowcharts of preferred methods of operation of the system of Fig. 1. Figs. 5 - 8 are believed to be self-explanatory in light of the above discussion. It is appreciated that software components of the present invention may, if desired, be implemented in ROM (read only memory) form. The software
components may, generally, be implemented in hardware, if desired, using conventional techniques.
It is appreciated that various features of the invention which are, for clarity, described in the contexts of separate embodiments may also be provided in combination in a single embodiment. Conversely, various features of the invention which are, for brevity, described in the context of a single embodiment may also be provided separately or in any suitable subcombination.
It will be appreciated by persons skilled in the art that the present invention is not limited by what has been particularly shown and described hereinabove. Rather the scope of the invention is defined only by the claims which follow:
Claims
1. A method for matching assets, the method comprising: providing a first asset, the first asset comprising at least one frame; providing a second asset, the second asset comprising at least one frame; locating a first at least one color mass center, the first at least one color mass center being a color mass center in the first asset for at least one color; locating a second at least one color mass center, the second at least one color mass center being a color mass center in the second asset for said at least one color; comparing the first at least one color mass center and the second at least one color mass center; and determining, based, at least in part, on a result of the comparing, whether the first asset and the second asset represent different instantiations of a single asset and, if so, producing an indication suitable for post-processing, based, at least in part on the determining, that the second asset matches the first asset.
2. The method according to claim 1 and wherein the comparing comprises: calculating a second moments matrix for the first at least one color mass center in the first asset; calculating a second moments matrix for the second at least one color mass center in the second asset; and comparing the second moments matrix for the first asset and the second moments matrix for the second asset.
3. The method according to claim 2 and wherein the comparing further comprises: calculating a first eigenvalue, denoted λ1, and a second eigenvalue, denoted λ2, for the second moments matrix of the first asset; calculating a third eigenvalue, denoted λ3, and a fourth eigenvalue, denoted λ4, for the second moments matrix of the second asset; comparing the eigenvalues, λ1 and λ2, of the second moments matrix of the first asset with the eigenvalues, λ3 and λ4, of the second moments matrix of the second asset and producing a result.
4. The method according to any of claims 1 - 3 and wherein the providing the second asset comprises capturing the second asset.
5. The method according to any of claims 3 - 4 and wherein, for each color mass center for each color in each frame of a plurality of frames in the first asset, λ1 and λ2 are stored.
6. The method according to claim 5 and wherein λ1 and λ2 are stored in a database.
7. The method according to any of claims 3 - 6 and wherein the comparing comprises comparing at least λ1/λ2 with λ3/λ4.
8. The method according to any of claims 1 - 7 and wherein the first asset comprises a still picture.
9. The method according to any of claims 1 - 7 and wherein the first asset comprises a video.
10. The method according to any of claims 1 - 9 and wherein the second asset comprises a still picture.
11. The method according to any of claims 1 - 9, and wherein the second asset comprises a video.
12. The method of any of claims 1 - 11, and wherein the locating the first at least one color mass center for at least one color is performed by performing the following steps: for at least one frame in the first asset, summing all color values for the at least one color for each pixel in the at least one frame, the sum denoted S; for at least one frame in the first asset, weighting each color value for each pixel in the frame by a pixel-associated x-coordinate, such that for each pixel in the at least one frame and for the at least one color, an associated value R(x,y) * x, representing a weighted value, is determined, the sum of the weighted values denoted J(x,y); determining an x-coordinate center of mass, denoted X, by dividing J(x,y) by S; for the at least one frame, weighting each color value for each pixel in the at least one frame by a pixel-associated y-coordinate, such that for each pixel in the at least one frame and for the at least one color, an associated value R(x,y) * y, representing a weighted value, is determined, the sum of the weighted values denoted K(x,y); and determining a y-coordinate center of mass, denoted Y, by dividing K(x,y) by S.
13. The method of claim 12 and further comprising: for the at least one frame and for the at least one color, tabulating, for each color value of each pixel, a table of weighted pixel values, such that the sum of resultant weighted color values divided by S are denoted:
15. The method of claim 14 and further comprising: determining the eigenvalues λ^ and ^ °f tne matrix A for each at least one color.
16. The method according to claim 15 and wherein the comparing the second moments matrix of the first asset with the second moments matrix of the second asset comprises comparing λ1 and λ2 for each at least one color.
17. The method according to any of claims 1 - 16 and wherein the at least one color comprises a Red-Green-Blue color element.
18. The method according to any of claims 1 - 16 and wherein the at least one color comprises a chrominance / luminance color element.
19. The method according to claim 18 and wherein the chrominance / luminance color element comprises a YCbCr chrominance / luminance color element.
20. The method according to claim 18 and wherein the chrominance / luminance color element comprises a YPbPr chrominance / luminance color element.
21. The method according to claim 18 and wherein the chrominance / luminance color element comprises a YDbDr chrominance / luminance color element.
22. The method according to claim 18 and wherein the chrominance / luminance color element comprises an xvYCC chrominance / luminance color element.
23. The method according to any of claims 1 - 16 and wherein the at least one color comprises a gray scale value.
24. The method according to any of claims 1 - 23 and wherein the post-processing comprises identifying a source of the second asset.
25. The method according to claim 24 and further comprising billing the source of the second asset.
26. The method according to claim 24 and further comprising enforcing copyright on the second asset.
27. A method for matching assets, the method comprising: providing a reference database, the reference database comprising a plurality of video clips; providing a video asset, the video asset comprising a plurality of video frames; subdividing the video asset into a plurality of fragments; for at least one fragment among the plurality of fragments, identifying a plurality of candidate fragments comprised in the reference database, a candidate fragment comprising at least a portion of one of the plurality of video clips, the plurality of candidate fragments being potential matches to the at least one fragment; comparing each one of the plurality of candidate fragments to the at least one fragment; and identifying, as a result of the comparing, matching assets, and producing an indication suitable for post-processing, based, at least in part on the identifying, that matching assets have been identified.
28. The method according to claim 27, and wherein the providing a video asset comprises capturing a video asset.
29. The method according to either claim 27 or claim 28, and wherein the identifying a plurality of candidate fragments comprises performing Discrete Wavelet Analysis (DWA).
30. The method according to claim 29, and wherein the DWA is performed for each at least one fragment among the plurality of fragments, and each wavelet in the reference database is analyzed.
31. The method according to claim 30 and wherein: each fragment is represented by a discrete signal denoted D(t); each candidate fragment is a wavelet of a mother wavelet, denoted Φ(t); τ denotes a beginning of an interval; and c and A denote constants; then, for a time interval, denoted t, in which D(τ + t) = A * Φ(ct), a candidate fragment is identified.
32. The method according to claim 31 and wherein t ranges from 0.5 seconds to 3 seconds.
33. The method according to either claim 31 or claim 32 and wherein A is large.
34. The method according to any of claims 27 - 31, and wherein the comparing comprises comparing using a correlation function, denoted C.
35. The method according to claim 34 and further comprising: representing a candidate fragment as X(i), and a fragment as Y(i), X(i) and Y(i) each comprising a one-dimensional discrete video signal; and providing: an offset, denoted a; a step, denoted s, the step s comprising an integer determining a correlation step, such that when s = 1, every frame is checked and when s > 1, only 1 out of s frames are checked; and a duration, denoted T, with N = T/s, such that:

    C_{a,s,T}(X, Y) = ( (1/N) * Σ_{i=0}^{N-1} X_{is} * Y_{a+is} - mean(X) * mean(Y) ) / (σ_X * σ_Y)

where mean(X) = (1/N) * Σ_{i=0}^{N-1} X_{is}; mean(Y) = (1/N) * Σ_{i=0}^{N-1} Y_{a+is}; σ_X^2 = (1/N) * Σ_{i=0}^{N-1} (X_{is})^2 - mean(X)^2; and σ_Y^2 = (1/N) * Σ_{i=0}^{N-1} (Y_{a+is})^2 - mean(Y)^2.
36. The method according to claim 35 and wherein C_{a,s,T}(X, Y) ≤ 1.
37. The method according to either claim 35 or claim 36 and wherein C is invariant to affine transformations with positive coefficients.
38. The method according to any of claims 35 - 37 and wherein candidate fragment X(i) and the fragment Y(i) do not match if C = 0.
39. The method according to any of claims 35 - 38 and wherein candidate fragment X(i) and the fragment Y(i) do match if C = 1.
40. The method according to any of claims 35 - 39 and wherein the representing candidate fragment as X(i), comprises determining a vector comprising average Luma (Y) for each frame, i, in candidate fragment X.
41. The method according to any of claims 35 - 40 and wherein the representing fragment as Y(i), comprises determining a vector comprising average Luma (Y) for each frame, i, in fragment Y.
42. The method according to any of claims 35 - 39 and wherein the representing candidate fragment as X(i), comprises determining a vector comprising an average chrominance component for each frame, i, in candidate fragment X.
43. The method according to any of claims 35 - 39 or 42 and wherein the representing fragment as Y(i), comprises determining a vector comprising average chrominance component for each frame, i, in fragment Y.
44. The method according to either of claims 42 or 43 and wherein the chrominance component comprises a CbCr chrominance color element.
45. The method according to either of claims 42 or 43 and wherein the chrominance component comprises a PbPr chrominance color element.
46. The method according to either of claims 42 or 43 and wherein the chrominance component comprises a DbDr chrominance color element.
47. The method according to either of claims 42 or 43 and wherein the chrominance component comprises an xvYCC chrominance color element.
48. The method according to any of claims 35 - 39 and further comprising: calculating a color mass center for one of a plurality of color elements for each frame, i, in candidate fragment X; deriving, based on the color mass center of each frame, i, in candidate fragment X, a second moments matrix for each frame, i, in candidate fragment X; and evaluating each second moments matrix for each frame, i, in candidate fragment X, the evaluating comprising determining eigenvalues λ1 and λ2, wherein the representing candidate fragment as X(i), comprises determining a vector comprising a plurality of eigenvalues λ1 and λ2.
49. The method according to any of claims 35 - 39 or 48 and further comprising: calculating a color mass center for one of a plurality of color elements for each frame, i, in fragment Y; deriving, based on the color mass center of each frame, i, in fragment Y, a second moments matrix for each frame, i, in fragment Y; and evaluating each second moments matrix for each frame, i, in fragment Y, the evaluating comprising determining eigenvalues λ3 and λ4, wherein the representing the fragment as Y(i), comprises determining a vector comprising a plurality of eigenvalues λ3 and λ4.
50. The method according to any of claims 35 - 39 and further comprising: calculating a color mass center for one of a plurality of color elements for each frame, i, in candidate fragment X; deriving, based on the color mass center of each frame, i, in candidate fragment X, a second moments matrix for each frame, i, in candidate fragment X; and evaluating each second moments matrix for each frame, i, in candidate fragment X, the evaluating comprising determining eigenvalues λ1 and λ2, wherein the representing candidate fragment as X(i), comprises determining a vector comprising a plurality of ratios of the eigenvalues λ1 and λ2.
51. The method according to any of claims 35 - 39 or 50 and further comprising: calculating a color mass center for one of a plurality of color elements for each frame, i, in fragment Y; deriving, based on the color mass center of each frame, i, in fragment Y, a second moments matrix for each frame, i, in fragment Y; and evaluating each second moments matrix for each frame, i, in fragment Y, the evaluating comprising determining eigenvalues λ3 and λ4, wherein the representing the fragment as Y(i), comprises determining a vector comprising a plurality of ratios of the eigenvalues λ3 and λ4.
52. The method according to any of claims 35 - 39 and further comprising: calculating a color mass center for one of a plurality of color elements for each frame, i, in candidate fragment X; deriving, based on the color mass center of each frame, i, in candidate fragment X, a second moments matrix for each frame, i, in candidate fragment X; and evaluating each second moments matrix for each frame, i, in candidate fragment X, the evaluating comprising determining eigenvalues λ1 and λ2, wherein the representing candidate fragment as X(i), comprises choosing an angle φ between a horizontal axis, Ox, and one eigenvector corresponding to λ1, and determining a vector comprising a plurality of angles φ.
53. The method according to any of claims 35 - 39 or 52 and further comprising: calculating a color mass center for one of a plurality of color elements for each frame, i, in fragment Y; deriving, based on the color mass center of each frame, i, in fragment Y, a second moments matrix for each frame, i, in fragment Y; and evaluating each second moments matrix for each frame, i, in fragment Y, the evaluating comprising determining eigenvalues λ3 and λ4, wherein the representing the fragment as Y(i), comprises choosing an angle φ between a horizontal axis, Ox, and one eigenvector corresponding to λ3, and determining a vector comprising a plurality of angles φ.
54. The method according to any of claims 27 - 53 and wherein the post-processing comprises identifying a source of at least one of the matching assets.
55. The method according to claim 54 and further comprising billing the source of one of the matching assets.
56. The method according to claim 54 and further comprising enforcing copyright on at least one of the matching assets.
57. A system for matching assets, the system comprising: a first asset comprising at least one frame; a second asset comprising at least one frame; a color mass center locator operative to locate: a first at least one color mass center, the first at least one color mass center being a color mass center in the first asset for at least one color; and a second at least one color mass center, the second at least one color mass center being a color mass center in the second asset for said at least one color; a comparator, the comparator comparing the first at least one color mass center and the second at least one color mass center; and a processor operative to determine, based, at least in part, on a result of the comparing, whether the first asset and the second asset represent different instantiations of a single asset and, if so, to produce an indication suitable for postprocessing, based, at least in part on the determining, that the second asset matches the first asset.
58. A system for matching assets, the system comprising: a reference database, the reference database comprising a plurality of video clips; a video asset, the video asset comprising a plurality of video frames; a video fragmenter operative to subdivide the video asset into a plurality of fragments; a first processor operative to identify, for at least one fragment among the plurality of fragments, a plurality of candidate fragments comprised in the reference database, a candidate fragment comprising at least a portion of one of the plurality of video clips, the plurality of candidate fragments being potential matches to the at least one fragment; a comparator comparing each one of the plurality of candidate fragments to the at least one fragment; and a second processor operative to identify, as a result of the comparing, matching assets, and to produce an indication suitable for post-processing, based, at least in part on the identifying, that matching assets have been identified.
Respectfully submitted,
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/IB2007/053660 WO2009034419A1 (en) | 2007-09-11 | 2007-09-11 | System and method for characterizing data |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/IB2007/053660 WO2009034419A1 (en) | 2007-09-11 | 2007-09-11 | System and method for characterizing data |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2009034419A1 true WO2009034419A1 (en) | 2009-03-19 |
Family
ID=39410486
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/IB2007/053660 (WO2009034419A1, Ceased) | System and method for characterizing data | 2007-09-11 | 2007-09-11 |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2009034419A1 (en) |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 07826342; Country of ref document: EP; Kind code of ref document: A1 |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 07826342; Country of ref document: EP; Kind code of ref document: A1 |