
WO2005048133A1 - Image processing method, image processing system, and image processing program - Google Patents

Image processing method, image processing system, and image processing program

Info

Publication number
WO2005048133A1
WO2005048133A1 (PCT/JP2004/016946, application JP2004016946W)
Authority
WO
WIPO (PCT)
Prior art keywords
image
area
representative color
image processing
feature
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/JP2004/016946
Other languages
English (en)
Japanese (ja)
Inventor
Johji Tajima
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
NEC Corp
Original Assignee
NEC Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by NEC Corp filed Critical NEC Corp
Publication of WO2005048133A1
Current legal status: Ceased

Links

Classifications

    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/90Determination of colour characteristics
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/50Information retrieval; Database structures therefor; File system structures therefor of still image data
    • G06F16/58Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
    • G06F16/583Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using metadata automatically derived from the content
    • G06F16/5838Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using metadata automatically derived from the content using colour
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/11Region-based segmentation
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20092Interactive image processing based on input by user

Definitions

  • Image processing method, image processing system, and image processing program
  • The present invention relates to an image processing method, an image processing system, and an image processing program.
  • Conventionally, similar-image searches are performed using features such as the color arrangement of an entire image.
  • There is also a technique in which an image is divided into a number of non-overlapping rectangular blocks, a plurality of dominant colors are extracted from each block, and an image is searched based on the extracted colors (for example, Japanese Patent Application Laid-Open No. 2000-348179 (Reference 1)).
  • Consider, for example, a case in which a flower appears in an acquired image, and a search database in which a plurality of kinds of flowers are registered is searched for similar images based on the shape and color of the petals.
  • In such a search, the color and texture characteristics of the petals in the acquired image and in the database images are compared.
  • However, a normal image also contains the leaves and branches around the flower, or the background, so comparing the entire images will not produce a correct search. For this reason, it is indispensable to distinguish the flower portion of the image from the leaves and background and to cut out the flower portion, which is the object.
  • An object of the present invention is to reduce the burden on a user when cutting out an object in an image.
  • To achieve this object, the image processing method of the present invention comprises: a step of acquiring an image of an object; a step of dividing the acquired image into a plurality of areas based on the arrangement of colors and creating a representative color image in which the color of each area is replaced with the representative color of the area; a step of displaying the created representative color image; and a step of setting a designated area in the displayed representative color image as an object area.
  • Similarly, the image processing system of the present invention comprises: an image acquisition means for acquiring an image of an object; a representative color image creating means for dividing the acquired image into a plurality of regions based on the color arrangement and creating a representative color image in which the color of each region is replaced with the representative color of the region; an image display means for displaying the created representative color image; an area specifying means for specifying an arbitrary area of the displayed representative color image; and an object area registering means for setting the area specified by the area specifying means as an object area.
  • Likewise, the image processing program of the present invention causes a computer to execute: a step of acquiring an image of an object; a step of dividing the acquired image into a plurality of regions based on the arrangement of colors and creating a representative color image in which the color of each region is replaced with the representative color of the region; a step of displaying the created representative color image; and a step of setting a designated area in the displayed representative color image as an object area.
  • In this way, a captured image is divided into a plurality of regions, a representative color image is created by replacing the color of each region with its representative color, and the representative color image is displayed to the user.
  • FIGS. 1A to 1C are diagrams showing an example of an image explaining extraction of a representative color from an acquired image and determination of an object area.
  • FIG. 2 is a block diagram showing a configuration of Embodiment 1 of the present invention.
  • FIG. 3 is a flowchart showing an example of an operation flow of Embodiment 1 of the present invention.
  • FIG. 4 is an explanatory diagram of a representative color extraction method.
  • FIG. 5 is a block diagram showing functions of a monitor.
  • FIG. 6 is a block diagram showing an example of a function of an object area registration unit.
  • FIG. 7 is a flowchart showing another example of the operation flow of Embodiment 1 of the present invention.
  • FIG. 8 is a block diagram showing another example of the function of the object area registration means.
  • FIG. 9 is a block diagram showing a configuration example of an object searching means.
  • FIG. 10 is a block diagram showing a configuration example of a feature extraction unit.
  • FIG. 11 is an explanatory diagram of a texture feature calculation unit.
  • FIG. 12 is a block diagram showing a configuration of Embodiment 2 of the present invention.
  • FIG. 13 is a block diagram showing a configuration of Embodiment 3 of the present invention.
  • FIG. 1A shows an original acquired image. Although the image is shown here in grayscale, it is actually a color image with 8-bit information in each of the R, G, and B bands.
  • To search for the flower in this image, the flower portion should be extracted.
  • FIG. 1B shows the corresponding representative color image. It is drawn here as a grayscale image with a small number of gradations, but in practice each portion drawn with one gradation is represented by one representative color.
  • Four of these regions, 21 to 24, correspond to the flower portion, and the other representative color regions correspond to the background and leaf regions. Therefore, if the four regions 21 to 24 are extracted, most of the flower portion, which is the search target region, can be cut out. Pointing at the four regions 21 to 24 with a pointing device while viewing the representative color image is usually an acceptable effort for the user. If the target region is cut out in this way, searching a database for images similar to that part is highly likely to produce the desired search result.
  • The present invention realizes this basic concept.
  • First, an image of the object is captured by the image acquisition means 1, such as a digital camera (step S1 in FIG. 3).
  • This image is called the acquired image 2.
  • The acquired image 2 normally has 8 bits of information for each of the red (R), green (G), and blue (B) bands of an image sensor such as a camera, so about 16.77 million (2^24) colors are possible.
  • The representative color image creating means 3 applies representative color extraction processing, such as that described in Reference 2, to the acquired image 2, extracts a small number of representative colors, for example about 10, from the colors contained in the acquired image 2, and creates the representative color image 4 (step S2 in FIG. 3).
  • This processing is explained with reference to FIG. 4, which shows the color distribution of an image, usually represented in three dimensions, in two dimensions for simplicity: the color space is drawn in the two dimensions R and G instead of the three dimensions R, G, and B.
  • The (R, G) value of each pixel of the image is plotted in this color space.
  • Clusters 111, 112, 113, and 114, where the distribution frequency is high, are indicated by the hatched pattern.
  • The representative color image creating means 3 divides this color space successively with dividing lines 121, 122, and 123, and calculates the average colors 131, 132, 133, and 134 of the colors present in each divided area.
  • This division can be repeated as many times as necessary, with the average color of each area taken as the representative color of that area.
  • A threshold is set for the color difference between the two representative colors produced by a division, and the division is stopped when the color difference falls below that threshold. By setting the threshold appropriately, a small number of representative colors, such as about 10, can be extracted as described above.
  • The representative color image 4 is then obtained by determining, for each pixel of the acquired image 2, the divided area that contains its value and replacing the pixel value with the representative color of that area.
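  • As a concrete illustration, the following Python sketch performs a representative color extraction of this general kind and builds the representative color image. The variance-based splitting rule and the threshold value are assumptions for illustration; the exact algorithm of Reference 2 is not reproduced here.

```python
import numpy as np

def extract_representative_colors(pixels, threshold=30.0):
    """Recursively split the color distribution (an N x 3 array of RGB
    values) and keep one average color per final region."""
    regions, colors = [pixels], []
    while regions:
        region = regions.pop()
        mean = region.mean(axis=0)
        axis = region.var(axis=0).argmax()      # split axis: largest variance
        lo = region[region[:, axis] <= mean[axis]]
        hi = region[region[:, axis] > mean[axis]]
        # Stop when the region cannot be split, or when the two candidate
        # representative colors differ by less than the color threshold.
        if len(lo) == 0 or len(hi) == 0 or \
                np.linalg.norm(lo.mean(axis=0) - hi.mean(axis=0)) < threshold:
            colors.append(mean)
        else:
            regions += [lo, hi]
    return np.array(colors)

def make_representative_color_image(image, colors):
    """Replace every pixel of an H x W x 3 image with its nearest
    representative color, yielding the representative color image."""
    flat = image.reshape(-1, 3).astype(float)
    dists = np.linalg.norm(flat[:, None, :] - colors[None, :, :], axis=2)
    return colors[dists.argmin(axis=1)].reshape(image.shape).astype(np.uint8)
```

With an RGB image loaded as a NumPy array `img`, calling `extract_representative_colors(img.reshape(-1, 3).astype(float))` and passing the result to `make_representative_color_image(img, colors)` produces the image to be displayed.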
  • The representative color image 4 created in this way is displayed on the monitor 5 (image display means 51 in FIG. 5) together with the acquired image 2 (step S3 in FIG. 3).
  • The user observes the displayed acquired image 2 and representative color image 4.
  • The user then designates positions (x, y) of the representative color areas included in the petals of the representative color image 4 with the pointing means (area specifying means) 6, such as a mouse (step S4a in FIG. 3).
  • In this example, the four representative color areas 21 to 24 correspond to the petals, so the user specifies some pixels belonging to each of these four areas. For example, suppose that one pixel is designated from each of the four areas 21 to 24, at pixel positions (x1, y1), (x2, y2), (x3, y3), and (x4, y4).
  • From the designated positions, the object region registration means 7 creates an object region image 8 in which, for example, the pixels of the object to be extracted have the value '1' and the other pixels have the value '0' (step S5a in FIG. 3).
  • The object area image 8 is created as follows.
  • First, the representative colors at the designated pixel positions are detected by the representative color detection means 71 in FIG. 6, and the pixels whose values match those representative colors are found. The area composed of the detected pixels is then set as the object area by the object area determining means 73 in FIG. 6: '1' is written into the corresponding pixel area of the object area image 8, and '0' is written into the other pixel areas (step S53 in FIG. 3). This process was applied to the example of FIG. 1B, and FIG. 1C shows the resulting object region image 8, with the '1' part shown in gray and the '0' part in black.
  • The object region need not be limited to pixels whose values exactly match the representative colors detected by the representative color detection means 71; it may also include pixels whose values fall within a predetermined range of those colors.
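  • As an illustration, the following Python sketch (function names are ours, not the patent's) builds the binary object area image from user-designated positions, assuming exact color matches and omitting the predetermined-range relaxation:

```python
import numpy as np

def make_object_area_image(rep_image, clicked_positions):
    """Object area image 8 as a binary mask: 1 where a pixel's
    representative color equals the color found at any of the
    user-designated (x, y) positions, 0 elsewhere."""
    designated = {tuple(rep_image[y, x]) for (x, y) in clicked_positions}
    mask = np.zeros(rep_image.shape[:2], dtype=np.uint8)
    for color in designated:
        match = np.all(rep_image == np.asarray(color, dtype=rep_image.dtype),
                       axis=2)
        mask[match] = 1
    return mask
```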
  • Next, the object retrieving means 9 extracts features, such as color and texture, from the acquired image 2 only for the object area in which the pixel value of the object area image 8 is '1', compares them with the features of various registered objects, and executes an image search (step S6 in FIG. 3). Finally, the search result is displayed on the monitor 5 (search result display means 52 in FIG. 5) (step S7 in FIG. 3).
  • In the above flow, the user indicates the representative color areas corresponding to the object with the pointing means 6; because the number of representative colors is small, the user can do this easily.
  • Alternatively, the extracted representative colors may be displayed with numbers along the edge of the screen of the monitor 5, and the user, while looking at the color of the area where the object appears to be on the screen, may enter the number of the corresponding representative color with a key. Even in this case, if there are only about 10 representative colors, the user can easily specify the number of the required representative color area. This variant realizes the same function even when the system implementing the present invention does not have the pointing means 6.
  • In this case, the object area image 8 is created as follows (step S5b in FIG. 7). First, the pixels whose values match one of the designated representative colors are detected by the pixel detecting means 76 in FIG. 8 (step S56 in FIG. 7). Then, the object region determining means 77 in FIG. 8 creates the object region image 8 by taking the region composed of the detected pixels as the object region (step S57 in FIG. 7).
  • The configuration of the object searching means 9 varies depending on the application; an example configuration for the flower search described above is explained with reference to FIG. 9.
  • The object searching means 9 includes a feature extracting means 91, a feature storing means 92, and a feature comparing means 93.
  • The feature extracting means 91 extracts image features from those pixels of the acquired image 2 whose value in the object area image 8 is '1', that is, the pixels set as the object area (step S61 in FIG. 3).
  • The feature storage means 92 stores image features for each type of flower (object), extracted in the same manner when the objects were learned. For example, image features of at least one flower (object) such as "chrysanthemum" or "camellia" are stored.
  • The image features stored in the feature storage means 92 are referred to as reference data.
  • The feature comparison means 93 compares the image features obtained by the feature extraction means 91 with the reference data stored in the feature storage means 92, and finds the closest flower (object) in the reference data.
  • The search result is then output (step S62 in FIG. 3).
  • As the features used for the search, optimum ones may be selected according to the application.
  • In the feature extracting means 91 (see FIG. 10), the acquired image 2 is first enlarged or reduced by size normalization so that the flower portion becomes a predetermined size.
  • The size of the flower portion can be determined by examining the coordinates of the contour of the object in the object area image 8 and finding their limit values.
  • Next, the brightness normalizing means 912 normalizes the brightness of the flower portion, which depends on the conditions at the time of shooting. For example, the (R, G, B) values of the acquired image 2 are examined for the pixels whose value in the object area image 8 is '1', and the maximum of these values is taken as RGBmax; each (R, G, B) value is then normalized by RGBmax, for example as R' = 255 × R / RGBmax, and similarly for G' and B'.
  • The image subjected to these two normalizations is referred to as the normalized image 913; the values of R', G', and B' are represented by 8-bit integers.
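  • A sketch of the two normalizations follows; the output size, the nearest-neighbour rescaling, and the exact form of the brightness formula are assumptions, since the corresponding equation is only referenced in the text:

```python
import numpy as np

def normalize_object(image, mask, out_size=128):
    """Size normalization (crop the object's bounding box, rescale to a
    fixed size) followed by brightness normalization by RGBmax, the
    maximum R, G or B value found inside the object area."""
    ys, xs = np.nonzero(mask)
    crop = image[ys.min():ys.max() + 1, xs.min():xs.max() + 1].astype(float)
    mcrop = mask[ys.min():ys.max() + 1, xs.min():xs.max() + 1]
    # Nearest-neighbour rescale to out_size x out_size (kept simple here).
    yi = np.arange(out_size) * crop.shape[0] // out_size
    xi = np.arange(out_size) * crop.shape[1] // out_size
    crop, mcrop = crop[yi][:, xi], mcrop[yi][:, xi]
    rgb_max = crop[mcrop == 1].max()          # RGBmax over the object pixels
    norm = np.clip(255.0 * crop / rgb_max, 0, 255).astype(np.uint8)
    return norm, mcrop
```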
  • The cumulative color histogram calculation means 914 calculates a cumulative histogram for each of R', G', and B'.
  • The cumulative histogram f(R'') for R' is the total number of pixels for which the pixel value in the object area image 8 is '1' and the value of R' in the normalized image 913 is less than or equal to R''.
  • The cumulative histograms f(G'') and f(B'') are calculated in the same way.
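  • These cumulative histograms have a direct NumPy rendering (256 bins per channel, matching the 8-bit normalized values):

```python
import numpy as np

def cumulative_color_histograms(norm_image, mask):
    """For each channel c, f_c(v) = number of object pixels whose
    normalized value in that channel is <= v, for v = 0..255."""
    feats = []
    for c in range(3):
        vals = norm_image[..., c][mask == 1]
        feats.append(np.cumsum(np.bincount(vals, minlength=256)))
    return feats  # [f(R''), f(G''), f(B'')]
```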
  • The texture feature calculation means 915 performs a discrete Fourier transform on the image of the lightness Y' calculated from R', G', and B' (for example, as in equation (2)) and calculates the power at each wave number.
  • FIG. 11 schematically shows this wave number plane.
  • The positions indicated by solid circles are the wave numbers (kx, ky) at which the power is calculated.
  • The obtained power values are divided into a plurality of regions 9152 and 9153, concentric around the DC component 9151.
  • The powers of the wave numbers included in each region are added: the absolute value of the wave number, k = √(kx² + ky²), is obtained by equation (3), and the summed power p(k) for each k is taken as the texture feature amount. Because the imaging direction cannot be kept constant relative to the orientation of the flower, this summation absorbs variations in the imaging direction.
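  • The following sketch computes such a rotation-insensitive texture feature. The BT.601 lightness weights and the number of concentric bins are assumptions, since equations (2) and (3) are referenced but not reproduced in the text:

```python
import numpy as np

def texture_feature(norm_image, n_bins=16):
    """Radially binned Fourier power of the lightness image: the power of
    all wave numbers with the same |k| = sqrt(kx^2 + ky^2) is summed,
    which absorbs variations in the imaging direction."""
    r, g, b = (norm_image[..., c].astype(float) for c in range(3))
    y = 0.299 * r + 0.587 * g + 0.114 * b      # assumed lightness Y'
    power = np.abs(np.fft.fftshift(np.fft.fft2(y))) ** 2
    h, w = y.shape
    ky, kx = np.indices((h, w))
    k = np.hypot(ky - h // 2, kx - w // 2)     # |k| relative to the DC term
    k_bin = (k / k.max() * (n_bins - 1)).astype(int)
    return np.bincount(k_bin.ravel(), weights=power.ravel(), minlength=n_bins)
```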
  • The feature comparing means 93 compares the four features calculated by the feature extracting means 91 from the acquired image 2 of the flower, which is the object, with the reference data stored in the feature storing means 92.
  • If the above four feature quantities for the reference data "chrysanthemum" are denoted f_r1(R''), f_g1(G''), f_b1(B''), and p_1(k), a distance D between the features of the acquired image and this reference data can be calculated, for example as the Euclidean distance of equation (4).
  • When the feature storage means 92 stores such reference data for N kinds of flowers, the name of the flower with the smallest value of D is retrieved as the kind of flower in the acquired image 2.
  • Alternatively, all kinds of flowers for which the value of D is smaller than a predetermined threshold may be returned as the search result.
  • The process performed by the feature comparison means 93 is a conventionally known pattern recognition process in a feature space.
  • The use of the color cumulative histograms and the Fourier power as search features, and of the Euclidean distance of equation (4) to determine the search results, is shown only as an example of a straightforward implementation; the search can be enhanced by using other features, or by using more complex distances such as the Mahalanobis distance instead of the Euclidean distance.
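  • A sketch of this comparison, with a plain dictionary standing in for the feature storage means 92 and the Euclidean distance playing the role of equation (4):

```python
import numpy as np

def search_objects(query_feats, reference_db):
    """Rank reference objects by the Euclidean distance D between the
    concatenated query features and each entry's stored features."""
    q = np.concatenate([np.ravel(f).astype(float) for f in query_feats])
    scored = []
    for name, feats in reference_db.items():
        r = np.concatenate([np.ravel(f).astype(float) for f in feats])
        scored.append((float(np.linalg.norm(q - r)), name))
    return sorted(scored)          # smallest D first, i.e. best match first
```

Here `reference_db` might map names such as "chrysanthemum" and "camellia" to their stored feature lists; returning the full ranking also supports the thresholded variant described above.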
  • As described above, in this embodiment a representative color extraction process is applied to the acquired image 2 to create the representative color image 4, and the representative color image 4 is presented to the user.
  • The user can therefore easily specify the target portion of the acquired image 2, and the object in the acquired image 2 can be cut out with little burden on the user.
  • Furthermore, image features are calculated for the object portion specified by the user, and objects are searched for by comparing these features. As a result, it becomes practical to search for an object in an image, which was previously difficult because the object was hard to cut out.
  • In Embodiment 2 of the present invention, image acquisition is performed by a portable system capable of imaging and communicating, such as a camera-equipped mobile phone, and image processing and retrieval are performed on a fixed system such as a server computer.
  • A user operates the terminal unit 100.
  • The object to be searched for is imaged by the image acquisition means 1 to obtain the acquired image 2.
  • The acquired image 2 is confirmed on the monitor 5 and transmitted to the processing unit 200 via the communication means 101.
  • The processing unit 200 receives the acquired image 2 via the communication means 201.
  • The representative color image creating means 3 extracts representative colors from the acquired image 2 and creates the representative color image 4, as in the first embodiment.
  • The representative color image 4 is sent back to the terminal unit 100 via the communication means 201.
  • The representative color image 4 received via the communication means 101 is displayed on the monitor 5.
  • The user, observing the monitor 5, uses the pointing means 6 to specify at least one pixel position (xi, yi) in each object area to be searched for in the representative color image 4.
  • The target area designation means 102 transmits this position information to the processing unit 200 via the communication means 101.
  • The processing unit 200 obtains this position information via the communication means 201.
  • The object area registration means 7 refers to the representative color image 4 using the position information, detects the representative colors corresponding to the object, and creates the object area image 8, as in the first embodiment.
  • The object search means 9 extracts features such as color and texture from the acquired image 2, only for the object region in which the pixel value of the object region image 8 is '1', and performs a search by comparing them with the reference data of registered objects.
  • The search result is transmitted to the terminal unit 100 via the communication means 201.
  • The terminal unit 100 passes the search result received via the communication means 101 to the search result display means 103.
  • The search result display means 103 displays the name and other information of the retrieved object on the monitor 5.
  • In this way, a framework is realized in which a user who has acquired an image with a camera-equipped mobile terminal or mobile phone sends the acquired image to a server, advanced image search processing is performed in the server, and the search result is sent back to the terminal and presented to the user.
  • Compared with the first embodiment, the communication means 101 and 201 are required, and the same acquired image 2 and representative color image 4 must be held in both the terminal unit 100 and the processing unit 200, which may seem redundant.
  • However, the terminal unit 100 is usually a mobile phone or mobile terminal and often contains a low-power computer with low processing capacity. Since the representative color extraction and feature extraction performed in the present invention involve a large amount of processing, it may be difficult for such a computer to execute them at high speed. For this reason, in the present embodiment only the simple processing of image acquisition and target area designation is performed at the terminal unit 100, while the heavy processing is executed by the processing unit 200, so that high-performance processing can be executed at high speed.
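  • To make this division of labour concrete, the following sketch strings the functions from the earlier sketches together as an in-process stand-in for the processing unit 200. The class and method names are hypothetical, and a real system would place the communication means 101 and 201 (a network transport) between the two calls:

```python
class ProcessingServer:
    """Stand-in for processing unit 200, reusing the sketch functions
    defined above; all heavy processing runs on the server side."""

    def __init__(self, reference_db):
        self.reference_db = reference_db
        self.image = None
        self.rep_image = None

    def receive_image(self, image):
        # Heavy step 1: build the representative color image on the server.
        colors = extract_representative_colors(
            image.reshape(-1, 3).astype(float))
        self.image = image
        self.rep_image = make_representative_color_image(image, colors)
        return self.rep_image     # sent back to the terminal for display

    def receive_positions(self, positions):
        # Heavy step 2: register the object area, extract features, search.
        mask = make_object_area_image(self.rep_image, positions)
        norm, m = normalize_object(self.image, mask)
        feats = cumulative_color_histograms(norm, m) + [texture_feature(norm)]
        return search_objects(feats, self.reference_db)  # result to terminal
```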
  • In FIG. 13, the same components as those shown in FIG. 2 are denoted by the same reference numerals as in FIG. 2.
  • The third embodiment of the present invention includes a computer 300 that operates under program control, the image acquisition means 1, the monitor 5, and the pointing means 6.
  • The computer 300 includes an arithmetic processing unit 301, a storage unit 302, and interface units (I/F) 303.
  • The I/F units 303 connect the computer 300 to the external devices: the image acquisition means 1, the monitor 5, and the pointing means 6.
  • An image processing program 305 for controlling the operation of the computer 300 is provided recorded on a magneto-optical disk, a semiconductor memory, or another recording medium 306.
  • When this recording medium 306 is connected to an I/F 303, the arithmetic processing unit 301 reads the image processing program 305 and stores it in the storage unit 302. Thereafter, the arithmetic processing unit 301 executes the image processing program 305 stored in the storage unit 302 to realize the representative color image creating means 3, the object region registering means 7, and the object searching means 9 shown in FIG. 2 and FIG. 12. The image processing program 305 may also be provided via a data communication network such as the Internet.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Library & Information Science (AREA)
  • Physics & Mathematics (AREA)
  • Databases & Information Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Processing Or Creating Images (AREA)
  • Image Analysis (AREA)
  • Facsimiles In General (AREA)
  • Image Processing (AREA)

Abstract

An image (2) in which an object appears is acquired (step S1). The acquired image (2) is divided into a plurality of areas based on the arrangement of colors, and a representative color image (4) is created by replacing the colors of these areas with the representative colors of the areas (step S2). The representative color image (4) is displayed on a monitor together with the acquired image (2) (step S3). The user observes the acquired image (2) and the representative color image (4) and designates the areas of the representative color image (4) that correspond to the object (step S4). The designated areas are treated as object areas (step S5). This allows the user to easily designate the object portion of an image, and thus reduces the user's work when cutting the object out of the image.
PCT/JP2004/016946 2003-11-17 2004-11-15 Image processing method, image processing system, and image processing program Ceased WO2005048133A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2003-386373 2003-11-17
JP2003386373A JP2007052466A (ja) 2003-11-17 Image processing method, image processing system, and image processing program

Publications (1)

Publication Number Publication Date
WO2005048133A1 (fr) 2005-05-26

Family

ID=34587396

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2004/016946 Ceased WO2005048133A1 (fr) Image processing method, image processing system, and image processing program

Country Status (2)

Country Link
JP (1) JP2007052466A (fr)
WO (1) WO2005048133A1 (fr)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8129403B2 (en) 2005-02-16 2012-03-06 Astrazeneca Ab Chemical compounds
CN102810204A (zh) * 2012-07-03 2012-12-05 Tianjin University Parallelogram-based monocular-vision single-image positioning method
CN105303523A (zh) * 2014-12-01 2016-02-03 Vivo Mobile Communication Co., Ltd. Image processing method and mobile terminal

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009290694A (ja) * 2008-05-30 2009-12-10 Fujifilm Corp Imaging apparatus
JP2012118448A (ja) * 2010-12-03 2012-06-21 Sony Corp Image processing method, image processing apparatus, and image processing program
JP6089886B2 (ja) 2013-03-29 2017-03-08 Omron Corporation Region division method and inspection device
KR101764998B1 (ko) * 2016-02-16 2017-08-23 Line Corporation Image filtering method and system

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH021076A (ja) * 1988-01-28 1990-01-05 Nec Corp Method and apparatus for limited-color representation of color images
JP2002245456A (ja) * 2001-02-19 2002-08-30 Nec Corp Image feature generation apparatus and method, and storage medium recording an image feature generation program
JP2003067764A (ja) * 2001-08-23 2003-03-07 Nippon Telegr &amp; Teleph Corp &lt;Ntt&gt; Image processing method and apparatus, image retrieval method and apparatus, image processing program and recording medium therefor, and image retrieval program and recording medium therefor
JP2003122758A (ja) * 2001-10-11 2003-04-25 Canon Inc Image retrieval method and apparatus

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
KAWAKAMI H. ET AL: "Iro to Keijo no Joho o Mochiita Doro Hyoshiki Kenshutsu" [Road sign detection using color and shape information], IEICE Technical Report (The Institute of Electronics, Information and Communication Engineers), vol. 92, no. 227, 18 September 1992 (1992-09-18), pages 33-40, XP002989339 *

Also Published As

Publication number Publication date
JP2007052466A (ja) 2007-03-01

Similar Documents

Publication Publication Date Title
  • US11281712B2 (en) System, apparatus, method, program and recording medium for processing image
  • US10528820B2 (en) Colour look-up table for background segmentation of sport video
  • US8094935B2 (en) Representative color extracting method and apparatus based on human color sense and data histogram distributions
  • US20090245571A1 (en) Digital video target moving object segmentation method and system
  • EP3249606B1 (fr) Inverse tone mapping method and corresponding device
  • KR20070079330A (ko) Display control apparatus and method, computer program, and recording medium
  • US7652717B2 (en) White balance correction in digital camera images
  • KR100422709B1 (ko) Image-dependent face region extraction method
  • JP5164692B2 (ja) Image processing apparatus, image processing method, and program
  • CN114913463B (zh) Image recognition method and apparatus, electronic device, and storage medium
  • JP2011199671A (ja) Image processing apparatus and method, and program
  • CN107204034A (zh) Image processing method and terminal
  • JP2004310475A (ja) Image processing apparatus, mobile phone performing image processing, and image processing program
  • CN112102207A (zh) Temperature determination method and apparatus, electronic device, and readable storage medium
  • WO2005048133A1 (fr) Image processing method, image processing system, and image processing program
  • JP6977483B2 (ja) Image processing apparatus, image processing method, image processing system, and program
  • US10026201B2 (en) Image classifying method and image displaying method
  • WO2003063082A1 (fr) Moving image search apparatus
  • KR20230067216A (ko) Captured image display control apparatus
  • JP2002208013A (ja) Image region extraction apparatus and image region extraction method
  • JPH06251147A (ja) Video feature processing method
  • CN113554615A (zh) Image refinement processing method and apparatus, electronic device, and storage medium
  • CN118505593B (zh) Computer-vision-based image detection system
  • CN110266939B (zh) Display method, electronic device, and storage medium
  • CN120147114B (zh) Intelligent image scaling method

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
NENP Non-entry into the national phase

Ref country code: DE

WWW Wipo information: withdrawn in national office

Country of ref document: DE

122 Ep: pct application non-entry in european phase
NENP Non-entry into the national phase

Ref country code: JP

WWW Wipo information: withdrawn in national office

Country of ref document: JP