WO2016158274A1 - Image Processing Device - Google Patents
Image Processing Device
- Publication number
- WO2016158274A1 (PCT/JP2016/057445)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- pixel
- dark part
- image
- dark
- pixels
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/204—Image signal generators using stereoscopic image cameras
- H04N13/254—Image signal generators using stereoscopic image cameras in combination with electromagnetic radiation sources for illuminating objects
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B15/00—Special procedures for taking photographs; Apparatus therefor
- G03B15/02—Illuminating scene
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B17/00—Details of cameras or camera bodies; Accessories therefor
- G03B17/18—Signals indicating condition of a camera member or suitability of light
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B35/00—Stereoscopic photography
- G03B35/02—Stereoscopic photography by sequential recording
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/20—Image preprocessing
- G06V10/36—Applying a local operator, i.e. means to operate on image points situated in the vicinity of a given point; Non-linear local filtering operations, e.g. median filtering
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/40—Extraction of image or video features
- G06V10/60—Extraction of image or video features relating to illumination properties, e.g. using a reflectance or lighting model
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/168—Feature extraction; Face representation
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N1/00—Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
- H04N1/387—Composing, repositioning or otherwise geometrically modifying originals
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N1/00—Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
- H04N1/40—Picture signal circuits
- H04N1/407—Control or modification of tonal gradation or of extreme levels, e.g. background level
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N1/00—Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
- H04N1/40—Picture signal circuits
- H04N1/409—Edge or detail enhancement; Noise or error suppression
- H04N1/4092—Edge or detail enhancement
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N1/00—Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
- H04N1/46—Colour picture communication systems
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N1/00—Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
- H04N1/46—Colour picture communication systems
- H04N1/56—Processing of colour picture signals
- H04N1/60—Colour correction or control
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/204—Image signal generators using stereoscopic image cameras
- H04N13/207—Image signal generators using stereoscopic image cameras using a single 2D image sensor
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/70—Circuitry for compensating brightness variation in the scene
- H04N23/75—Circuitry for compensating brightness variation in the scene by influencing optical camera components
Definitions
- The present invention relates to an image processing apparatus that generates an image.
- Patent Document 1 discloses an imaging apparatus capable of obtaining an image in which desired lighting is applied to a desired portion of the subject.
- However, the method described above is difficult to apply to moving subjects because the subject must be imaged multiple times while the illumination position is changed. For example, a human subject may blink, squint if the illumination is dazzling, or move without being aware of it. Because the subject is displaced between the images, it is difficult to obtain a natural composite image.
- The present invention has been made in view of the above problems, and an object of the present invention is to provide an image processing apparatus that can be applied easily even when the subject moves and that can generate an image having a suitable stereoscopic effect.
- An image processing apparatus according to the present invention includes: a dark part pixel extraction unit that extracts dark part pixels from a target image; a dark part pixel correction unit that corrects the dark part pixels so as to suppress a change in the contrast of the target image; a dark part model generation unit that generates a dark part model according to the image corrected by the dark part pixel correction unit; and an image generation unit that applies, based on the dark part model generated by the dark part model generation unit, dark part pixels to the image corrected by the dark part pixel correction unit.
- According to the present invention, the processing can be applied easily even when the subject moves, and an image having a suitable stereoscopic effect can be generated.
- FIG. 1 is a functional block diagram illustrating the configuration of an image display device including an image processing device according to a first embodiment of the present invention. FIG. 2 is a diagram explaining the relationship between a pixel of interest and a neighboring pixel. FIG. 3 is a diagram explaining the index used when extracting a pixel of interest as a dark part pixel.
- FIG. 1 is a functional block diagram illustrating a configuration of an image display apparatus 102 including an image processing apparatus 101 according to the first embodiment of the present invention.
- The image display apparatus 102 images a subject with the imaging unit 103, generates an image having a suitable stereoscopic effect from the captured image, and displays the generated image on the display unit 104.
- The image display apparatus 102 includes the imaging unit 103, the display unit 104, a storage unit 105, the image processing apparatus 101, and an input/output unit 106.
- The imaging unit 103 includes an imaging lens and an imaging element such as a CCD (Charge Coupled Device) or CMOS (Complementary Metal Oxide Semiconductor) sensor, and can capture still images or moving images of the subject.
- The display unit 104 is a display screen such as an LCD (Liquid Crystal Display) or an organic EL (Electro Luminescence) display, and displays information such as images and characters, including images of the subject.
- The image processing apparatus 101 can be implemented by, for example, a CPU (Central Processing Unit) or a GPU (Graphics Processing Unit). The image processing apparatus 101 processes an image acquired from the imaging unit 103 or the storage unit 105 based on a user instruction acquired from the input/output unit 106, and outputs the processed image to at least one of the display unit 104 and the storage unit 105.
- The image processing apparatus 101 includes a dark part pixel extraction unit 107, a dark part pixel correction unit 108, a dark part model generation unit 109, and an image generation unit 110. Each of these units performs the following processing.
<Dark part pixel extraction processing>
- The dark part pixel extraction unit 107 extracts one or more dark part pixels from the image that the image processing apparatus 101 acquired from the imaging unit 103 or the storage unit 105 described above. Details of the dark part pixel extraction processing will be described later.
- A dark part pixel is a pixel that forms a shadow appearing on the subject in the acquired image, i.e., a portion where there is little change in luminance or color.
<Dark part pixel correction processing>
- The dark part pixel correction unit 108 generates a corrected image by correcting at least one of the one or more dark part pixels extracted by the dark part pixel extraction unit 107 so as to suppress a change in the contrast of the image. Details of the dark part pixel correction processing will be described later.
<Dark part model generation processing>
- The dark part model generation unit 109 generates a dark part model corresponding to the subject based on the dark part pixels extracted by the dark part pixel extraction unit 107. Details of the dark part model generation processing and the dark part model will be described later.
<Image generation processing>
- Based on the image (corrected image) corrected by the dark part pixel correction unit 108 and the dark part model generated by the dark part model generation unit 109, the image generation unit 110 generates an image in which dark part pixels are added to the corrected image. Details of the image generation processing will be described later.
- The storage unit 105 is, for example, a flash memory or a hard disk; it stores images and the dark part pixel templates on which dark part models are based, and also stores device-specific data.
- The input/output unit 106 uses key buttons, a microphone, a speaker, and the like to input user instructions to the image processing apparatus 101 and to output audio and other information from the image processing apparatus 101.
- The above is the system configuration of the first embodiment.
- FIG. 2 is a diagram for explaining the relationship between the pixel of interest 201 in the target image (the image to be processed) and the neighboring pixel 202 located in the vicinity of the pixel of interest 201.
- FIG. 3 is a diagram illustrating an index when a pixel of interest is extracted as a dark pixel using a color difference and a luminance difference.
- FIG. 4 is a diagram illustrating the relationship between the target pixel and the neighboring pixels when there are a plurality of neighboring pixels.
- The dark part pixel extraction unit 107 determines whether the pixel of interest 201 is a dark part pixel based on the color difference and the luminance difference between the pixel of interest 201 and the neighboring pixel 202. When the pixel of interest 201 is determined to be a dark part pixel, the dark part pixel extraction unit 107 extracts it as a dark part pixel.
- The dark part pixel extraction unit 107 calculates the color difference s_h between the pixel of interest 201 and the neighboring pixel 202 using Equation (1).
- Here, I(i, j) = (Ir(i, j), Ig(i, j), Ib(i, j)) is the pixel value of the (i, j) pixel in the target image I; it is regarded as a three-component vector and normalized to give N(i, j) = (Nr(i, j), Ng(i, j), Nb(i, j)).
- Each pixel value is represented by the three RGB components R (Red), G (Green), and B (Blue), each taking a value between 0 and 1.
- The variable i represents the position in the x-axis direction in the image, and the variable j represents the position in the y-axis direction, ranging from 0 to the number of vertical pixels of the image minus one. If the pixel of interest 201 is the (i, j) pixel, the neighboring pixel 202 is the (i + 1, j) pixel, and the pixel value of the neighboring pixel 202 is I(i + 1, j).
- The pixel value I(i + 1, j) is likewise regarded as a three-component vector and normalized to obtain N(i + 1, j). The color difference s_h takes a value between 0 and 1; the closer it is to 0, the smaller the color shift.
- The color difference s_h takes its minimum value of 0 when the two pixels have the same color. For example, when one of the pixel of interest 201 and the neighboring pixel 202 is achromatic and the other is red, the color difference s_h takes a value close to 0.9, near its maximum.
- In this way, the color difference between the pixel of interest 201 and the neighboring pixel 202 is expressed using the angle formed by the RGB values of the two pixels when they are regarded as vectors.
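- As a concrete illustration of this angle-based color difference, the following Python sketch normalizes the two RGB vectors and maps the angle between them to [0, 1]. Equation (1) itself is not reproduced in this text, so the exact normalization (dividing by the 90-degree maximum angle between non-negative RGB vectors) is an assumption, and the function name is illustrative.

```python
import numpy as np

def color_difference(p, q, eps=1e-8):
    """Angle-based color difference s_h between two RGB pixels, in [0, 1].

    p, q: RGB values with each channel in [0, 1].
    Returns 0 when the two colors point in the same direction in RGB space
    and grows toward 1 as the angle between the vectors increases.
    (Equation (1) is not reproduced in the text, so the normalization by
    the 90-degree maximum angle is an assumption.)
    """
    n_p = np.asarray(p, dtype=float)
    n_q = np.asarray(q, dtype=float)
    n_p = n_p / (np.linalg.norm(n_p) + eps)   # N(i, j)
    n_q = n_q / (np.linalg.norm(n_q) + eps)   # N(i + 1, j)
    cos_angle = np.clip(np.dot(n_p, n_q), -1.0, 1.0)
    return float(np.arccos(cos_angle) / (np.pi / 2))  # non-negative RGB vectors: angle <= 90 deg
```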
- Next, the dark part pixel extraction unit 107 calculates the luminance difference s_y between the pixel of interest 201 and the neighboring pixel 202 using Equation (2).
- Here, the luminance of the pixel of interest 201 is y(i, j), and the luminance of the neighboring pixel 202 is y(i + 1, j).
- The luminance difference is normalized to take a value between -1 and 1. It is 0 when the pixel of interest 201 and the neighboring pixel 202 have the same luminance, and it takes its maximum value of 1 when the pixel of interest 201 is black (luminance 0) and the neighboring pixel 202 is white (luminance 1).
- The dark part pixel extraction unit 107 converts RGB values to the luminance y using Equation (3).
- Alternatively, the dark part pixel extraction unit 107 may use the average of the RGB values, (Ir + Ig + Ib) / 3, or the Ig value alone as the luminance of a pixel.
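- A corresponding sketch for the luminance difference s_y is shown below. Equation (3) is likewise not reproduced here, so the BT.601 weighting is an assumption; as noted above, the mean of the RGB values or the G value alone may be substituted.

```python
def luminance(rgb):
    """Luminance y of an RGB pixel with channels in [0, 1].

    Equation (3) is not reproduced in the text; the BT.601 weighting below
    is an assumption. The mean (Ir + Ig + Ib) / 3 or the Ig value alone may
    also be used, as noted above.
    """
    r, g, b = rgb
    return 0.299 * r + 0.587 * g + 0.114 * b

def luminance_difference(target_rgb, neighbor_rgb):
    """s_y = y(neighbor) - y(target), in [-1, 1].

    0 when both pixels have the same luminance; its maximum of 1 is reached
    when the target pixel is black and the neighboring pixel is white.
    """
    return luminance(neighbor_rgb) - luminance(target_rgb)
```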
- The dark part pixel extraction unit 107 performs the dark part pixel extraction processing based on at least one of the color difference and the luminance difference between the pixel of interest 201 and the neighboring pixel 202 calculated as described above.
- FIG. 3 shows an index when the target pixel 201 is extracted as a dark pixel using the color difference and the luminance difference.
- When the luminance difference is negative, that is, when the luminance of the neighboring pixel 202 is lower than that of the pixel of interest 201, the dark part pixel extraction unit 107 determines that the pixel of interest 201 is not a dark part pixel, because in that case the neighboring pixel 202 is more likely to be the dark part.
- When both the color difference and the luminance difference are large, the dark part pixel extraction unit 107 likewise determines that the pixel of interest 201 is not a dark part pixel, because a portion where both the color and the luminance vary greatly is likely to be a contour of the subject.
- Otherwise, that is, when the color difference is smaller than its threshold and the luminance difference is positive and smaller than its threshold, the dark part pixel extraction unit 107 extracts the pixel of interest 201 as a dark part pixel.
- The thresholds for the color difference and the luminance difference can be set, for example, by having the dark part pixel extraction unit 107 analyze in advance portions where dark part pixels and non-dark part pixels of the subject meet, or by calculating them from a user setting input through the input/output unit 106. For example, when the user setting is "strong shading", the dark part pixel extraction unit 107 sets each threshold to a larger value, and when it is "weak shading", it sets each threshold to a smaller value, so that suitable thresholds are obtained.
- The threshold setting method is not limited to the above example, and a known method may be applied.
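- Putting the two measures together, a minimal sketch of the per-neighbor decision rule might look as follows; it reuses the helper functions above, and the default threshold values are placeholders rather than values taken from the patent.

```python
def is_dark_pixel(target_rgb, neighbor_rgb, th_color=0.2, th_lum=0.3):
    """Per-neighbor dark-part-pixel decision (threshold values are placeholders).

    The pixel of interest is extracted as a dark part pixel when the color
    difference is below its threshold and the luminance difference
    (neighbor minus target) is positive but below its threshold.
    """
    s_h = color_difference(target_rgb, neighbor_rgb)
    s_y = luminance_difference(target_rgb, neighbor_rgb)
    if s_y <= 0:
        return False   # the neighbor is darker, so the neighbor is more likely the dark part
    if s_h >= th_color and s_y >= th_lum:
        return False   # both color and luminance change a lot: likely a contour of the subject
    return s_h < th_color and s_y < th_lum
```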
- When the pixel of interest is placed on the right side and the neighboring pixel on its left side, the dark part pixel extraction unit 107 extracts, as dark part pixels, shadows that gradually darken from the left side toward the right side of the image.
- In FIG. 4, the neighboring pixels 402 are the eight pixels located above, below, to the left of, to the right of, and diagonally adjacent to the pixel of interest 401. When there are eight neighboring pixels 402, whether the pixel of interest 401 should be extracted as a dark part pixel is evaluated for each pair consisting of the pixel of interest 401 and one of the neighboring pixels 402.
- Specifically, the dark part pixel extraction unit 107 first sets, for each of the neighboring pixels 402, a pair whose elements are the pixel of interest 401 and that neighboring pixel. For example, when there are eight neighboring pixels 402, denoting the neighboring pixels NP1 to NP8 and the pixel of interest IP, the dark part pixel extraction unit 107 sets eight pairs in total: (NP1, IP), (NP2, IP), ..., (NP8, IP).
- The dark part pixel extraction unit 107 may then determine whether the pixel of interest 401 is a dark part pixel by weighting the determination results obtained for the individual pairs.
- For example, using the precondition that a shadow often darkens from top to bottom, the weighting can give priority to the determinations for the neighboring pixels 402 located above and below the pixel of interest 401. This suppresses erroneous extraction of dark part pixels in many images, which is preferable.
- In this way, the dark part pixel extraction unit 107 weights the per-pair determination results and determines whether the pixel of interest 401 is a dark part pixel.
- Alternatively, the dark part pixel extraction unit 107 may total the weighted color differences and luminance differences calculated for all the neighboring pixels and determine whether the pixel of interest 401 is a dark part pixel by comparing the total values with the thresholds described above.
- Let the luminance difference and the color difference between the pixel of interest IP and the neighboring pixel NP1 be s_y1 and s_h1, those between IP and NP2 be s_y2 and s_h2, ..., and those between IP and NP8 be s_y8 and s_h8. The dark part pixel extraction unit 107 may then calculate the luminance-difference total sy_sum and the color-difference total sh_sum using Equations (4) and (5).
- Here, k indexes the neighboring pixels, α is the weighting coefficient for the luminance difference, and β is the weighting coefficient for the color difference; α and β are set to positive values. If all the weighting coefficients α and β are set to 1, the plain sums of the luminance differences and of the color differences are obtained as sy_sum and sh_sum.
- The dark part pixel extraction can be controlled by setting α and β to different values. For example, when α is set to a value close to 1 and β to a value close to 0, such as 0.01, the dark part pixel determination places more weight on the luminance difference than on the color difference. Even a subject with a complex texture can be regarded as having a single color within a sufficiently small local region, so this setting is effective when the local regions set on the subject are sufficiently small.
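- A sketch of the weighted totals of Equations (4) and (5) over the neighboring pixels is given below; the helper functions from the sketches above are reused, and the default weighting coefficients (all ones) are only one possible choice.

```python
import numpy as np

def weighted_neighbor_sums(target_rgb, neighbor_rgbs, alpha=None, beta=None):
    """Weighted totals sy_sum and sh_sum of Equations (4) and (5).

    alpha, beta: per-neighbor weighting coefficients (positive values) for
    the luminance difference and the color difference. With all weights set
    to 1 the plain sums are obtained; other values are application choices.
    """
    k = len(neighbor_rgbs)
    alpha = np.ones(k) if alpha is None else np.asarray(alpha, dtype=float)
    beta = np.ones(k) if beta is None else np.asarray(beta, dtype=float)
    sy_sum = sum(a * luminance_difference(target_rgb, n)
                 for a, n in zip(alpha, neighbor_rgbs))
    sh_sum = sum(b * color_difference(target_rgb, n)
                 for b, n in zip(beta, neighbor_rgbs))
    return sy_sum, sh_sum
```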
- The above is the dark part pixel extraction processing performed by the dark part pixel extraction unit 107.
- FIG. 5 is a diagram for explaining the relationship between the variance of the luminance difference and the weight.
- The dark part pixel correction unit 108 performs correction so as to reduce the difference in pixel values between the dark part pixels and the other pixels and to suppress a change in the contrast of the image.
- The dark part pixel correction unit 108 applies the dark part pixel correction processing to the dark part pixels extracted by the dark part pixel extraction unit 107. Specifically, when the pixel of interest I(i, j) is a dark part pixel, letting I' denote the image after dark part pixel correction (the corrected image) and using an N × M neighborhood around the dark part pixel as its neighboring pixels, the corrected image I' is calculated using Equations (6) and (7).
- In this way, the dark part pixel correction unit 108 reduces the difference in pixel values between the dark part pixels and the other pixels and suppresses a change in contrast.
- The weight ω_kl is a Gaussian (normal distribution) weighting coefficient. The value of ω_kl is large when the luminance difference between the dark part pixel and the neighboring pixel is small; that is, the dark part pixel correction unit 108 increases the weight because a dark part pixel and a neighboring pixel with a small luminance difference are both likely to belong to the shadow.
- Conversely, the dark part pixel correction unit 108 decreases the weight when the luminance difference between the dark part pixel and the neighboring pixel is large, because such a boundary is likely to be a contour of the subject.
- σ in Equation (7) represents the variance of the luminance difference between the dark part pixel and the neighboring pixel.
- By changing the value of the variance σ, the dark part pixel correction unit 108 can control how large a luminance difference between the pixel of interest and the neighboring pixels is tolerated.
- In other words, by changing the value of the variance σ, the dark part pixel correction unit 108 can control the degree to which the pixel values of non-dark-part pixels propagate into the dark part pixels.
- That is, the dark part pixel correction unit 108 can control the degree of smoothing and thus adjust the degree to which the change in contrast is suppressed.
- In FIG. 5, the horizontal axis 501 represents the luminance difference, and the vertical axis 502 represents the weight ω.
- Increasing the variance σ increases the degree of propagation, as shown by the curve 503, but makes it difficult to distinguish the contour of the subject from non-contour regions and blurs the contour of the subject.
- Decreasing the variance σ weakens the degree of propagation, as shown by the curve 504; blurring of the subject's contour can be prevented, but the degree to which the change in contrast is suppressed is also reduced.
- For this reason, the dark part pixel correction unit 108 may set a small value for the variance σ and perform the dark part pixel correction processing repeatedly.
- For example, the variance σ may be set to 20.
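- The correction of Equations (6) and (7) behaves like the range term of a bilateral filter: each dark part pixel is replaced by a weighted average of its N × M neighborhood, with Gaussian weights on the luminance difference. The sketch below follows that reading; since the equations are not reproduced in this text, details such as the 0-255 luminance scale implied by σ = 20 are assumptions.

```python
import numpy as np

def correct_dark_pixels(image, dark_mask, sigma=20.0, half=2):
    """Gaussian-weighted smoothing of dark part pixels (sketch of Eqs. (6)-(7)).

    image:     float array (H, W, 3) with channels in [0, 1].
    dark_mask: bool array (H, W), True where a dark part pixel was extracted.
    sigma:     spread of the luminance-difference weight; 20 follows the
               example above and assumes a 0-255 luminance scale.
    half:      half-width of the N x M neighborhood.
    """
    lum = 255.0 * (image @ np.array([0.299, 0.587, 0.114]))  # 0-255 luminance
    out = image.copy()
    h, w = dark_mask.shape
    for i, j in zip(*np.nonzero(dark_mask)):
        i0, i1 = max(0, i - half), min(h, i + half + 1)
        j0, j1 = max(0, j - half), min(w, j + half + 1)
        patch = image[i0:i1, j0:j1]
        dlum = lum[i0:i1, j0:j1] - lum[i, j]
        weight = np.exp(-(dlum ** 2) / (2.0 * sigma ** 2))  # large weight for small luminance differences
        out[i, j] = (weight[..., None] * patch).sum(axis=(0, 1)) / weight.sum()
    return out
```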
- The above is the dark part pixel correction processing performed by the dark part pixel correction unit 108.
- FIG. 6 is a diagram for explaining an example in which a shadow that appears when a human face is lit is used as a dark part pixel template.
- The dark part model generation unit 109 generates a dark part model based on the dark part pixels extracted by the dark part pixel extraction unit 107.
- The dark part model is a model composed of the dark part pixels to be applied to the subject.
- For example, the dark part model generation unit 109 corrects the dark part pixels extracted by the dark part pixel extraction unit 107 so that they become bilaterally symmetric, and uses the corrected dark part pixels as the dark part model.
- In this way, an image with improved facial symmetry can be generated. Since a dark part model suited to the subject is generated from the dark part pixels extracted from that subject, the quality of the image generated by the image generation unit 110 can be improved, which is preferable.
- Alternatively, the dark part model generation unit 109 can generate a dark part model by appropriately duplicating and rearranging the dark part pixels extracted by the dark part pixel extraction unit 107.
- When the dark part model generation unit 109 is configured to generate the dark part model based on a dark part pixel template stored in the storage unit 105, dark part models corresponding to various lighting techniques can be generated easily, which is preferable.
- A dark part pixel template is a set of pixels extracted in advance as the shadow produced by a particular lighting technique, such as the split-light template 601, the Rembrandt-light template 602, the butterfly-light template 603, or the loop-light template 604 shown in FIG. 6.
- The dark part model generation unit 109 generates a dark part model S based on such a dark part pixel template T using Equation (8).
- The function f corrects the dark part pixel template T so that it matches the image I' after dark part pixel correction.
- The dark part model generation unit 109 can change the position at which the dark part model S is applied on the image I' and its size.
- Initially, the dark part model S is arranged at the center of the image I' with a size half that of the image.
- The position and size of the dark part model S on the image I' can then be adjusted. For example, when the generated dark part model is larger than the subject, the user reduces the dark part model by issuing an instruction to perform dark part model reduction processing.
- Similarly, the user changes the position of the dark part model by issuing an instruction to perform dark part model moving processing.
- Alternatively, the dark part model generation unit 109 may automatically change the position and size of the dark part model according to the skin-color distribution in the image.
- In that case, the dark part model is applied at an appropriate position without adjustment by the user, which is preferable.
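- A minimal sketch of Equation (8) is shown below: the template T is resized and placed on an image-sized model, initially centered and at half the image size as described above. The nearest-neighbour resampling and the assumption that the placed template fits inside the image are illustrative choices, not details taken from the patent.

```python
import numpy as np

def generate_dark_model(template, image_shape, center=None, scale=0.5):
    """Place a dark part pixel template onto an image-sized dark model (Eq. (8) sketch).

    template:    2-D array with values in [0, 1]; small values mean strong shading.
    image_shape: (H, W) of the corrected image I'.
    center:      placement position; defaults to the image center.
    scale:       template size relative to the image; 0.5 matches the initial
                 "half of the image" placement described above.
    Assumes the placed template lies entirely inside the image.
    """
    h, w = image_shape
    th, tw = int(h * scale), int(w * scale)
    # nearest-neighbour resize of the template to th x tw
    yi = np.arange(th) * template.shape[0] // th
    xi = np.arange(tw) * template.shape[1] // tw
    resized = template[np.ix_(yi, xi)]
    model = np.ones((h, w))                       # 1 everywhere means no shading
    cy, cx = (h // 2, w // 2) if center is None else center
    y0, x0 = cy - th // 2, cx - tw // 2
    model[y0:y0 + th, x0:x0 + tw] = resized       # paste the resized template
    return model
```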
- The above is the dark part model generation processing performed by the dark part model generation unit 109.
- FIG. 7 is a diagram for explaining dark part pixel assignment based on the dark part model.
- The image generation unit 110 uses Equation (9) to generate, from the dark part model S and the corrected image I' (the image in which the dark part pixels have been corrected), an image O to which dark part pixels are added.
- The image O may be called an image to which the dark part model S is applied, or an image to which the dark part model S is added.
- The dark part model S takes a value between 0 and 1 for each pixel. If the value S(i, j) of the dark part model at the (i, j) pixel is 1, the pixel value O(i, j) of the generated image O is equal to the pixel value I'(i, j) of the corrected image I'. If S(i, j) is 0, the pixel value O(i, j) of the generated image O is 0. That is, the closer the value of the dark part model S is to 0, the darker the corresponding pixel of the generated image O becomes, and the closer it is to 1, the closer the pixel of O is to the corrected image I'.
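- Equation (9) then reduces to a per-pixel multiplication of the corrected image by the dark part model, as in the following sketch.

```python
def apply_dark_model(corrected_image, dark_model):
    """Equation (9): per-pixel product of the corrected image I' and the dark model S.

    corrected_image: float array (H, W, 3) in [0, 1].
    dark_model:      float array (H, W) in [0, 1]; 1 keeps the pixel of I',
                     0 drives it to black.
    """
    return corrected_image * dark_model[..., None]
```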
- The image generation unit 110 may also correct the brightness of the generated image in consideration of human color appearance.
- One aspect of human color appearance related to brightness is the color area effect.
- The color area effect is a phenomenon in which the appearance of a color changes depending on the size of the area over which it is presented, even if the color is physically the same. As the presented area increases, dark part pixels appear darker. The image generation unit 110 therefore preferably changes the brightness of the dark part model so as to cancel the color area effect.
- Specifically, it is preferable that the image generation unit 110 change the brightness of the dark part model according to the total number of pixels on the image to which the dark part model is added (in other words, the number of pixels overlapping the dark part model) and generate the image using the changed dark part model.
- The brightness of the dark part model is increased as this pixel count increases and decreased as it decreases. This is preferable because a natural dark part model that takes human color appearance into account can be applied.
- Alternatively, the image generation unit 110 may first apply the dark part model and then change the brightness of the resulting image according to the total number of pixels to which the dark part model is applied (in other words, the number of pixels overlapping the dark part model).
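- As an illustration of the brightness adjustment for the color area effect, the sketch below raises or lowers the model values with the shaded area; the logarithmic form and the gamma and reference_area constants are assumptions, since the patent only specifies the direction of the change.

```python
import numpy as np

def adjust_for_color_area_effect(dark_model, gamma=0.1, reference_area=10000):
    """Brighten or darken the dark model according to the area it shades.

    A larger shaded area is perceived as darker (color area effect), so the
    model values are raised for large areas and lowered for small ones. The
    logarithmic form and the gamma / reference_area constants are assumptions;
    only the direction of the change is specified above.
    """
    area = np.count_nonzero(dark_model < 1.0)              # pixels actually shaded
    factor = 1.0 + gamma * np.log((area + 1) / reference_area)
    return np.clip(dark_model * max(factor, 0.0), 0.0, 1.0)
```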
- The image generation unit 110 may also correct the hue of the generated image in accordance with human color appearance.
- One aspect of human color appearance related to hue is the Bezold-Brücke phenomenon.
- The Bezold-Brücke phenomenon is a phenomenon in which the perceived hue changes depending on brightness. Specifically, when the brightness is lowered, reddish purple and orange shift toward red, and yellow-green and blue-green shift toward green.
- Accordingly, the image generation unit 110 changes the hue of the dark part model according to the brightness of the dark part model and the hue of the pixels on the image to which the dark part model is added.
- The image generation unit 110 determines that the shadow effect is stronger as the brightness of the dark part model decreases, and increases the hue change of the dark part model.
- Conversely, the image generation unit 110 determines that the shadow effect is weaker as the brightness of the dark part model increases, and reduces the change in the hue of the dark part model. In addition, if the hue of the pixels on the image to which the dark part model is added falls between magenta and orange, the image generation unit 110 changes the hue of the dark part model so that red becomes stronger; if it falls between yellow-green and blue-green, the image generation unit 110 changes the hue of the dark part model so that green becomes stronger. When the subject to which the dark part model is added is a human face, the skin hue is likely to fall between magenta and orange, so shifting the hue of the dark part model toward red allows the image generation unit 110 to apply a natural dark part model with little sense of incongruity, which is preferable.
- The processing described above applies a dark part model that takes human color appearance into account, which is preferable because it improves the quality of the generated image.
- Alternatively, the image generation unit 110 may be configured to change the hue of the image after the dark part model is applied, according to the brightness of the dark part model and the hue of the pixels on the image to which the dark part model is applied.
- The above is the image generation processing performed by the image generation unit 110.
- In step S801, the image processing apparatus 101 acquires an image from the imaging unit 103 or the storage unit 105.
- In step S802, the dark part pixel extraction unit 107 extracts dark part pixels from the image acquired in step S801 (corresponding to the dark part pixel extraction processing described above).
- In step S803, the dark part pixel correction unit 108 corrects the dark part pixels extracted in step S802 (corresponding to the dark part pixel correction processing described above).
- In step S804, a dark part pixel template is acquired from the storage unit 105.
- In step S805, the dark part model generation unit 109 generates a dark part model using the dark part pixel template acquired in step S804 (corresponding to the dark part model generation processing described above).
- In step S806, using the dark part model generated in step S805, the image generation unit 110 applies the dark part model to the image after dark part pixel correction, generating an image that looks as if it had been captured with the illumination position changed (corresponding to the image generation processing described above).
- In step S807, the image generation unit 110 outputs the image generated in step S806 to the display unit 104.
- The above is the operation flow of the image processing apparatus 101. The image display apparatus 102 according to the first embodiment operates as described above.
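- The flow S801 to S807 can be tied together with the helper sketches above roughly as follows; the single right-hand neighbor of FIG. 2 is used for the extraction step, which is only one of the neighbor configurations described earlier.

```python
import numpy as np

def process_image(image, template):
    """End-to-end sketch of steps S801 to S807 using the helpers above."""
    h, w = image.shape[:2]
    dark_mask = np.zeros((h, w), dtype=bool)
    for i in range(h):                              # S802: dark part pixel extraction
        for j in range(w - 1):
            dark_mask[i, j] = is_dark_pixel(image[i, j], image[i, j + 1])
    corrected = correct_dark_pixels(image, dark_mask)       # S803: dark part pixel correction
    model = generate_dark_model(template, (h, w))           # S804-S805: dark part model generation
    return apply_dark_model(corrected, model)               # S806: image generation
```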
- With the configuration described above, an image having a suitable stereoscopic effect can be displayed by correcting the dark part pixels that the user does not want and then adding the desired shadows.
- In addition, it is not necessary to image the subject multiple times while changing the illumination position, so the processing can be applied easily to a moving subject.
- In FIG. 9, the same components as those in FIG. 1 are denoted by the same reference numerals, and these elements perform the same processing as in the embodiment of FIG. 1, so their description is omitted.
- The image processing apparatus 901 of the second embodiment further includes a face detection unit 903 that detects face size information of the subject and a neighboring pixel range determination unit 904 that determines, based on the face size information detected by the face detection unit 903, the range of the neighboring pixels used by the dark part pixel extraction unit 107.
- FIG. 10 is a diagram for explaining the size of the detected face.
- The face detection unit 903 detects the size of the subject's face from the image.
- The size of the face means the number of horizontal pixels W1001 and the number of vertical pixels H1002 of the detected face region.
- Known methods for detecting the size of a face from an image include identifying the face region by detecting skin color, and statistically deriving a discriminant function from a large number of learning samples of face images and non-face images to detect face position information and face size information (P. Viola and M. Jones, "Rapid object detection using a boosted cascade of simple features", Proc. IEEE Conf. CVPR, pp. 511-518, 2001); the face detection can be realized using such methods. The size of the face is detected in this way.
<Neighboring pixel range determination processing>
- Next, the operation of the neighboring pixel range determination unit 904 will be described in detail. This unit determines the size of the neighboring pixel range used for dark part pixel extraction.
- As the subject's face becomes larger, the shadows produced also become larger, so the neighboring pixel range determination unit 904 enlarges the neighboring pixel range. As the subject's face becomes smaller, the shadows produced also become smaller, so the neighboring pixel range determination unit 904 reduces the neighboring pixel range.
- When the neighboring pixel range is N × M, enlarging it means expanding it to kN × kM, and reducing it means shrinking it to N/k × M/k (where k > 1; values such as k = 2 or 3 are suitable). A sketch of this scaling is given below.
- The neighboring pixel range determination unit 904 may also change the shape of the neighboring pixel range according to the size of the subject's face. For example, when the width and height of the subject's face are approximately equal, the face is judged to be round and a circular neighboring pixel range is set.
- The shape of the neighboring pixel range is not limited to a square, a rectangle, or a circle; other combinations of pixels may be used, such as a cross shape formed by the four neighbors above, below, to the left of, and to the right of the pixel of interest, or a shape formed by only the four diagonal neighbors.
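- The scaling of the neighboring pixel range with the detected face size can be sketched as follows; the base range and the reference face width are illustrative values, since the text only specifies the kN × kM and N/k × M/k relationship.

```python
def neighbor_range_from_face(face_w, face_h, base=(5, 5), reference=200):
    """Scale the N x M neighboring pixel range with the detected face size.

    A larger face casts larger shadows, so the range grows with the face;
    a smaller face shrinks it. The linear scaling against a reference face
    width of 200 pixels and the 5 x 5 base range are illustrative values.
    """
    n, m = base
    k = max(face_w, face_h) / reference      # k > 1 enlarges, k < 1 reduces
    return max(1, round(n * k)), max(1, round(m * k))
```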
- In step S1101, the image processing apparatus 901 acquires an image from the imaging unit 103 or the storage unit 105.
- In step S1102, the face detection unit 903 detects the size of the face from the image acquired in step S1101 (corresponding to the face detection processing described above).
- In step S1103, the neighboring pixel range determination unit 904 determines the neighboring pixel range based on the face size detected in step S1102 (corresponding to the neighboring pixel range determination processing described above).
- In step S1104, the dark part pixel extraction unit 107 extracts dark part pixels using the image acquired in step S1101 and the neighboring pixel range determined in step S1103.
- In step S1105, the dark part pixel correction unit 108 corrects the dark part pixels extracted in step S1104.
- In step S1106, a dark part pixel template is acquired from the storage unit 105.
- In step S1107, the dark part model generation unit 109 generates a dark part model using the dark part pixel template acquired in step S1106.
- In step S1108, using the dark part model generated in step S1107, the image generation unit 110 applies the dark part model to the image after dark part pixel correction, generating an image that looks as if it had been captured with the illumination position changed.
- Finally, the image generation unit 110 outputs the image generated in step S1108 to the display unit 104.
- The above is the operation flow of the image processing apparatus 901. The image display apparatus 902 of the second embodiment operates as described above.
- With the configuration described above, the range of the dark part pixels to be extracted is controlled based on the size of the subject, so that unnecessary dark part pixels can be corrected while preferred dark part pixels are left intact.
- Desired dark part pixels can then be added to the corrected image, and an image having a suitable stereoscopic effect can be displayed.
- The program that operates in the image processing apparatus according to the present invention may be a program that controls a CPU or the like (a program that causes a computer to function) so as to realize the functions of the above-described embodiments.
- Information handled by these devices is temporarily stored in a RAM (Random Access Memory) during processing, and is then stored in storage such as a ROM (Read Only Memory) or an HDD, from which it is read, corrected, and rewritten as necessary.
- The "computer system" here includes an OS (Operating System) and hardware such as peripheral devices.
- The "computer-readable recording medium" refers to a portable medium such as a flexible disk, a magneto-optical disk, a ROM, or a CD-ROM, or a storage device such as a hard disk incorporated in a computer system.
- The "computer-readable recording medium" also includes media that hold the program dynamically for a short time, such as a communication line used when the program is transmitted via a network such as the Internet or a communication line such as a telephone line, and media that hold the program for a certain period, such as volatile memory inside a computer system serving as a server or a client in that case.
- Part or all of the image processing apparatus in the above-described embodiments may be realized as an LSI, which is typically an integrated circuit.
- The functional blocks of the image processing apparatus may each be individually implemented as a chip, or some or all of them may be integrated into a single chip.
- The method of circuit integration is not limited to LSI, and the functions may be realized by a dedicated circuit or a general-purpose processor.
- If integrated circuit technology that replaces LSI emerges with advances in semiconductor technology, an integrated circuit based on that technology can also be used.
- The control lines and information lines shown are those considered necessary for the description and do not necessarily represent all the control lines and information lines of a product. In practice, all the components may be considered to be connected to each other.
- An image processing apparatus according to Aspect 1 of the present invention includes: a dark part pixel extraction unit 107 that extracts one or more dark part pixels from a target image; a dark part pixel correction unit 108 that generates a corrected image by correcting the one or more dark part pixels so as to suppress a change in the contrast of the target image; a dark part model generation unit 109 that generates a dark part model based on the dark part pixels; and an image generation unit 110 that applies, to the corrected image, the dark part pixels constituting the dark part model.
- According to the above configuration, the dark part pixels constituting the dark part model are added to the corrected image, which is generated by correcting the one or more dark part pixels so as to suppress the change in the contrast of the target image.
- An image having a suitable stereoscopic effect can therefore be generated. Furthermore, the above configuration can be applied easily even when the subject moves.
- It is preferable that the dark part pixel extraction unit 107 extract a pixel of interest in the target image as a dark part pixel according to at least one of a color difference and a luminance difference between the pixel of interest and a neighboring pixel located in the vicinity of the pixel of interest.
- It is also preferable that the dark part pixel extraction unit 107 extract the pixel of interest as a dark part pixel when the color difference between the pixel of interest and the neighboring pixel is smaller than a threshold relating to the color difference, and the luminance difference obtained by subtracting the luminance of the pixel of interest from the luminance of the neighboring pixel is a positive value smaller than a threshold relating to the luminance difference.
- It is also preferable that the image generation unit 110 change the brightness of the dark part model, or the brightness of the image after the dark part model is applied, according to the total number of pixels on the corrected image to which the dark part model is applied.
- It is also preferable that the image generation unit 110 change the hue of the dark part model, or the hue of the image after the dark part model is applied, according to the brightness of the dark part model and the hue of the pixels on the corrected image to which the dark part model is added.
- The image processing apparatus may further include a face detection unit 903 that detects face size information of the subject from the image, and a neighboring pixel range determination unit 904 that determines, from the face size information, the range that the neighboring pixels used by the dark part pixel extraction unit 107 can take; the dark part pixel extraction unit 107 preferably uses the pixels within the determined range as the neighboring pixels.
- According to the above configurations, the image quality of the generated image can be improved.
- An image processing method according to the present invention includes: a dark part pixel extraction step of extracting one or more dark part pixels from a target image; a dark part pixel correction step of generating a corrected image by correcting the one or more dark part pixels so as to suppress a change in the contrast of the target image; a dark part model generation step of generating a dark part model based on the dark part pixels; and an image generation step of applying, to the corrected image, the dark part pixels constituting the dark part model.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Physics & Mathematics (AREA)
- Signal Processing (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Health & Medical Sciences (AREA)
- Oral & Maxillofacial Surgery (AREA)
- Electromagnetism (AREA)
- Software Systems (AREA)
- Nonlinear Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- General Health & Medical Sciences (AREA)
- Human Computer Interaction (AREA)
- Image Processing (AREA)
Abstract
Description
Hereinafter, Embodiment 1 of the present invention will be described with reference to the accompanying drawings. The accompanying drawings show specific embodiments in accordance with the principles of the present invention, but they are provided for understanding the invention and should never be used to interpret the invention in a limiting way. The configurations in the drawings are also exaggerated for ease of understanding and differ from the actual spacings and sizes.
<Dark part pixel extraction processing>
The dark part pixel extraction unit 107 extracts one or more dark part pixels from the image acquired by the image processing apparatus 101 from the imaging unit 103, the storage unit 105, or the like described above. Here, a dark part pixel is a pixel that forms a shadow appearing on the subject in the acquired image, i.e., a portion where there is little change in luminance or color. For example, the shadows on a human face include blemishes caused by the unevenness of the face, wrinkles such as nasolabial folds, the shadow of the nose, and shadows that appear on the forehead, cheeks, and so on where light from the light source is blocked. Details of the dark part pixel extraction processing will be described later.
<Dark part pixel correction processing>
The dark part pixel correction unit 108 generates a corrected image by correcting at least one of the one or more dark part pixels extracted by the dark part pixel extraction unit 107 so as to suppress a change in the contrast of the image. Details of the dark part pixel correction processing will be described later.
<Dark part model generation processing>
The dark part model generation unit 109 generates a dark part model corresponding to the subject based on the dark part pixels extracted by the dark part pixel extraction unit 107. Details of the dark part model generation processing and the dark part model will be described later.
<Image generation processing>
Based on the image (corrected image) corrected by the dark part pixel correction unit 108 and the dark part model generated by the dark part model generation unit 109, the image generation unit 110 generates an image in which dark part pixels are added to the corrected image. Details of the image generation processing will be described later.
Next, the configuration of an image display apparatus 902 including an image processing apparatus 901 according to a second embodiment of the present invention will be described with reference to FIG. 9. In FIG. 9, the same components as those in FIG. 1 are denoted by the same reference numerals, and since these elements perform the same processing as in the embodiment of FIG. 1, their description is omitted.
<Face detection processing>
The face detection unit 903 detects the size of the subject's face from the image. Here, the size of the face means the number of horizontal pixels W1001 and the number of vertical pixels H1002 of the detected face region. Known methods for detecting the size of a face from an image include identifying the face region by detecting skin color, and statistically deriving a discriminant function from a large number of learning samples of face images and non-face images to detect face position information and face size information (P. Viola and M. Jones, "Rapid object detection using a boosted cascade of simple features", Proc. IEEE Conf. CVPR, pp. 511-518, 2001); the detection can be realized using such methods. The size of the face is detected as described above.
<Neighboring pixel range determination processing>
Next, the operation of the neighboring pixel range determination unit 904 will be described in detail. This unit determines the size of the neighboring pixel range used for dark part pixel extraction. As the subject's face becomes larger, the shadows produced also become larger, so the neighboring pixel range determination unit 904 enlarges the neighboring pixel range. As the subject's face becomes smaller, the shadows produced also become smaller, so the neighboring pixel range determination unit 904 reduces the neighboring pixel range. When the neighboring pixel range is N × M, enlarging it means expanding it to kN × kM, and reducing it means shrinking it to N/k × M/k (where k > 1; values such as k = 2 or 3 are suitable). This prevents two kinds of failure: failing to extract the dark part pixels that form a large shadow because a small neighboring pixel range was used even though the subject's face is large, and extracting, as dark part pixels, fine shading that gives a favorable impression, such as eyebrow and eyeliner makeup and shading near the facial contour, because a large neighboring pixel range was used even though the subject's face is small. In this way, with the above configuration, only the dark part pixels deemed unnecessary can be corrected while the preferred dark part pixels are left intact.
[Summary]
An image processing apparatus according to Aspect 1 of the present invention includes: a dark part pixel extraction unit 107 that extracts one or more dark part pixels from a target image; a dark part pixel correction unit 108 that generates a corrected image by correcting the one or more dark part pixels so as to suppress a change in the contrast of the target image; a dark part model generation unit 109 that generates a dark part model based on the dark part pixels; and an image generation unit 110 that applies, to the corrected image, the dark part pixels constituting the dark part model.
102 Image display apparatus
103 Imaging unit
104 Display unit
105 Storage unit
106 Input/output unit
107 Dark part pixel extraction unit
108 Dark part pixel correction unit
109 Dark part model generation unit
110 Image generation unit
903 Face detection unit
904 Neighboring pixel range determination unit
Claims (7)
- An image processing apparatus comprising: a dark part pixel extraction unit that extracts one or more dark part pixels from a target image; a dark part pixel correction unit that generates a corrected image by correcting the one or more dark part pixels so as to suppress a change in the contrast of the target image; a dark part model generation unit that generates a dark part model based on the dark part pixels; and an image generation unit that applies, to the corrected image, the dark part pixels constituting the dark part model.
- The image processing apparatus according to claim 1, wherein the dark part pixel extraction unit extracts a pixel of interest in the target image as a dark part pixel according to at least one of a color difference and a luminance difference between the pixel of interest and a neighboring pixel located in the vicinity of the pixel of interest.
- The image processing apparatus according to claim 2, wherein the dark part pixel extraction unit extracts the pixel of interest as a dark part pixel when the color difference between the pixel of interest and the neighboring pixel is smaller than a threshold relating to the color difference, and the luminance difference obtained by subtracting the luminance of the pixel of interest from the luminance of the neighboring pixel is a positive value smaller than a threshold relating to the luminance difference.
- The image processing apparatus according to any one of claims 1 to 3, wherein the image generation unit changes the brightness of the dark part model, or the brightness of the image after the dark part model is applied, according to the total number of pixels on the corrected image to which the dark part model is applied.
- The image processing apparatus according to any one of claims 1 to 4, wherein the image generation unit changes the hue of the dark part model, or the hue of the image after the dark part model is applied, according to the brightness of the dark part model and the hue of the pixels on the corrected image to which the dark part model is added.
- The image processing apparatus according to any one of claims 1 to 5, further comprising: a face detection unit that detects face size information of a subject from an image; and a neighboring pixel range determination unit that determines, from the face size information, the range that the neighboring pixels in the dark part pixel extraction unit can take, wherein the dark part pixel extraction unit uses pixels within the determined range as the neighboring pixels.
- An image processing method comprising: a dark part pixel extraction step of extracting one or more dark part pixels from a target image; a dark part pixel correction step of generating a corrected image by correcting the one or more dark part pixels so as to suppress a change in the contrast of the target image; a dark part model generation step of generating a dark part model based on the dark part pixels; and an image generation step of applying, to the corrected image, the dark part pixels constituting the dark part model.
Priority Applications (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2017509480A JP6376673B2 (ja) | 2015-03-30 | 2016-03-09 | 画像処理装置 |
| US15/563,515 US10567670B2 (en) | 2015-03-30 | 2016-03-09 | Image-processing device |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2015070256 | 2015-03-30 | ||
| JP2015-070256 | 2015-03-30 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2016158274A1 true WO2016158274A1 (ja) | 2016-10-06 |
Family
ID=57005606
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2016/057445 Ceased WO2016158274A1 (ja) | 2015-03-30 | 2016-03-09 | 画像処理装置 |
Country Status (3)
| Country | Link |
|---|---|
| US (1) | US10567670B2 (ja) |
| JP (1) | JP6376673B2 (ja) |
| WO (1) | WO2016158274A1 (ja) |
Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2005202807A (ja) * | 2004-01-19 | 2005-07-28 | Hitachi Software Eng Co Ltd | 陰影部輝度を補正可能なプログラム及び陰影部輝度補正方法 |
| JP2008004085A (ja) * | 2006-05-23 | 2008-01-10 | Matsushita Electric Ind Co Ltd | 画像処理装置、画像処理方法、プログラム、記録媒体および集積回路 |
| JP2010211300A (ja) * | 2009-03-06 | 2010-09-24 | Toshiba Corp | 画像処理装置および方法 |
| JP2011078041A (ja) * | 2009-10-02 | 2011-04-14 | Sanyo Electric Co Ltd | 画像処理装置および電子カメラ |
| WO2012001949A1 (ja) * | 2010-06-30 | 2012-01-05 | 日本電気株式会社 | カラー画像処理方法、カラー画像処理装置およびカラー画像処理プログラム |
Family Cites Families (12)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP3505115B2 (ja) * | 1999-04-28 | 2004-03-08 | 富士通株式会社 | 画像加工装置及びプログラム記録媒体 |
| EP2280376B1 (en) * | 2002-02-12 | 2015-10-28 | Panasonic Intellectual Property Corporation of America | Image processing apparatus and image processing method |
| JP4060625B2 (ja) * | 2002-04-02 | 2008-03-12 | 東芝テック株式会社 | 画像形成装置と画像形成方法 |
| JP2005202469A (ja) * | 2004-01-13 | 2005-07-28 | Fuji Xerox Co Ltd | 画像処理装置、画像処理方法、およびプログラム |
| JP2005348237A (ja) * | 2004-06-04 | 2005-12-15 | Brother Ind Ltd | 画像濃度調整装置及びこの装置を備えた画像読取装置 |
| JP4448051B2 (ja) * | 2005-04-19 | 2010-04-07 | キヤノン株式会社 | 画像読取装置及び方法 |
| US7869649B2 (en) * | 2006-05-08 | 2011-01-11 | Panasonic Corporation | Image processing device, image processing method, program, storage medium and integrated circuit |
| US8164594B2 (en) * | 2006-05-23 | 2012-04-24 | Panasonic Corporation | Image processing device, image processing method, program, storage medium and integrated circuit |
| US8009903B2 (en) * | 2006-06-29 | 2011-08-30 | Panasonic Corporation | Image processor, image processing method, storage medium, and integrated circuit that can adjust a degree of depth feeling of a displayed high-quality image |
| JP2009098925A (ja) * | 2007-10-17 | 2009-05-07 | Sony Corp | 画像処理装置、画像処理方法、および、プログラム |
| CN102985943A (zh) * | 2010-06-30 | 2013-03-20 | 日本电气株式会社 | 彩色图像处理方法、彩色图像处理装置以及彩色图像处理程序 |
| JP5548954B2 (ja) | 2010-12-27 | 2014-07-16 | カシオ計算機株式会社 | 撮像装置、撮像方法及びプログラム |
-
2016
- 2016-03-09 US US15/563,515 patent/US10567670B2/en not_active Expired - Fee Related
- 2016-03-09 WO PCT/JP2016/057445 patent/WO2016158274A1/ja not_active Ceased
- 2016-03-09 JP JP2017509480A patent/JP6376673B2/ja not_active Expired - Fee Related
Patent Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2005202807A (ja) * | 2004-01-19 | 2005-07-28 | Hitachi Software Eng Co Ltd | 陰影部輝度を補正可能なプログラム及び陰影部輝度補正方法 |
| JP2008004085A (ja) * | 2006-05-23 | 2008-01-10 | Matsushita Electric Ind Co Ltd | 画像処理装置、画像処理方法、プログラム、記録媒体および集積回路 |
| JP2010211300A (ja) * | 2009-03-06 | 2010-09-24 | Toshiba Corp | 画像処理装置および方法 |
| JP2011078041A (ja) * | 2009-10-02 | 2011-04-14 | Sanyo Electric Co Ltd | 画像処理装置および電子カメラ |
| WO2012001949A1 (ja) * | 2010-06-30 | 2012-01-05 | 日本電気株式会社 | カラー画像処理方法、カラー画像処理装置およびカラー画像処理プログラム |
Non-Patent Citations (1)
| Title |
|---|
| MASASHI BABA ET AL.: "In'eido ni Motozuku Iro Hosei ni yoru Jissha Gazo karano In'ei Jokyo", VISUAL COMPUTING GRAPHICS TO CAD GODO SYMPOSIUM 2003 YOKOSHU, 19 June 2003 (2003-06-19), pages 37 - 42 * |
Also Published As
| Publication number | Publication date |
|---|---|
| US20180097982A1 (en) | 2018-04-05 |
| US10567670B2 (en) | 2020-02-18 |
| JPWO2016158274A1 (ja) | 2017-12-28 |
| JP6376673B2 (ja) | 2018-08-22 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US11250241B2 (en) | Face image processing methods and apparatuses, and electronic devices | |
| CN109639982B (zh) | 一种图像降噪方法、装置、存储介质及终端 | |
| KR101446975B1 (ko) | 얼굴 검출 기능을 사용한 얼굴 및 피부의 자동 미화 | |
| WO2022161009A1 (zh) | 图像处理方法及装置、存储介质、终端 | |
| JP4461789B2 (ja) | 画像処理装置 | |
| JP5463866B2 (ja) | 画像処理装置および画像処理方法、並びにプログラム | |
| KR101590868B1 (ko) | 피부색을 보정하는 영상 처리 방법, 장치, 디지털 촬영 장치, 및 컴퓨터 판독가능 저장매체 | |
| US9135726B2 (en) | Image generation apparatus, image generation method, and recording medium | |
| CN111127591B (zh) | 图像染发处理方法、装置、终端和存储介质 | |
| WO2018176925A1 (zh) | Hdr图像的生成方法及装置 | |
| WO2009102514A1 (en) | Adjusting color attribute of an image in a non-uniform way | |
| CN114663950B (zh) | 低照度的人脸检测方法、装置、计算机设备及存储介质 | |
| CN113723385A (zh) | 视频处理方法及装置、神经网络的训练方法及装置 | |
| US20170154437A1 (en) | Image processing apparatus for performing smoothing on human face area | |
| CN112686800B (zh) | 图像处理方法、装置、电子设备及存储介质 | |
| WO2022052862A1 (zh) | 图像的边缘增强处理方法及应用 | |
| CN114862729A (zh) | 图像处理方法、装置、计算机设备和存储介质 | |
| CN107705279A (zh) | 实现双重曝光的图像数据实时处理方法及装置、计算设备 | |
| JP2009251634A (ja) | 画像処理装置、画像処理方法、及びプログラム | |
| JP6376673B2 (ja) | 画像処理装置 | |
| JP2007241424A (ja) | 画像処理装置および画像処理方法 | |
| CN113421197A (zh) | 一种美颜图像的处理方法及其处理系统 | |
| JP6627530B2 (ja) | 画像処理装置及びプログラム | |
| US9563940B2 (en) | Smart image enhancements | |
| JP6603722B2 (ja) | 画像処理装置およびプログラム |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 16772158 Country of ref document: EP Kind code of ref document: A1 |
|
| ENP | Entry into the national phase |
Ref document number: 2017509480 Country of ref document: JP Kind code of ref document: A |
|
| WWE | Wipo information: entry into national phase |
Ref document number: 15563515 Country of ref document: US |
|
| NENP | Non-entry into the national phase |
Ref country code: DE |
|
| 122 | Ep: pct application non-entry in european phase |
Ref document number: 16772158 Country of ref document: EP Kind code of ref document: A1 |