CN113052754B - Method and device for blurring picture background - Google Patents
Method and device for blurring picture background
- Publication number
- CN113052754B CN113052754B CN201911369473.8A CN201911369473A CN113052754B CN 113052754 B CN113052754 B CN 113052754B CN 201911369473 A CN201911369473 A CN 201911369473A CN 113052754 B CN113052754 B CN 113052754B
- Authority
- CN
- China
- Prior art keywords
- background
- light spot
- blurring
- image
- target
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T3/00—Geometric image transformations in the plane of the image
- G06T3/04—Context-preserving transformations, e.g. by using an importance map
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/50—Image enhancement or restoration using two or more images, e.g. averaging or subtraction
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/13—Edge detection
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Image Processing (AREA)
Abstract
The application belongs to the technical field of image processing and provides a method for blurring a picture background, comprising the following steps: acquiring an original picture and the depth image information corresponding to the original picture, and extracting a background image corresponding to the original picture based on the depth image information; blurring the background image in the original picture based on the depth image information to obtain a background blurring image; determining, based on the background image, the light spot parameters of the light spots that meet a preset condition; generating a light spot image based on the light spot parameters; and fusing the background blurring image and the light spot image to obtain the target background blurring image corresponding to the original picture. In this scheme, the brightness value, position, and size of each circular light spot that meets the conditions are obtained by bright-spot detection, a light spot image is generated from these spots, and the spots are superimposed on the background blurring image by fusion. The light spot special effect is thus added to the blurred background, which improves the visual effect of the background blurring and reduces both memory occupation and algorithm time overhead.
Description
Technical Field
The present application belongs to the field of image processing technologies, and in particular, to a method and an apparatus for blurring a picture background.
Background
Background blurring has become a standard feature of the photographing function of mobile phones, and a background-blurred picture with a light spot special effect has stronger aesthetic appeal and expressive force.
At present, the existing implementation of the background-blurring light spot special effect performs pixel brightness boosting, blur filtering, and brightness shrinking: boosting increases the brightness of each bright point so that, after filtering, a light spot effect forms around it, and shrinking then pulls the boosted values back into the normal range.
However, in this conventional background blurring method, the boosted brightness of a pixel affects the values of its neighbouring pixels during the filtering operation, so the shrinking operation leaves large colour differences in the background. This colour-difference effect spoils the aesthetics of the background blurring and degrades the visual effect; moreover, the brightness boost doubles the memory occupied by the pixel values, and the boost and shrink operations both increase the computation time.
Disclosure of Invention
The embodiments of the application provide a method and a device for blurring a picture background, which can solve the problems of the existing blurring method: the colour-difference effect destroys the aesthetics of the background blurring and affects the visual effect, and the pixel-by-pixel brightness boosting and shrinking operations increase the computation time.
In a first aspect, an embodiment of the present application provides a method for blurring a background of an image, including:
acquiring an original picture and depth image information corresponding to the original picture, and extracting a background image corresponding to the original picture based on the depth image information;
blurring the background image in the original image based on the depth image information to obtain a background blurring image;
determining light spot parameters corresponding to light spots meeting first preset conditions based on the background image;
generating a spot map based on the spot parameters;
and carrying out fusion processing on the background blurring graph and the light spot graph to obtain a target background blurring graph corresponding to the original picture.
Further, the depth image information includes a depth value of each pixel point;
the blurring the background image in the original picture based on the depth image information to obtain a background blurring image, including:
and performing background blurring processing on the original picture based on the depth value of each pixel point to obtain a background blurring picture of the original picture.
Further, the determining, based on the background image, of a light spot parameter corresponding to a light spot meeting a first preset condition includes:
carrying out binarization processing on the background image based on a preset brightness threshold to obtain a binarized background image corresponding to the background image;
searching, in the binarized background image, for a target light spot profile of a target light spot meeting the first preset condition;
and determining the spot parameters of the target spot based on the information of the target spot profile.
Further, the shape of the target light spot profile is circular;
the determining of the spot parameters of the target spot based on the information of the target spot profile comprises:
acquiring radius information, a central point and central point information of a target circumscribed circle corresponding to the target light spot profile under the condition that the number of pixel points in the target light spot profile is greater than a preset number threshold;
calculating the standard deviation of the distances between all points on the target light spot profile and the central point;
screening out a first standard deviation meeting a third preset condition from the standard deviations under the condition that the radius information meets a second preset condition;
generating a light spot parameter of the target light spot based on the radius information and the target central point information; the target center point information is center point information of a center point corresponding to the first standard deviation.
Further, the generating a spot map based on the spot parameters includes:
determining generation parameters for generating a spot map based on the spot parameters; the generation parameters comprise pixel values and side length information corresponding to the light spot pattern;
and generating the light spot graph based on the pixel values and the side length information.
Further, the fusing the background blurring image and the spot image to obtain a target background blurring image corresponding to the original image includes:
determining a fusion coefficient based on the pixel values;
and carrying out fusion processing on the background blurring graph and the light spot graph based on the fusion coefficient to obtain a target background blurring graph corresponding to the original picture.
Further, the acquiring an original picture and depth image information corresponding to the original picture, and extracting a background image corresponding to the original picture based on the depth image information includes:
acquiring an original picture and depth image information corresponding to the original picture;
and carrying out binarization processing on the depth image information to obtain a background image corresponding to the original image.
In a second aspect, an embodiment of the present application provides an apparatus for blurring a background of an image, including:
the device comprises an acquisition unit, a processing unit and a processing unit, wherein the acquisition unit is used for acquiring an original picture and depth image information corresponding to the original picture and extracting a background image corresponding to the original picture based on the depth image information;
the first processing unit is used for blurring the background image in the original image based on the depth image information to obtain a background blurring image;
the first determining unit is used for determining light spot parameters corresponding to light spots meeting a first preset condition based on the background image;
the generating unit is used for generating a spot diagram based on the spot parameters;
and the second processing unit is used for carrying out fusion processing on the background blurring graph and the light spot graph to obtain a target background blurring graph corresponding to the original picture.
Further, the depth image information includes a depth value of each pixel point, and the first processing unit is specifically configured to:
and performing background blurring processing on the original picture based on the depth value of each pixel point to obtain a background blurring picture of the original picture.
Further, the first determining unit includes:
the third processing unit, used for carrying out binarization processing on the background image based on a preset brightness threshold to obtain a binarized background image corresponding to the background image;
the searching unit, used for searching, in the binarized background image, for a target light spot profile of a target light spot meeting the first preset condition;
and the second determining unit, used for determining the spot parameters of the target spot based on the information of the target spot profile.
Further, the shape of the target light spot profile is circular;
the second determining unit is specifically configured to:
acquiring radius information, a central point and central point information of a target circumscribed circle corresponding to the target light spot profile under the condition that the number of pixel points in the target light spot profile is greater than a preset number threshold;
calculating the standard deviation of the distances between all points on the target light spot profile and the central point;
screening out a first standard deviation meeting a third preset condition from the standard deviations under the condition that the radius information meets a second preset condition;
generating a light spot parameter of the target light spot based on the radius information and the target central point information; the target center point information is center point information of a center point corresponding to the first standard deviation.
Further, the generating unit is specifically configured to:
determining generation parameters for generating a spot map based on the spot parameters; the generation parameters comprise pixel values and side length information corresponding to the light spot diagram;
and generating the light spot graph based on the pixel values and the side length information.
Further, the second processing unit is specifically configured to:
determining a fusion coefficient based on the pixel values;
and carrying out fusion processing on the background blurring graph and the light spot graph based on the fusion coefficient to obtain a target background blurring graph corresponding to the original picture.
Further, the obtaining unit is specifically configured to:
acquiring an original picture and depth image information corresponding to the original picture;
and carrying out binarization processing on the depth image information to obtain a background image corresponding to the original image.
In a third aspect, an embodiment of the present application provides a terminal device, which includes a memory, a processor, and a computer program stored in the memory and executable on the processor, where the processor, when executing the computer program, implements the method for blurring the background of a picture as described in the first aspect.
In a fourth aspect, the present application provides a computer-readable storage medium, which stores a computer program, and when the computer program is executed by a processor, the method for blurring a picture background as described in the first aspect is implemented.
In a fifth aspect, the present application provides a computer program product, which, when run on a terminal device, causes the terminal device to execute the method for blurring a background of a picture as described in the first aspect.
It is understood that, for the beneficial effects of the second aspect to the fifth aspect, reference can be made to the related description of the first aspect, which is not repeated here.
In the embodiment of the application, an original picture and the depth image information corresponding to the original picture are obtained, and a background image corresponding to the original picture is extracted based on the depth image information; the background image in the original picture is blurred based on the depth image information to obtain a background blurring image; light spot parameters corresponding to light spots meeting the preset condition are determined based on the background image; a light spot image is generated based on the light spot parameters; and the background blurring image and the light spot image are fused to obtain the target background blurring image corresponding to the original picture. In this scheme, the brightness value, position, and size of each qualifying circular light spot are obtained by bright-spot detection, a light spot image is generated from them, and the spots are superimposed on the background blurring image by fusion, so that the light spot special effect is added to the blurred background, the visual effect of the background blurring is improved, and memory occupation and algorithm time overhead are reduced.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. The drawings in the following description are obviously only some embodiments of the present application; for those skilled in the art, other drawings can be obtained from them without inventive effort.
Fig. 1 is a schematic flowchart illustrating a method for blurring a background of a picture according to a first embodiment of the present application;
fig. 2 is a schematic flowchart of a refinement of S101 in a method for blurring a background of an image according to a first embodiment of the present application;
fig. 3 is a schematic diagram of a background blurring diagram in a method for blurring a background of an image according to a first embodiment of the present application;
fig. 4 is a schematic diagram of a speckle pattern in a method for blurring a picture background according to a first embodiment of the present application;
fig. 5 is a schematic diagram of a target background blurring diagram corresponding to an original picture in a method for blurring a picture background according to a first embodiment of the present application;
fig. 6 is a flowchart illustrating another method for blurring a background of a picture according to a second embodiment of the present application;
fig. 7 is a schematic flowchart of a refinement at S205 in another method for blurring a background of a picture according to a second embodiment of the present application;
fig. 8 is a flowchart illustrating another method for blurring a background of a picture according to a third embodiment of the present application;
fig. 9 is a flowchart illustrating another method for blurring a background of a picture according to a fourth embodiment of the present application;
fig. 10 is a schematic diagram of an apparatus for blurring a background of a picture according to a fifth embodiment of the present application;
fig. 11 is a schematic diagram of an apparatus for blurring a background of a picture according to a sixth embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
It will be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It should also be understood that the term "and/or" as used in this specification and the appended claims refers to and includes any and all possible combinations of one or more of the associated listed items.
As used in this specification and the appended claims, the term "if" may be interpreted contextually as "when," "upon," "in response to determining," or "in response to detecting." Similarly, the phrase "if it is determined" or "if a [described condition or event] is detected" may be interpreted contextually to mean "upon determining," "in response to determining," "upon detecting [the described condition or event]," or "in response to detecting [the described condition or event]."
Furthermore, in the description of the present application and the appended claims, the terms "first," "second," "third," and the like are used for distinguishing between descriptions and not necessarily for describing or implying relative importance.
Reference throughout this specification to "one embodiment" or "some embodiments," or the like, means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the present application. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," or the like, in various places throughout this specification are not necessarily all referring to the same embodiment, but rather "one or more but not all embodiments" unless specifically stated otherwise. The terms "comprising," "including," "having," and variations thereof mean "including, but not limited to," unless expressly specified otherwise.
Referring to fig. 1, fig. 1 is a schematic flow chart of a method for blurring a background of a picture according to a first embodiment of the present application. The execution body of the method in this embodiment is a device having the function of blurring a picture background, which may specifically be a mobile terminal, a computer, a server, or a similar device. The method for blurring the background of the picture, as shown in fig. 1, may include:
S101: acquiring an original picture and depth image information corresponding to the original picture, and extracting a background image corresponding to the original picture based on the depth image information.
The device acquires an original picture and the depth image information corresponding to it; the original picture is an unprocessed image collected by an image acquisition device, and when the device itself has an image capture function, the original picture may be taken by the device. After the original picture is obtained, the corresponding depth image information is acquired. A depth image, also known as a range image, is an image whose pixel values are the distances (depths) from the image sensor to the points in the scene, so it directly reflects the geometry of the scene's visible surfaces. A depth image can be converted into point cloud data by coordinate transformation, and regular point cloud data carrying the necessary information can in turn be converted back into depth image data. In the image frames provided by a depth data stream, each pixel represents the distance, in millimetres, from the camera plane to the closest object at that particular (x, y) coordinate in the depth sensor's field of view. Current depth image acquisition methods include laser radar depth imaging, computer stereoscopic vision imaging, the coordinate measuring machine method, the moire fringe method, the structured light method, and so on; this embodiment does not limit how the depth image information is acquired.
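The depth-image-to-point-cloud conversion mentioned above can be sketched with standard pinhole back-projection. This is a minimal illustration, not part of the patented method, and the camera intrinsics (fx, fy, cx, cy) below are assumed example values:

```python
def depth_to_points(depth, fx, fy, cx, cy):
    """Back-project a depth image into 3-D points via the pinhole model.

    depth: 2-D list of millimetre distances; 0 means "no reading".
    fx, fy: focal lengths in pixels; cx, cy: principal point.
    Returns a list of (x, y, z) points in the camera frame.
    """
    points = []
    for v, row in enumerate(depth):
        for u, z in enumerate(row):
            if z > 0:  # skip pixels with no depth reading
                points.append(((u - cx) * z / fx, (v - cy) * z / fy, z))
    return points

# A 1x2 depth map: the left pixel is 1000 mm away, the right one unreadable.
pts = depth_to_points([[1000, 0]], fx=500.0, fy=500.0, cx=0.5, cy=0.0)
```

The inverse mapping (point cloud back to depth image) just projects each point with the same intrinsics and writes z at the resulting pixel.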
The device extracts a background image corresponding to the original picture based on the depth image information: a foreground image and a background image can be extracted from the depth image information. The foreground image can be understood as the subject part of the whole picture; for example, in a portrait photo, the person is the foreground image. The background image is the non-subject part of the whole picture, that is, the part to be blurred.
Further, in order to efficiently and accurately extract the background image corresponding to the original picture, S101 may include S1011 to S1012; as shown in fig. 2, S1011 to S1012 are specifically as follows:
S1011: acquiring an original picture and depth image information corresponding to the original picture.
The step is the same as the step in S101, and reference may be specifically made to the related description in S101, which is not described herein again.
S1012: carrying out binarization processing on the depth image information to obtain a background image corresponding to the original picture.
The device carries out binarization processing on the depth image information. Binarizing an image means setting the gray value of every point to either 0 or 255, so that the whole image presents an obvious black-and-white effect: a suitable threshold is applied to the 256-level grayscale image to obtain a binary image that still reflects the overall and local features of the original. Binary images play a very important role in digital image processing; many practical systems are built around binary image processing, and when a binary image is to be processed and analysed, the grayscale image is first binarized. This benefits further processing of the image: the set properties of the image depend only on the positions of the points whose pixel values are 0 or 255, multi-level pixel values are no longer involved, the processing is simplified, and the amount of data to process and compress is small. To obtain an ideal binary image, non-overlapping regions are generally delimited by closed, connected boundaries. All pixels whose gray level is greater than or equal to the threshold are judged to belong to the specific object and are represented by gray level 255; otherwise the pixel is excluded from the object region and given gray level 0, representing the background or an exceptional object region. If a specific object has uniform gray levels inside it and sits on a uniform background of a different gray level, a comparable segmentation result can be obtained with this thresholding method.
If the difference between the object and the background is not expressed in gray values (for example, they differ in texture), that distinguishing feature can first be converted into a gray-level difference, and the image can then be segmented with a threshold selection technique. In other words, by binarizing the depth image information, the foreground picture and the background picture corresponding to it can be distinguished quickly and efficiently. The device binarizes the depth image information to obtain the background image corresponding to the original picture.
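As a minimal sketch of this step, the depth binarization can be a single threshold on the depth values; the 1500 mm cut-off below is an assumed illustrative value, not one specified by the patent:

```python
def binarize_depth(depth, threshold):
    """Split a depth map into foreground (0) and background (255).

    Pixels whose depth value exceeds `threshold` are treated as
    background, matching the idea of separating the subject from
    everything behind it.
    """
    return [[255 if d > threshold else 0 for d in row] for row in depth]

# A toy 3x3 depth map (millimetres): the centre pixel is a near subject,
# the rest is far background.
depth = [
    [3000, 3000, 3000],
    [3000,  800, 3000],
    [3000, 3000, 3000],
]
mask = binarize_depth(depth, threshold=1500)
```

The resulting 0/255 mask plays the role of the binarized background image: 255 marks pixels to blur, 0 marks the subject.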
S102: blurring the background image in the original picture based on the depth image information to obtain a background blurring image.
The device blurs the background image in the original picture based on the depth image information to obtain a background blurring image, that is, an image with a shallow depth of field whose focus is on the subject. The device derives the blurring parameters needed to blur the background image from the depth image information and blurs the background image with these parameters to obtain the background blurring image. As shown in fig. 3, which is a background blurring diagram, the subject of the picture is a person and the background has been blurred.
Further, the depth image information includes a depth value for each pixel point. To obtain the background blurring image accurately, S102 may include: performing background blurring processing on the original picture based on the depth value of each pixel point to obtain the background blurring image of the original picture. From the depth image information the device obtains the depth value of each pixel point and can distinguish the depth values of pixels in the background image from those in the foreground image. The difference between these depth values can then be determined, and from the background depth values, the foreground depth values, and their difference the device can derive the blurring parameters needed to blur the background image. The blurring parameters may include a blurring parameter for each pixel point in the original picture, and each pixel point is blurred according to its parameter to obtain the background blurring image.
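A toy version of this depth-dependent blurring: the blur radius of a background pixel grows with its depth gap from the foreground subject, and the pixel is replaced by a box average over that radius. The scale factor and maximum radius are assumed values for illustration, not parameters from the patent:

```python
def blur_radius(depth_px, fg_depth, max_radius=4, scale=1000.0):
    # Larger depth gap from the foreground subject -> stronger blur.
    return min(max_radius, int(abs(depth_px - fg_depth) / scale))

def box_blur_pixel(img, x, y, r):
    # Average the (2r+1) x (2r+1) neighbourhood, clipped at the image border.
    h, w = len(img), len(img[0])
    vals = [img[j][i]
            for j in range(max(0, y - r), min(h, y + r + 1))
            for i in range(max(0, x - r), min(w, x + r + 1))]
    return round(sum(vals) / len(vals))

img = [[0, 255, 0],
       [0, 255, 0],
       [0, 255, 0]]
# A background pixel 2500 mm behind the subject gets radius 2.
r = blur_radius(depth_px=3300, fg_depth=800)
blurred = box_blur_pixel(img, x=1, y=1, r=r)
```

Applying `box_blur_pixel` only where the background mask is 255 leaves the subject sharp while softening the background, which is the essence of the step above.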
S103: and determining light spot parameters corresponding to light spots meeting a first preset condition based on the background image.
The background image contains a number of light spots, and a first preset condition for screening them is preset in the device. The light spot parameters corresponding to the light spots that meet this condition are determined based on the background image: the light spots in the background image are screened according to the first preset condition to obtain the qualifying light spots and their corresponding light spot parameters. The light spot parameters are used to generate the light spot image and include its pixel information, size information, and so on.
S104: and generating a spot pattern based on the spot parameters.
The device obtains the spot parameters for generating a light spot image; a schematic diagram of such a light spot image is shown in fig. 4, in which the circles are the light spots. The spot parameters may include the radius information of each circular spot and its pixel value, and the device generates the light spot image based on these parameters. The light spot image is then fused with the background blurring image to obtain the final target background blurring image corresponding to the original picture.
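A minimal sketch of spot image generation: the side length is derived from the spot radius (an assumed convention, since the text only says the generation parameters include a pixel value and side-length information), and pixels inside the circle take the spot's pixel value:

```python
def make_spot_map(radius, pixel_value):
    """Render one circular spot into a square patch.

    Side length is 2 * radius + 1 so the circle fits exactly; pixels
    inside the circle take `pixel_value`, the rest stay 0.
    """
    side = 2 * radius + 1
    c = radius  # centre of the patch
    return [[pixel_value if (x - c) ** 2 + (y - c) ** 2 <= radius ** 2 else 0
             for x in range(side)]
            for y in range(side)]

# A radius-2 spot with brightness 255, as one patch of the spot image.
spot = make_spot_map(2, 255)
```

A full light spot image would paste one such patch at each detected spot's centre position.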
S105: and carrying out fusion processing on the background blurring graph and the light spot graph to obtain a target background blurring graph corresponding to the original picture.
The device fuses the background blurring image and the light spot image to obtain the target background blurring image corresponding to the original picture. The specific fusion method may be a fusion calculation on the pixel parameters of the background blurring image and the light spot image, or the light spot image may be used to add a light spot special effect to the background blurring image. As shown in fig. 5, compared with fig. 3, circular light spots have been added to the blurred picture; the circular spots in the picture background are the effect of fusing the light spot image into the background blurring image.
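The per-pixel fusion can be sketched as a weighted blend. Here `alpha` stands in for the fusion coefficient that the patent derives from the spot image's pixel value; the weighting formula below is an illustrative choice, not the patented one:

```python
def fuse_pixel(blur_px, spot_px, alpha):
    # alpha in [0, 1]: 0 keeps the background-blur pixel, 1 keeps the spot pixel.
    return round((1 - alpha) * blur_px + alpha * spot_px)

def fuse_images(blur_img, spot_img, alpha):
    # Blend two same-sized single-channel images pixel by pixel.
    return [[fuse_pixel(b, s, alpha) for b, s in zip(brow, srow)]
            for brow, srow in zip(blur_img, spot_img)]

blur_img = [[100, 100], [100, 100]]
spot_img = [[0, 255], [0, 0]]   # one bright spot pixel
fused = fuse_images(blur_img, spot_img, alpha=0.6)
```

With this blend the spot pixels brighten the blurred background while non-spot pixels are only dimmed slightly; a production version would make `alpha` zero wherever the spot image is empty.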
In the embodiment of the application, an original picture and the depth image information corresponding to the original picture are obtained, and a background image corresponding to the original picture is extracted based on the depth image information; the background image in the original picture is blurred based on the depth image information to obtain a background blurring image; light spot parameters corresponding to light spots meeting the preset condition are determined based on the background image; a light spot image is generated based on the light spot parameters; and the background blurring image and the light spot image are fused to obtain the target background blurring image corresponding to the original picture. In this scheme, the brightness value, position, and size of each qualifying circular light spot are obtained by bright-spot detection, a light spot image is generated from them, and the spots are superimposed on the background blurring image by fusion, so that the light spot special effect is added to the blurred background, the visual effect of the background blurring is improved, and memory occupation and algorithm time overhead are reduced.
Referring to fig. 6, fig. 6 is a schematic flowchart of another method for blurring a background of a picture according to a second embodiment of the present application. The execution body of the method in this embodiment is a device having the function of blurring a picture background, which may specifically be a mobile terminal, a computer, a server, or a similar device. To acquire the spot parameters efficiently and accurately and improve the effect of the background blurring, this embodiment differs from the first embodiment in S203 to S205: S201 to S202 here are the same as S101 to S102 in the first embodiment, S206 to S207 here are the same as S104 to S105 in the first embodiment, and S203 to S205 further refine S103. As shown in fig. 6, S203 to S205 are as follows:
s203: and carrying out binarization processing on the background image based on a preset brightness threshold value to obtain a binarization background image corresponding to the background image.
A brightness threshold is preset in the device and is used to binarize the grayscale image of the background image, yielding a binarized background image corresponding to the background image. In practice the brightness threshold may be set to 240; those skilled in the art will appreciate that 240 is only an example and does not limit the value of the brightness threshold. The device acquires a grayscale image of the background image, in which white and black are divided into a number of levels according to a logarithmic relationship, called gray levels; gray scale is commonly divided into 256 levels, and an image represented by gray levels is called a grayscale image. The device binarizes the grayscale image of the background image based on the preset brightness threshold to obtain the binarized background image corresponding to the background image. For the specific binarization procedure, refer to the related description in S1012; it is not repeated here.
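As an illustrative sketch (not part of the patent text), the binarization step can be expressed as follows; the threshold value of 240 follows the example above, while treating the comparison as inclusive (>=) is an assumption of this sketch:

```python
import numpy as np

def binarize_background(gray, threshold=240):
    # Pixels at or above the brightness threshold become 255 (candidate
    # bright spots); all other pixels become 0. Whether the comparison is
    # strict or inclusive is not specified in the text.
    return np.where(gray >= threshold, 255, 0).astype(np.uint8)
```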
S204: and searching the target light spot contour of the target light spot meeting the first preset condition in the binary background image.
A first preset condition is preset in the device and is used to screen out the target light spot contour of a target light spot, i.e. a light spot that will appear in the target background-blurred image obtained by fusing the generated light spot map with the background-blurred image. The device searches the binarized background image for the target light spot contour of the target light spot meeting the first preset condition by contour searching. A contour is the boundary or outline forming an arbitrary shape; a contour-searching function is preset in the device. For example, the FindContours method of the Image<TColor, TDepth> type can conveniently search for contours, but before searching, a color image must be converted to a grayscale image and the grayscale image then converted to a binary image. The device finds light spot contours in the binarized background image through the contour-searching function, screens them based on the first preset condition, and obtains the target light spot contour of the target light spot meeting the first preset condition.
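The patent cites the FindContours method of the Emgu CV Image<TColor, TDepth> type; as a library-neutral sketch, the same candidate regions can be found with a plain connected-component search over the binarized image. The 4-connectivity and the list-of-rows input format are assumptions of this illustration:

```python
from collections import deque

def find_spot_regions(binary):
    # Find connected regions of white (255) pixels in a binarized image,
    # given as a list of rows. Each region approximates one candidate
    # light-spot contour and is returned as a list of (row, col) pixels.
    h, w = len(binary), len(binary[0])
    visited = set()
    regions = []
    for sr in range(h):
        for sc in range(w):
            if binary[sr][sc] == 255 and (sr, sc) not in visited:
                queue = deque([(sr, sc)])
                visited.add((sr, sc))
                region = []
                while queue:
                    r, c = queue.popleft()
                    region.append((r, c))
                    for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        nr, nc = r + dr, c + dc
                        if (0 <= nr < h and 0 <= nc < w
                                and binary[nr][nc] == 255
                                and (nr, nc) not in visited):
                            visited.add((nr, nc))
                            queue.append((nr, nc))
                regions.append(region)
    return regions
```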
S205: and determining the spot parameters of the target spot based on the information of the target spot profile.
After the device acquires the target light spot profile, the device acquires information of the target light spot profile, wherein the information of the target light spot profile can include radius information of the target light spot, diameter information of the target light spot, the number of pixel points included in the target light spot, pixel values of the target light spot and the like, and the device determines light spot parameters of the target light spot based on the information of the target light spot profile.
Further, the shape of the target light spot profile is circular, the number of the target light spot profiles is at least two, and S205 may include S2051 to S2054 for efficiently and accurately obtaining the light spot parameters and improving the effect of background blurring, as shown in fig. 7, S2051 to S2054 are specifically as follows:
s2051: and acquiring radius information, a central point and central point information of a target circumscribed circle corresponding to the target light spot profile under the condition that the number of pixel points in the target light spot profile is greater than a preset number threshold.
The device determines the number of pixel points in the target light spot profile based on the information of the target light spot profile. A number threshold minPoints is preset in the device, for example minPoints = 6, and is used to screen the target light spot profiles. The device compares the number of pixel points in the target light spot profile with the preset number threshold minPoints and, when the number of pixel points is greater than the threshold, acquires the target circumscribed circle corresponding to the target light spot profile, i.e. the minimum circumscribed circle of the profile, together with its radius information, central point and central point information; the radius information is the radius length, and the central point information is the coordinate of the circle center of the target circumscribed circle.
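A hedged sketch of S2051: the patent uses the true minimum circumscribed circle of the profile; the version below approximates it with the centroid as center and the maximum centroid distance as radius, which is an approximation rather than an exact minimum enclosing circle. The threshold of 6 points follows the minPoints example above.

```python
import math

def enclosing_circle(points, min_points=6):
    # Approximate the circumscribed circle of a spot profile given as
    # (x, y) points. Returns (cx, cy, radius), or None when the profile
    # has too few points to be considered (number must exceed min_points).
    if len(points) <= min_points:
        return None
    cx = sum(p[0] for p in points) / len(points)
    cy = sum(p[1] for p in points) / len(points)
    radius = max(math.dist((cx, cy), p) for p in points)
    return cx, cy, radius
```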
S2052: and calculating the standard deviation of the distances between all points on the target light spot profile and the central point.
The device may label a point on the target light spot profile as (x, y) and the central point as (x0, y0), and then calculate the standard deviation of the distances between (x, y) and (x0, y0) over all points on the profile. The standard deviation, commonly denoted σ, is the arithmetic square root of the variance and reflects the degree of dispersion of a data set; two data sets with the same mean do not necessarily have the same standard deviation. For a circular light spot the distances from the profile points to the central point are nearly equal, so a small standard deviation indicates a near-circular profile.
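The circularity check of S2052 can be sketched as below: for a perfect circle every profile point is equidistant from the center, so the standard deviation of those distances is zero. Using the population formula (dividing by n) is an assumption of this sketch.

```python
import math

def distance_std(points, center):
    # Standard deviation of the distances from each profile point to the
    # circle center; low values indicate a near-circular spot profile.
    dists = [math.dist(p, center) for p in points]
    mean = sum(dists) / len(dists)
    return math.sqrt(sum((d - mean) ** 2 for d in dists) / len(dists))
```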
S2053: and under the condition that the radius information meets a second preset condition, screening out a first standard deviation meeting a third preset condition from the standard deviations.
A second preset condition is preset in the device and is used to judge whether the radius information can serve as a light spot parameter. Specifically, the second preset condition may be a numerical interval; the condition is met when the radius information falls within the interval. For example, the second preset condition may be minR < r <= maxR, where r is the radius information and minR to maxR is the interval in the second preset condition, which may be set to 5 to 8. When the radius information meets the second preset condition, a first standard deviation meeting a third preset condition is screened out from the standard deviations. The third preset condition is used to judge which standard deviations can serve as light spot parameters; specifically, a standard deviation smaller than a certain value is recorded as a first standard deviation and can be used as a light spot parameter. For example, the third preset condition may be var < maxVar, where var is the standard deviation and maxVar is the threshold in the third preset condition, which may be set to 8.
S2054: generating a light spot parameter of the target light spot based on the radius information and the target central point information; the target center point information is center point information of a center point corresponding to the first standard deviation.
The device generates the light spot parameter of the target light spot based on the radius information and the target central point information, where the target central point information is the central point information of the central point corresponding to the first standard deviation. After judging that the radius information meets the second preset condition and screening out the first standard deviation meeting the third preset condition, the device acquires the central point information of the central point corresponding to the first standard deviation, i.e. the coordinate of that central point. For example, when the central point information of the central point corresponding to the first standard deviation is (x0, y0) and the radius information is r, the light spot parameter of the target light spot may be recorded as (x0, y0, r).
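Putting S2053 and S2054 together, the screening can be sketched as follows; the thresholds minR = 5, maxR = 8 and maxVar = 8 follow the examples in the text:

```python
def screen_spot(cx, cy, r, var, min_r=5, max_r=8, max_var=8):
    # Second preset condition: min_r < r <= max_r.
    # Third preset condition: var < max_var.
    # When both hold, the spot parameter (x0, y0, r) is generated;
    # otherwise the candidate spot is discarded.
    if min_r < r <= max_r and var < max_var:
        return (cx, cy, r)
    return None
```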
Referring to fig. 8, fig. 8 is a schematic flowchart illustrating another method for blurring a background of a picture according to a third embodiment of the present application. The main execution body of the method for blurring the picture background in this embodiment is a device having a function of blurring the picture background, and may specifically be a mobile terminal, a computer, a server, and other devices. In order to accurately generate the speckle pattern and improve the effect of background blurring, the present embodiment is different from the first embodiment in S304 to S305, where S301 to S303 in the present embodiment are the same as S101 to S103 in the first embodiment, S306 in the present embodiment is the same as S105 in the first embodiment, and S304 to S305 are further refinements of S104. As shown in fig. 8, S304 to S305 are specifically as follows:
s304: determining generation parameters for generating a spot map based on the spot parameters; the generation parameters comprise pixel values and side length information corresponding to the light spot pattern.
In this embodiment, the spot map may be a square. The pixel value and side length information corresponding to the spot map are determined based on the spot parameters; the side length information is the side length of the square spot map and is determined based on the radius information. For example, if the radius information is r, the side length is 2r + 1. The pixel value corresponding to the spot map is the pixel value with which the maximum inscribed circle of the square spot map is filled.
S305: and generating the light spot graph based on the pixel values and the side length information.
And determining parameters of the light spot diagram based on the pixel values and the side length information to generate a square light spot diagram.
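A sketch of S304 to S305: a square spot map of side 2r + 1 whose maximum inscribed circle is filled with the spot's pixel value. Leaving pixels outside the circle at 0 is an assumption of this illustration.

```python
import numpy as np

def make_spot_map(r, pixel_val):
    # Generate a square spot map of side 2r + 1 whose maximum inscribed
    # circle (center (r, r), radius r) is filled with pixel_val.
    side = 2 * r + 1
    spot = np.zeros((side, side), dtype=np.uint8)
    yy, xx = np.mgrid[0:side, 0:side]
    mask = (xx - r) ** 2 + (yy - r) ** 2 <= r ** 2
    spot[mask] = pixel_val
    return spot
```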
Referring to fig. 9, fig. 9 is a schematic flowchart illustrating another method for blurring a background of a picture according to a fourth embodiment of the present application. The execution subject of the method for blurring the picture background in this embodiment is a device having a function of blurring the picture background, which may be a mobile terminal, a computer, a server, or other devices. In order to better fuse the background blurring map and the flare map and improve the fusion effect, the difference between the present embodiment and the third embodiment is S406 to S407, where S401 to S405 in the present embodiment are the same as S301 to S305 in the third embodiment, and S406 to S407 are further refinements of S306. As shown in fig. 9, S406 to S407 are specifically as follows:
s406: a fusion coefficient is determined based on the pixel values.
The device determines a fusion coefficient based on the pixel value; the fusion coefficient is used for fusing the background blurring map and the spot map. A relationship between the pixel value and the fusion coefficient is preset in the device: for example, when the pixel value is pixelVal, the device determines the fusion coefficient as a = pixelVal / 512. That is, the device determines the fusion coefficient from the pixel value according to the preset relationship.
S407: and carrying out fusion processing on the background blurring graph and the light spot graph based on the fusion coefficient to obtain a target background blurring graph corresponding to the original picture.
The device is preset with a fusion strategy, which is used for performing fusion processing on the background blurring map and the spot map, for example, the preset fusion strategy may be a × lightSpot + (1-a) × Bokeh, where a is a fusion coefficient, lightSpot is the spot map, and Bokeh is the background blurring map. And performing fusion processing on the background blurring graph and the light spot graph based on the fusion coefficient to obtain a target background blurring graph corresponding to the original picture.
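The fusion of S406 to S407 can be sketched with the formula given above, a * lightSpot + (1 - a) * Bokeh with a = pixelVal / 512. Applying the blend to an entire patch rather than only inside the circular spot region is a simplification of this sketch.

```python
import numpy as np

def blend_spot(bokeh_patch, spot_map, pixel_val):
    # Fusion coefficient from the preset relationship a = pixelVal / 512.
    a = pixel_val / 512.0
    # Preset fusion strategy: a * lightSpot + (1 - a) * Bokeh.
    out = (a * spot_map.astype(np.float32)
           + (1.0 - a) * bokeh_patch.astype(np.float32))
    return np.clip(out, 0, 255).astype(np.uint8)
```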
It should be understood that, the sequence numbers of the steps in the foregoing embodiments do not imply an execution sequence, and the execution sequence of each process should be determined by functions and internal logic of the process, and should not constitute any limitation to the implementation process of the embodiments of the present application.
Referring to fig. 10, fig. 10 is a schematic diagram of an apparatus for blurring a background of a picture according to a fifth embodiment of the present application. The apparatus includes units for executing the steps in the embodiments corresponding to fig. 1-2 and fig. 6-9; for details, refer to the related descriptions in the embodiments corresponding to fig. 1-2 and fig. 6-9. For convenience of explanation, only the portions related to the present embodiment are shown. Referring to fig. 10, the apparatus 10 for blurring a picture background includes:
an obtaining unit 1010, configured to obtain an original picture and depth image information corresponding to the original picture, and extract a background image corresponding to the original picture based on the depth image information;
a first processing unit 1020, configured to perform blurring processing on the background image in the original picture based on the depth image information to obtain a background blurring map;
a first determining unit 1030, configured to determine, based on the background image, a light spot parameter corresponding to a light spot meeting a first preset condition;
a generating unit 1040, configured to generate a spot map based on the spot parameter;
the second processing unit 1050 is configured to perform fusion processing on the background blurring map and the spot map to obtain a target background blurring map corresponding to the original picture.
Further, the depth map information includes a depth value of each pixel point, and the first processing unit 1020 is specifically configured to:
and performing background blurring processing on the original picture based on the depth value of each pixel point to obtain a background blurring picture of the original picture.
Further, the first determining unit 1030 includes:
the third processing unit is used for carrying out binarization processing on the background image based on a preset brightness threshold value to obtain a binarization background image corresponding to the background image;
the searching unit is used for searching a target light spot outline of a target light spot meeting the first preset condition in the binaryzation background image;
and the second determining unit is used for determining the spot parameters of the target spot based on the information of the target spot profile.
Further, the shape of the target light spot profile is circular;
the second determining unit is specifically configured to:
acquiring radius information, a central point and central point information of a target circumscribed circle corresponding to the target light spot profile under the condition that the number of pixel points in the target light spot profile is greater than a preset number threshold;
calculating the standard deviation of the distances between all points on the target light spot profile and the central point;
screening out a first standard deviation meeting a third preset condition from the standard deviations under the condition that the radius information meets a second preset condition;
generating a light spot parameter of the target light spot based on the radius information and the target central point information; the target center point information is center point information of a center point corresponding to the first standard deviation.
Further, the generating unit 1040 is specifically configured to:
determining generation parameters for generating a spot map based on the spot parameters; the generation parameters comprise pixel values and side length information corresponding to the light spot pattern;
and generating the light spot graph based on the pixel values and the side length information.
Further, the second processing unit 1050 is specifically configured to:
determining a fusion coefficient based on the pixel values;
and carrying out fusion processing on the background blurring graph and the light spot graph based on the fusion coefficient to obtain a target background blurring graph corresponding to the original picture.
Further, the obtaining unit 1010 is specifically configured to:
acquiring an original picture and depth image information corresponding to the original picture;
and carrying out binarization processing on the depth image information to obtain a background image corresponding to the original image.
Fig. 11 is a schematic diagram of an apparatus for blurring a background of a picture according to a sixth embodiment of the present application. As shown in fig. 11, the apparatus 11 for blurring a picture background of this embodiment includes: a processor 110, a memory 111 and a computer program 112, such as a picture background blurring program, stored in the memory 111 and executable on the processor 110. The processor 110, when executing the computer program 112, implements the steps in the above method embodiments for blurring the background of the picture, such as steps 101 to 105 shown in fig. 1. Alternatively, the processor 110, when executing the computer program 112, implements the functions of the modules/units in the above apparatus embodiments, such as the functions of the modules 1010 to 1050 shown in fig. 10.
Illustratively, the computer program 112 may be partitioned into one or more modules/units that are stored in the memory 111 and executed by the processor 110 to accomplish the present application. The one or more modules/units may be a series of computer program instruction segments capable of performing specific functions, which are used to describe the execution process of the computer program 112 in the picture background blurring device 11. For example, the computer program 112 may be divided into an acquisition unit, a first processing unit, a first determination unit, a generation unit, and a second processing unit, and the specific functions of each unit are as follows:
the device comprises an acquisition unit, a processing unit and a processing unit, wherein the acquisition unit is used for acquiring an original picture and depth image information corresponding to the original picture and extracting a background image corresponding to the original picture based on the depth image information;
the first processing unit is used for blurring the background image in the original image based on the depth image information to obtain a background blurring image;
the first determining unit is used for determining light spot parameters corresponding to light spots meeting a first preset condition based on the background image;
the generating unit is used for generating a spot diagram based on the spot parameters;
and the second processing unit is used for carrying out fusion processing on the background blurring graph and the light spot graph to obtain a target background blurring graph corresponding to the original picture.
The picture background blurring device may include, but is not limited to, a processor 110 and a memory 111. It will be appreciated by those skilled in the art that fig. 11 is merely an example of the device 11 for blurring picture background, and does not constitute a limitation of the device 11 for blurring picture background, and may include more or less components than those shown, or some components may be combined, or different components, for example, the device for blurring picture background may further include an input-output device, a network access device, a bus, etc.
The Processor 110 may be a Central Processing Unit (CPU), another general purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, discrete hardware components, etc. A general purpose processor may be a microprocessor, or the processor may be any conventional processor or the like.
The storage 111 may be an internal storage unit of the device 11 for blurring picture backgrounds, such as a hard disk or a memory of the device 11 for blurring picture backgrounds. The memory 111 may also be an external storage device of the picture background blurring device 11, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card), and the like, which are equipped on the picture background blurring device 11. Further, the memory 111 may also include both an internal storage unit of the device 11 for blurring the background of the picture and an external storage device. The memory 111 is used for storing the computer program and other programs and data required by the picture background blurring device 11. The memory 111 may also be used to temporarily store data that has been output or is to be output.
It should be clear to those skilled in the art that, for convenience and simplicity of description, the foregoing division of the functional units and modules is only used for illustration, and in practical applications, the above function distribution may be performed by different functional units and modules as needed, that is, the internal structure of the apparatus may be divided into different functional units or modules to perform all or part of the above described functions. Each functional unit and module in the embodiments may be integrated in one processing unit, or each unit may exist alone physically, or two or more units are integrated in one unit, and the integrated unit may be implemented in a form of hardware, or in a form of software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working processes of the units and modules in the system may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus/terminal device and method may be implemented in other ways. For example, the above-described embodiments of the apparatus/terminal device are merely illustrative, and for example, the division of the modules or units is only one logical division, and there may be other divisions when actually implemented, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated modules/units, if implemented in the form of software functional units and sold or used as separate products, may be stored in a computer-readable storage medium. Based on such understanding, all or part of the flow in the methods of the embodiments described above can be realized by a computer program, which can be stored in a computer-readable storage medium and, when executed by a processor, realizes the steps of the method embodiments described above. The computer program comprises computer program code, which may be in the form of source code, object code, an executable file, some intermediate form, etc. The computer-readable medium may include: any entity or device capable of carrying the computer program code, a recording medium, a USB flash drive, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a Read-Only Memory (ROM), a Random Access Memory (RAM), an electrical carrier signal, a telecommunications signal, a software distribution medium, and the like. It should be noted that the content contained in the computer-readable medium may be appropriately increased or decreased as required by legislation and patent practice in a jurisdiction; for example, in some jurisdictions, computer-readable media do not include electrical carrier signals and telecommunications signals in accordance with legislation and patent practice.
The above-mentioned embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present application and are intended to be included within the scope of the present application.
Claims (8)
1. A method for blurring a background of a picture, comprising:
acquiring an original picture and depth image information corresponding to the original picture, and extracting a background image corresponding to the original picture based on the depth image information;
blurring the background image in the original image based on the depth image information to obtain a background blurring image;
determining light spot parameters corresponding to light spots meeting a first preset condition based on the background image; wherein the determining light spot parameters corresponding to light spots meeting a first preset condition based on the background image comprises: carrying out binarization processing on the background image based on a preset brightness threshold value to obtain a binarized background image corresponding to the background image; searching a target light spot profile of a target light spot meeting the first preset condition in the binarized background image; and determining the light spot parameters of the target light spot based on information of the target light spot profile; wherein the shape of the target light spot profile is circular; and the determining the light spot parameters of the target light spot based on the information of the target light spot profile comprises: acquiring radius information, a central point and central point information of a target circumscribed circle corresponding to the target light spot profile under the condition that the number of pixel points in the target light spot profile is greater than a preset number threshold; calculating the standard deviation of the distances between all points on the target light spot profile and the central point; screening out a first standard deviation meeting a third preset condition from the standard deviations under the condition that the radius information meets a second preset condition; and generating a light spot parameter of the target light spot based on the radius information and target central point information, the target central point information being central point information of a central point corresponding to the first standard deviation;
generating a spot map based on the spot parameters;
and carrying out fusion processing on the background blurring graph and the light spot graph to obtain a target background blurring graph corresponding to the original picture.
2. The method of claim 1, wherein the depth image information comprises a depth value for each pixel point;
the blurring the background image in the original picture based on the depth image information to obtain a background blurring image, including:
and performing background blurring processing on the original picture based on the depth value of each pixel point to obtain a background blurring picture of the original picture.
3. The method of claim 1, wherein generating a spot pattern based on the spot parameters comprises:
determining generation parameters for generating a spot map based on the spot parameters; the generation parameters comprise pixel values and side length information corresponding to the light spot pattern;
and generating the light spot graph based on the pixel values and the side length information.
4. The method of claim 3, wherein the fusing the background blurring map and the flare map to obtain a target background blurring map corresponding to the original picture comprises:
determining a fusion coefficient based on the pixel values;
and carrying out fusion processing on the background blurring graph and the light spot graph based on the fusion coefficient to obtain a target background blurring graph corresponding to the original picture.
5. The method of any one of claims 1-4, wherein the obtaining an original picture and depth image information corresponding to the original picture, and extracting a background image corresponding to the original picture based on the depth image information comprises:
acquiring an original picture and depth image information corresponding to the original picture;
and carrying out binarization processing on the depth image information to obtain a background image corresponding to the original image.
6. An apparatus for blurring a background of a picture, comprising:
an acquisition unit, configured to acquire an original picture and depth image information corresponding to the original picture, and to extract a background image corresponding to the original picture based on the depth image information;
a first processing unit, configured to blur the background image in the original picture based on the depth image information, to obtain a background-blurred image;
a first determining unit, configured to determine, based on the background image, light spot parameters corresponding to light spots meeting a first preset condition; wherein determining the light spot parameters comprises:
binarizing the background image based on a preset brightness threshold, to obtain a binarized background image corresponding to the background image;
searching the binarized background image for a target light spot profile of a target light spot meeting the first preset condition, the shape of the target light spot profile being circular; and
determining the light spot parameters of the target light spot based on information of the target light spot profile, which comprises:
acquiring, when the number of pixel points within the target light spot profile is greater than a preset number threshold, radius information, a center point, and center point information of a circumscribed circle corresponding to the target light spot profile;
calculating the standard deviation of the distances between all points on the target light spot profile and the center point;
screening out, when the radius information meets a second preset condition, a first standard deviation meeting a third preset condition from the standard deviations; and
generating the light spot parameters of the target light spot based on the radius information and target center point information, the target center point information being the center point information of the center point corresponding to the first standard deviation;
a generating unit, configured to generate a light spot map based on the light spot parameters; and
a second processing unit, configured to fuse the background-blurred image and the light spot map, to obtain a target background-blurred image corresponding to the original picture.
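The spot-detection conditions in the first determining unit (enough pixel points, bounded radius, near-constant distance from contour points to the center) can be sketched as below. All thresholds are illustrative, and the contour-point mean is used as a stand-in for the circumscribed-circle center; the patent leaves the exact values and the circle-fitting method open.

```python
import numpy as np

def spot_params(contour, min_points=10, max_radius=30.0, std_ratio=0.15):
    """Check a candidate contour against the claim's conditions and
    return the spot parameters, or None if any condition fails."""
    pts = np.asarray(contour, np.float32)
    if len(pts) <= min_points:                  # preset number threshold
        return None
    center = pts.mean(axis=0)                   # stand-in for circle center
    dists = np.linalg.norm(pts - center, axis=1)
    radius = float(dists.max())
    if radius > max_radius:                     # second preset condition
        return None
    if dists.std() > std_ratio * dists.mean():  # third preset condition:
        return None                             # low std => roughly circular
    return {"center": tuple(center), "radius": radius}
```

A circular contour passes (distances to the center are nearly constant, so the standard deviation is small), while an elongated contour is rejected.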
7. A terminal device comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor, when executing the computer program, implements the method according to any one of claims 1 to 5.
8. A computer-readable storage medium storing a computer program which, when executed by a processor, implements the method according to any one of claims 1 to 5.
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN201911369473.8A CN113052754B (en) | 2019-12-26 | 2019-12-26 | Method and device for blurring picture background |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| CN113052754A CN113052754A (en) | 2021-06-29 |
| CN113052754B true CN113052754B (en) | 2022-06-07 |
Family
ID=76505568
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CN201911369473.8A Active CN113052754B (en) | 2019-12-26 | 2019-12-26 | Method and device for blurring picture background |
Country Status (1)
| Country | Link |
|---|---|
| CN (1) | CN113052754B (en) |
Families Citing this family (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN113658241B (en) * | 2021-08-16 | 2022-12-16 | 合肥的卢深视科技有限公司 | Monocular structured light depth recovery method, electronic device and storage medium |
| CN114372931B (en) * | 2021-12-31 | 2025-03-11 | 原力图新(重庆)科技有限公司 | A method, device, storage medium and electronic device for blurring a target object |
| CN117693767A (en) * | 2022-06-20 | 2024-03-12 | 北京小米移动软件有限公司 | Image processing method and device, electronic equipment and storage medium |
| WO2023245364A1 (en) * | 2022-06-20 | 2023-12-28 | 北京小米移动软件有限公司 | Image processing method and apparatus, electronic device, and storage medium |
| CN116977804A (en) * | 2023-05-26 | 2023-10-31 | 北京迈格威科技有限公司 | Image fusion method, electronic device, storage medium and computer program product |
| CN118509715B (en) * | 2024-05-14 | 2025-01-28 | 荣耀终端有限公司 | Light spot display method, device and electronic equipment |
Citations (7)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN103945118A (en) * | 2014-03-14 | 2014-07-23 | 华为技术有限公司 | Picture blurring method and device and electronic equipment |
| US9412170B1 (en) * | 2015-02-25 | 2016-08-09 | Lite-On Technology Corporation | Image processing device and image depth processing method |
| WO2017012418A1 (en) * | 2015-07-21 | 2017-01-26 | 深圳Tcl数字技术有限公司 | Image processing method and apparatus |
| CN106683147A (en) * | 2017-01-23 | 2017-05-17 | 浙江大学 | Method of image background blur |
| CN106993112A (en) * | 2017-03-09 | 2017-07-28 | 广东欧珀移动通信有限公司 | Background virtualization method and device based on depth of field and electronic device |
| CN107977940A (en) * | 2017-11-30 | 2018-05-01 | 广东欧珀移动通信有限公司 | background blurring processing method, device and equipment |
| CN108399596A (en) * | 2018-02-07 | 2018-08-14 | 深圳奥比中光科技有限公司 | Depth image engine and depth image computational methods |
Family Cites Families (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN106504220B (en) * | 2016-08-19 | 2019-07-23 | 华为机器有限公司 | A kind of image processing method and device |
| TWI689892B (en) * | 2018-05-18 | 2020-04-01 | 瑞昱半導體股份有限公司 | Background blurred method and electronic apparatus based on foreground image |
- 2019-12-26: CN application CN201911369473.8A filed (granted as patent CN113052754B, status Active)
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| CN113052754B (en) | Method and device for blurring picture background | |
| CN110766679B (en) | Lens contamination detection method and device and terminal equipment | |
| CN108323204B (en) | Method for detecting face flaw point and intelligent terminal | |
| CN109146855B (en) | Image moire detection method, terminal device and storage medium | |
| CN110852997B (en) | Dynamic image definition detection method and device, electronic equipment and storage medium | |
| CN110059702B (en) | Object contour recognition method and device | |
| CN115908269A (en) | Visual defect detection method and device, storage medium and computer equipment | |
| CN113077459B (en) | Image definition detection method and device, electronic equipment and storage medium | |
| CN110909640A (en) | Method and device for determining water level line, storage medium and electronic device | |
| CN112464829B (en) | Pupil positioning method, pupil positioning equipment, storage medium and sight tracking system | |
| CN113569713B (en) | Video image stripe detection method and device, and computer readable storage medium | |
| CN109002823B (en) | Region-of-interest determining method, device, equipment and readable storage medium | |
| CN111630569B (en) | Binocular matching method, visual imaging device and device with storage function | |
| CN108182666B (en) | Parallax correction method, device and terminal | |
| CN114677394A (en) | Matting method, matting device, image pickup apparatus, conference system, electronic apparatus, and medium | |
| CN117635615B (en) | Defect detection method and system for realizing punching die based on deep learning | |
| CN116249015A (en) | Camera occlusion detection method, device, camera equipment and storage medium | |
| CN117745552A (en) | Self-adaptive image enhancement method and device and electronic equipment | |
| CN119671947A (en) | A highway engineering quality detection method and system based on image recognition | |
| CN119722682B (en) | A method and system for online detection of cable appearance particles | |
| CN113936026A (en) | Image processing method, device, electronic device and storage medium | |
| CN114266903B (en) | Root system image processing method | |
| CN108960247A (en) | Image significance detection method, device and electronic equipment | |
| JP2020108060A (en) | Deposit detector and deposit detection method | |
| CN114140481A (en) | Edge detection method and device based on infrared image |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| PB01 | Publication ||
| SE01 | Entry into force of request for substantive examination ||
| GR01 | Patent grant ||