CN119767129A - A dynamic image processing method, device, electronic device and storage medium - Google Patents
Abstract
The invention provides a dynamic image processing method and device. The method comprises: calling a preset light supplementing lamp array according to current camera parameters and ambient light parameters; determining, according to the camera parameters and the ambient light parameters and through the relationship between the image and the optical parameters, a matching relationship between the output image required for shooting and the camera parameters and the ambient light parameters; and starting the camera and the light supplementing lamp array, and dynamically adjusting the camera parameters and the light supplementing lamp array parameters according to the matching relationship until the difference between the output image parameters and the shooting requirement parameters meets a preset threshold. The method improves image output quality, enhances adaptability and flexibility, broadens the range of complex and changeable shooting environments that can be handled, improves shooting efficiency, reaches the desired image effect automatically and quickly, and reduces post-processing and manual intervention costs.
Description
Technical Field
The present invention relates to the field of imaging technologies, and in particular, to a method and apparatus for processing a moving image, an electronic device, and a storage medium.
Background
With the continuous development of electronic technology, the quality requirements for images produced by cameras and video cameras are increasingly high, and such images play an important role in applications such as image recognition, vehicle and pedestrian recognition, license plate recognition, robot navigation and special-scene photography. In many of these applications, a light supplementing lamp is generally required to provide fill light, so that a terminal device such as a computer can better process the dynamic images output by the camera and the output image quality is improved.
However, in the prior art a single light supplementing lamp is generally used to provide fill light. During image processing, the image can only be processed according to that single fill-light effect, and the output image cannot be adjusted according to the fill-light effect, so a good image output effect is difficult to achieve. Accordingly, the present invention provides a dynamic image processing method, apparatus, electronic device and storage medium that solve at least some of the above problems.
Disclosure of Invention
The present invention has been made in view of the above-mentioned problems, and has as its object to provide a moving image processing method, apparatus, electronic device, and storage medium that overcome or at least partially solve the above-mentioned problems.
In order to solve the above problems, the present invention discloses a moving image processing method comprising:
invoking a preset light supplementing lamp array according to the current camera parameters and the ambient light parameters;
according to the camera parameters and the ambient light parameters, and through the relation between the images and the optical parameters, determining the matching relation between the output images required to be shot and the camera parameters and the ambient light parameters;
And starting a camera and a light supplementing lamp array, and dynamically adjusting the parameters of the camera and the parameters of the light supplementing lamp array according to the matching relation until the difference value between the output image parameters and the shooting requirement parameters meets a preset threshold value.
Optionally, before the calling the preset light compensating lamp array according to the current camera parameter and the ambient light parameter, the method further includes:
Camera parameters and ambient light parameters are acquired.
Optionally, the acquiring the camera parameter and the ambient light parameter includes:
the method comprises the steps of obtaining camera parameters including an aperture coefficient f, a shutter speed t, a sensitivity ISO, a color correction parameter M and a sharpening parameter L, and obtaining an ambient light parameter including a color temperature parameter and an ambient illuminance parameter in the current environment.
Optionally, the calling the preset light compensating lamp array according to the current camera parameter and the ambient light parameter includes:
Determining a camera parameter value range according to the sensor and lens parameters of the current camera;
And calling the light supplementing lamp array according to the camera parameter value range, and determining the ambient light parameter range for the combination of the light supplementing lamp array and natural light.
Optionally, the determining, according to the camera parameter and the ambient light parameter and through the relationship between the image and the optical parameter, the matching relationship between the output image required for shooting and the camera parameter and the ambient light parameter includes:
determining a matching relationship between the output image and the current camera parameters, namely the aperture coefficient f, the shutter speed t, the sensitivity ISO, the color correction parameter M and the sharpening parameter L, together with the ambient light parameters including the color temperature and the ambient illuminance.
Optionally, the determining of the matching relationship between the current camera parameters (aperture coefficient f, shutter speed t, sensitivity ISO, color correction parameter M and sharpening parameter L), the ambient light parameters (color temperature and ambient illuminance) and the output image includes:
determining the matching relation of exposure compensation, color and white balance existing between the current camera parameters and the current ambient light parameters through an exposure compensation algorithm, a color correction algorithm and a white balance algorithm;
wherein, the white balance algorithm is as follows:
Ī_R = (1/N)·Σ_(x,y) I_R(x,y), Ī_G = (1/N)·Σ_(x,y) I_G(x,y), Ī_B = (1/N)·Σ_(x,y) I_B(x,y), K = (Ī_R + Ī_G + Ī_B)/3, g_R = K/Ī_R, g_G = K/Ī_G, g_B = K/Ī_B, I'_R(x,y) = g_R·I_R(x,y), I'_G(x,y) = g_G·I_G(x,y), I'_B(x,y) = g_B·I_B(x,y); in the formula, I(x,y) represents the pixel with coordinates (x,y) in the image, the subscripts R, G, B of I denote the pixel values of the red, green and blue channel information respectively, Ī_R, Ī_G, Ī_B are respectively the average pixel values of the red, green and blue channels in the image, N is the total number of pixels in the image, K is the gray value, and g_R, g_G, g_B are the gain coefficients of the red, green and blue channels respectively;
the color correction algorithm formula is as follows:
[R', G', B']^T = M·[R, G, B]^T, M = [m_ij], where T represents the transpose, M is the color correction matrix, and m_ij is the element in the i-th row and j-th column of the matrix M; [R, G, B]^T is the color vector of a pixel of the original image, and [R', G', B']^T is the color-corrected pixel color vector;
the formula of the exposure compensation algorithm is as follows:
E = (t·ISO)/f², E' = 2^EV·E; in the formula, E is the exposure quantity, f the aperture coefficient, t the shutter speed, ISO the sensitivity, EV the exposure compensation value, and E' the exposure after compensation.
Optionally, the determining of the matching relationship between the current camera parameters (aperture coefficient f, shutter speed t, light sensitivity ISO, color correction parameter M and sharpening parameter L), the ambient light parameters (color temperature and ambient illuminance) and the output image further includes:
According to the current camera parameters and the ambient light parameters, determining that a noise reduction matching relationship and a sharpening matching relationship exist between the current camera parameters and the output image through a noise reduction algorithm and an image sharpening algorithm;
Wherein, the noise reduction algorithm formula:
I'(x,y) = (1/((2n+1)·(2m+1)))·Σ_((i,j)∈S) I(i,j),
In the formula, I(x,y) represents the pixel value with coordinates (x,y) in the image, I'(x,y) represents the pixel value with coordinates (x,y) in the image after mean filtering, n and m are respectively used to define the radius of the filtering window in the x and y directions, and S represents the set of pixels in the window;
The image sharpening algorithm has the specific formula:
I'(x,y) = I(x,y) + k·(L*I)(x,y),
in the formula, I(x,y) represents the pixel value with coordinates (x,y) in the image, I'(x,y) represents the pixel value after sharpening, L is the Laplacian operator matrix, k is the sharpening intensity coefficient, and L*I represents the convolution of the Laplacian operator L with the image I.
The invention also discloses a moving image processing apparatus for the above moving image processing method, comprising:
the light supplementing and calling module is used for calling a preset light supplementing lamp array according to the current camera parameters and the environment light parameters;
the parameter matching module is used for determining the matching relation between the output image required to be shot and the camera parameter and the ambient light parameter according to the camera parameter and the ambient light parameter and through the relation between the image and the optical parameter;
And the parameter adjusting module is used for starting the camera and the light supplementing lamp array, and dynamically adjusting the parameters of the camera and the light supplementing lamp array according to the matching relation until the difference value between the output image parameter and the shooting requirement parameter meets a preset threshold value.
The embodiment of the invention also discloses an electronic device which comprises a processor, a memory and a computer program stored on the memory and capable of running on the processor, wherein the computer program realizes the steps of the dynamic image processing method when being executed by the processor.
The embodiment of the invention also discloses a computer readable storage medium, wherein the computer readable storage medium stores a computer program, and the computer program realizes the steps of the dynamic image processing method when being executed by a processor.
The method has the advantage that a preset light supplementing lamp array is called according to the current camera parameters and the ambient light parameters; the matching relationship between the output image required for shooting and the camera parameters and the ambient light parameters is determined according to the camera parameters and the ambient light parameters and through the relationship between the image and the optical parameters; and the camera and the light supplementing lamp array are started, with the camera parameters and the light supplementing lamp array parameters dynamically adjusted according to the matching relationship until the difference between the output image parameters and the shooting requirement parameters meets the preset threshold. This improves the image output quality, giving better brightness, contrast and color reproduction and fewer image defects, and meets the demand of many fields for high-quality images; it enhances adaptability and flexibility, copes with complex and changeable shooting environments and expands the range of application; and it improves shooting efficiency, reaching the desired image effect automatically and quickly and reducing post-processing and manual intervention costs, which is a clear advantage in batch shooting tasks or tasks with strict timeliness requirements, so the method has high application and innovation value.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings that are needed in the embodiments will be briefly described below, it being understood that the following drawings only illustrate some embodiments of the present invention and therefore should not be considered as limiting the scope, and other related drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a flow chart of steps of a method for processing dynamic images according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of a dynamic image processing apparatus according to an embodiment of the present invention;
fig. 3 is a schematic structural diagram of a computer device according to an embodiment of the present invention.
Detailed Description
In order that the above-recited objects, features and advantages of the present invention will become more readily apparent, a more particular description of the invention will be rendered by reference to the appended drawings and appended detailed description. It will be apparent that the described embodiments are some, but not all, embodiments of the invention. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
Referring to fig. 1, an embodiment of the present invention provides a step flowchart of a dynamic image processing method, including:
S100, calling a preset light supplementing lamp array according to current camera parameters and ambient light parameters;
S200, determining a matching relation between an output image required to be shot and the camera parameter and the ambient light parameter according to the camera parameter and the ambient light parameter and through the relation between the image and the optical parameter;
And S300, starting a camera and a light supplementing lamp array, and dynamically adjusting the camera parameters and the light supplementing lamp array parameters according to the matching relation until the difference value between the output image parameters and the shooting requirement parameters meets a preset threshold value.
In this embodiment, the above dynamic image processing method addresses the shortcomings of prior-art fill light and image output that rely on a single light supplementing lamp. By calling a preset light supplementing lamp array, the present application provides a more flexible and diverse fill-light scheme than a single lamp: combining multiple lamps allows the fill-light effect to be adjusted from different angles and with different intensities, avoiding problems such as uneven illumination from a single lamp or the inability to meet the fill-light requirements of complex scenes. This enriches the fill-light possibilities and lays the foundation for subsequently obtaining a high-quality output image, and the matching relationship between the output image required for shooting and the camera parameters and ambient light parameters is determined from the relationship between the camera parameters and the ambient light parameters. Image processing is thus not restricted to one fixed fill-light effect; instead, multiple factors are considered together to establish in advance the association between the ideal output image and each parameter.
Further, after the camera and the light supplementing lamp array are started, the camera parameters and the light supplementing lamp array parameters are dynamically adjusted according to the determined matching relationship until the difference between the output image parameters and the shooting requirement parameters meets a preset threshold. Through this dynamic adjustment mechanism, the fill-light effect and the relevant camera settings can be changed in real time according to the actual situation. The image is no longer processed rigidly under a fixed fill-light condition; the fill light and the shooting parameters are instead adjusted flexibly and continuously toward the image output required for shooting, effectively overcoming the previous limitation that the output image could not be adjusted according to the fill-light effect.
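As an illustrative, non-limiting sketch of this closed-loop idea (not the exact implementation of the invention), the following Python fragment keeps nudging the fill-light intensity and, if necessary, the ISO until the measured image brightness falls within a preset threshold of the required value. The toy imaging model, the parameter names and all numeric values are assumptions chosen only to make the loop runnable.

```python
import numpy as np

def simulate_capture(ambient_lux, fill_intensity, f_number, shutter_s, iso):
    """Toy imaging model: mean image brightness grows with scene light and exposure."""
    scene_light = ambient_lux + fill_intensity
    exposure = shutter_s * iso / (f_number ** 2)
    return float(np.clip(scene_light * exposure * 0.5, 0, 255))  # mean gray level, 8-bit scale

def dynamic_adjust(target_brightness=128.0, threshold=5.0, ambient_lux=20.0):
    f_number, shutter_s, iso, fill = 2.8, 1 / 60, 100.0, 10.0
    brightness = simulate_capture(ambient_lux, fill, f_number, shutter_s, iso)
    for _ in range(100):
        diff = target_brightness - brightness
        if abs(diff) <= threshold:                         # shooting requirement met
            break
        fill = max(0.0, fill + 0.2 * diff)                 # adjust the fill-light array first
        if fill == 0.0 or fill > 200.0:                    # then fall back to camera parameters
            iso = float(np.clip(iso * (1.2 if diff > 0 else 0.8), 100, 6400))
        brightness = simulate_capture(ambient_lux, fill, f_number, shutter_s, iso)
    return {"f": f_number, "t": shutter_s, "ISO": iso, "fill": fill, "brightness": brightness}

print(dynamic_adjust())
```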
In an embodiment of the invention, before the preset light compensating lamp array is called according to the current camera parameters and the ambient light parameters, the method further comprises the steps of obtaining the camera parameters and the ambient light parameters, specifically, obtaining the camera parameters including an aperture coefficient f, a shutter speed t, a light sensitivity ISO, a color correction parameter M and a sharpening parameter L, and obtaining the ambient light parameters including a color temperature parameter and an ambient light illumination parameter in the current environment.
Both the camera parameters and the ambient light parameters influence the image, and the ambient light captured by the camera changes when fill light is applied, so the ambient light parameters must reflect the combined influence of the current natural light and the fill light. Specifically, the aperture coefficient f determines the amount of light entering the lens, the shutter speed t controls how long light is admitted, and ISO determines the sensitivity of the sensor to light; the exposure compensation value (usually expressed in EV, such as +1EV or -0.5EV) is a parameter set by the user or automatically by the camera to adjust exposure. When the exposure compensation is increased (e.g., +1EV), the algorithm brightens the image by opening the aperture, slowing the shutter speed or raising the ISO. This may overexpose highlights and lose detail there, but it brightens dark areas and reveals more detail. The color temperature parameter measures the color characteristics of the light source: light sources of different color temperatures have different spectral distributions, for example light of a low color temperature (e.g., a tungsten filament lamp) is reddish yellow and light of a high color temperature (e.g., a fluorescent lamp) is bluish white, which is what the white balance of the camera corrects.
In an embodiment of the present invention, the step S100 may include the following sub-steps:
Determining a camera parameter value range according to the sensor and lens parameters of the current camera;
And calling the light supplementing lamp array according to the camera parameter value range, and determining the ambient light parameter range for the combination of the light supplementing lamp array and natural light.
Because the camera sensor, such as a CMOS or CCD sensor, is an important component of the camera whose parameters directly affect the output image (for example, the CMOS size and the lens parameters such as lens size and focal length all influence the image to some extent), the camera parameter value range can be determined from the sensor and lens parameters, and this range together with the ambient light forms the necessary imaging conditions.
In an embodiment of the present invention, step S200 may include determining a matching relationship between the output image and the current camera parameters, namely the aperture coefficient f, the shutter speed t, the sensitivity ISO, the color correction parameter M and the sharpening parameter L, together with the ambient light parameters including the color temperature and the ambient illuminance.
As an example, determining the matching relationship between the current camera parameters (aperture coefficient f, shutter speed t, sensitivity ISO, color correction parameter M and sharpening parameter L), the ambient light parameters (color temperature and ambient illuminance) and the output image includes:
determining the matching relation of exposure compensation, color and white balance existing between the current camera parameters and the current ambient light parameters through an exposure compensation algorithm, a color correction algorithm and a white balance algorithm;
wherein, the white balance algorithm is as follows:
Ī_R = (1/N)·Σ_(x,y) I_R(x,y), Ī_G = (1/N)·Σ_(x,y) I_G(x,y), Ī_B = (1/N)·Σ_(x,y) I_B(x,y), K = (Ī_R + Ī_G + Ī_B)/3, g_R = K/Ī_R, g_G = K/Ī_G, g_B = K/Ī_B, I'_R(x,y) = g_R·I_R(x,y), I'_G(x,y) = g_G·I_G(x,y), I'_B(x,y) = g_B·I_B(x,y). In the formula, I(x,y) represents the pixel with coordinates (x,y) in the image, where x and y give the coordinate position of the pixel; since the image is a color image with R, G, B channels, the subscripts R, G, B of I (i.e., I_R, I_G, I_B) denote the pixel values of the red, green and blue channel information respectively; for example, in an RGB image every pixel location (x,y) has a corresponding red, green and blue intensity value. Ī_R, Ī_G, Ī_B are respectively the average pixel values of the red, green and blue channels in the image and reflect the overall brightness level of the image in each color channel; N is the total number of pixels in the image, used to calculate the channel averages, and is a fixed value depending on the resolution of the image; K is the gray value, the target value toward which the color channels are balanced in the algorithm, e.g., 128 for an 8-bit image channel; g_R, g_G, g_B are the gain coefficients of the red, green and blue channels respectively, which under the gray world assumption should ideally make the corrected channel averages equal, i.e., all equal to the hypothetical gray value K;
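A minimal Python sketch of this gray-world white balance, assuming K is taken as the mean of the three channel means; the choice of K and the test values are assumptions for illustration only:

```python
import numpy as np

def gray_world_white_balance(img):
    """img: H x W x 3 uint8 RGB image; returns the white-balanced image."""
    img_f = img.astype(np.float64)
    means = img_f.reshape(-1, 3).mean(axis=0)          # channel means (average R, G, B)
    K = means.mean()                                   # target gray value
    gains = K / means                                  # g_R, g_G, g_B
    return np.clip(img_f * gains, 0, 255).astype(np.uint8)

# Example: a reddish test image becomes neutral on average after balancing.
test = np.zeros((2, 2, 3), dtype=np.uint8)
test[..., 0], test[..., 1], test[..., 2] = 180, 120, 90
print(gray_world_white_balance(test)[0, 0])            # roughly equal R, G, B values (about 130)
```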
the color correction algorithm formula is as follows:
[R', G', B']^T = M·[R, G, B]^T, M = [m_ij], wherein in the formula, T represents the transpose, which converts the row vector into a column vector to facilitate the matrix multiplication operation; M is the color correction matrix, and m_ij is the element in the i-th row and j-th column of the matrix M; [R, G, B]^T is the color vector of a pixel of the original image, and [R', G', B']^T is the color-corrected pixel color vector, containing the red (R'), green (G') and blue (B') channel values after color correction;
Specifically, the color correction matrix M may be set to a 3×3 matrix, where the element M ij (i, j=1, 2, 3) determines the transformation manner of each component of the original color vector. These coefficients are determined from the spectral response characteristics of the camera sensor and the conversion relationship of the target color space (e.g., sRGB);
The color-corrected pixel color vector is [R', G', B']^T, so that [R', G', B']^T = M·[R, G, B]^T. The expanded calculation formulas are as follows:
R' = m_11·R + m_12·G + m_13·B,
G' = m_21·R + m_22·G + m_23·B,
B' = m_31·R + m_32·G + m_33·B.
These coefficients m_ij are determined from the characteristics of the camera sensor and the target color space and are used to adjust the saturation, hue and brightness of the colors.
For example, the coefficient associated with the red channel in the color matrix M, such as m_11, can be increased to raise the red saturation of the image. Specifically, assume that the color vector of one pixel in the original image is [R, G, B]^T = [100, 100, 100]^T. If m_11 is increased from 1 to 1.5 while the other coefficients are unchanged, then by the formula R' = m_11·R + m_12·G + m_13·B the corrected red channel value R' changes from 100 to 1.5×100 + m_12×100 + m_13×100; the red channel value is increased, thereby improving the red saturation.
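A short sketch of applying the 3×3 color correction matrix to every pixel, reproducing the numerical example above; the matrix values are illustrative assumptions rather than calibrated coefficients:

```python
import numpy as np

def apply_color_matrix(img, M):
    """img: H x W x 3 RGB array; M: 3 x 3 color correction matrix."""
    corrected = img.astype(np.float64) @ M.T           # [R', G', B']^T = M [R, G, B]^T per pixel
    return np.clip(corrected, 0, 255)

M = np.array([[1.5, 0.0, 0.0],                         # m_11 = 1.5 boosts red saturation
              [0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0]])
pixel = np.array([[[100.0, 100.0, 100.0]]])            # original color vector of one pixel
print(apply_color_matrix(pixel, M))                    # red channel value becomes 150
```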
The formula of the exposure compensation algorithm is as follows:
E = (t·ISO)/f²,
The exposure E is related to the aperture size f, shutter speed t, and sensitivity ISO;
Let the initial exposure be E_1, with corresponding aperture coefficient f_1, shutter speed t_1 and sensitivity ISO_1; when the light supplementing lamp array is used to supplement light, the compensated exposure is E_2, with corresponding aperture coefficient f_2, shutter speed t_2 and sensitivity ISO_2. According to the relation among aperture, shutter speed and ISO, E_2 = 2^EV·E_1; when only the aperture is changed to compensate, f_2 = f_1/√(2^EV), when only the shutter speed is changed, t_2 = 2^EV·t_1, and when only the ISO is changed, ISO_2 = 2^EV·ISO_1; in practical application the three parameters are adjusted jointly to realize the exposure compensation. The quantities are defined as follows:
In the formula, E is the exposure, a physical quantity that comprehensively measures the degree of exposure of light on the camera sensor and is related to the aperture, shutter speed and sensitivity; f is the aperture coefficient, the square of whose inverse is proportional to the amount of light entering the lens, so the smaller the aperture coefficient (the larger the aperture), the more light (also called luminous flux) enters; t is the shutter speed, representing the length of time during which light reaches the sensor, where a slower shutter speed admits more light; ISO is the sensitivity, representing how sensitive the sensor is to light, a higher ISO value making the sensor more sensitive to light at the same aperture and shutter speed; EV is the exposure value, representing the degree of exposure compensation, a relative quantity where a positive EV means increasing exposure and a negative EV means decreasing exposure.
Specifically, for example, when a landscape photograph is taken, the initial exposure is normal. If it is desired to increase the brightness of the image (increase exposure), EV = +1 is set. If only the shutter speed is changed to compensate, then according to t_2 = 2^EV·t_1 the shutter speed t becomes 2 times the original value, so that more light enters the camera and the image becomes brighter. However, if there is a highlight in the scene (such as a reflection on water), this may cause the highlight to be overexposed and lose detail. Conversely, if EV = -1, the image darkens; highlight details are better preserved, but the dark areas become darker and dead-black regions may even appear.
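A small sketch of this exposure relation, assuming E ∝ t·ISO/f² and that the entire 2^EV factor is applied to the shutter speed, which reproduces the "+1 EV doubles the shutter time" example above; in practice the factor can be split across aperture, shutter speed and ISO:

```python
def exposure(f_number, shutter_s, iso):
    return shutter_s * iso / (f_number ** 2)           # E proportional to t * ISO / f^2

def compensate_shutter_only(shutter_s, ev):
    return shutter_s * (2 ** ev)                       # t_2 = 2^EV * t_1

f, t, iso = 2.8, 1 / 60, 100
e1 = exposure(f, t, iso)
t2 = compensate_shutter_only(t, ev=+1)                 # shutter time doubles
e2 = exposure(f, t2, iso)
print(round(e2 / e1, 3))                               # 2.0, i.e. one stop brighter
```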
In an embodiment of the present invention, step S200 may further include determining, according to the current camera parameter and the ambient light parameter, that a noise reduction matching relationship and a sharpening matching relationship exist between the current camera parameter and the output image through a noise reduction algorithm and an image sharpening algorithm;
Wherein, the noise reduction algorithm formula:
I'(x,y) = (1/((2n+1)·(2m+1)))·Σ_((i,j)∈S) I(i,j),
In the formula, I (x, y) represents pixel values with coordinates of (x, y) in the image, I' (x, y) represents pixel values with coordinates of (x, y) in the image after mean filtering, n and m are respectively used for defining the radius of a filtering window in the x and y directions, and S represents a set of pixel values in the window;
Let the original image be I (x, y), and for each pixel (x, y) in the image, perform mean filtering to reduce noise. Assuming that the filter window size is (2n+1) × (2m+1), where n and m are positive integers representing the radius of the window in the x and y directions;
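A direct sketch of this mean filter, where each output pixel is the average of the (2n+1)×(2m+1) window around it; zero padding at the image borders is an assumption:

```python
import numpy as np

def mean_filter(img, n=1, m=1):
    """img: 2-D grayscale array; n, m: window radii in the two image directions."""
    padded = np.pad(img.astype(np.float64), ((n, n), (m, m)), mode="constant")
    out = np.zeros(img.shape, dtype=np.float64)
    for i in range(-n, n + 1):                         # accumulate every offset in the window S
        for j in range(-m, m + 1):
            out += padded[n + i : n + i + img.shape[0], m + j : m + j + img.shape[1]]
    return out / ((2 * n + 1) * (2 * m + 1))           # divide by the window size

noisy = np.array([[10, 200, 10], [10, 10, 10], [10, 10, 10]], dtype=np.float64)
print(mean_filter(noisy, 1, 1)[0, 1])                  # the noise spike is averaged down
```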
the image sharpening algorithm can adopt a Laplacian operator, and the specific formula is as follows:
I'(x,y) = I(x,y) + k·(L*I)(x,y),
in the formula, I(x,y) represents the pixel value with coordinates (x,y) in the image, I'(x,y) represents the pixel value after sharpening, L is the Laplacian operator matrix, k is the sharpening intensity coefficient, and L*I represents the convolution of the Laplacian operator L with the image I.
For a two-dimensional image I(x,y), the Laplacian (expressed in discrete form) is used to calculate the second derivative of the image, highlighting the edges. The Laplacian operator is typically represented by a 3×3 template, for example:
L = [[0, -1, 0], [-1, 4, -1], [0, -1, 0]]
The sharpened image I'(x,y) can be obtained by convolving the original image with the Laplacian operator, assuming that boundary pixels are handled with appropriate boundary conditions such as zero padding: I'(x,y) = I(x,y) + k·(L*I)(x,y),
where L*I represents the convolution operation of the Laplacian operator L with the image I: the Laplacian template slides over the image, and the elements at corresponding positions are multiplied and summed to obtain the convolution result.
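A sketch of this sharpening step using the template above and zero padding at the borders; the test image and the value of k are assumptions:

```python
import numpy as np
from scipy.signal import convolve2d

LAPLACIAN = np.array([[0, -1, 0],
                      [-1, 4, -1],
                      [0, -1, 0]], dtype=np.float64)

def sharpen(img, k=1.0):
    """img: 2-D grayscale array; k: sharpening intensity coefficient."""
    response = convolve2d(img.astype(np.float64), LAPLACIAN, mode="same", boundary="fill")
    return np.clip(img + k * response, 0, 255)         # I'(x,y) = I(x,y) + k * (L * I)(x,y)

edge = np.tile(np.array([50.0, 50.0, 200.0, 200.0]), (4, 1))   # a vertical step edge
print(sharpen(edge, k=0.5)[1])                          # the dark side dips, the bright side overshoots
```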
For example, assuming a noisy image, the pixel values fluctuate widely in a local area. When mean filtering is used to reduce noise, increasing n and m, i.e., enlarging the filter window, will cause more pixels to participate in the average calculation. For example, when n=m=1, the filter window is 3×3 and includes 9 pixels, and when n=m=2, the filter window is 5×5 and includes 25 pixels. The larger the window, the more pronounced the noise reduction effect, but at the same time blurring the image details. Because in detail parts such as edges, a larger window will average the edge pixels with surrounding pixels, resulting in the edges becoming less sharp.
For a slightly blurred image, increasing the sharpening intensity coefficient k will make the edges and details of the image more pronounced. The edge enhancement effect after sharpening is not obvious when k is smaller, and the difference between the edge pixels and surrounding pixels is amplified when k is increased, so that the edge is clearer. However, if k is too large, it may cause "ringing" of the edges, such as an unnatural halo around the edges of the object. Meanwhile, since the sharpening process amplifies the difference between pixels, noise in the image may also be amplified.
In a specific embodiment, the camera and the light supplementing lamp array are started, and the camera parameters and the light supplementing lamp array parameters are dynamically adjusted according to the matching relationship until the difference between the output image parameters and the shooting requirement parameters meets a preset threshold; the shooting requirement is met through the setting of the above parameters.
In one embodiment, in a portrait shooting scene, an RGB light supplementing array is first utilized. According to the distribution characteristics of skin colors in the color space, skin color rendering is optimized by adjusting the color matrix parameters in the color correction algorithm. Assuming that skin color is largely affected by the red and green channels in the RGB color space, the coefficients corresponding to the red and green channels in the color matrix can be appropriately increased (e.g., set slightly greater than 1) while keeping the blue channel coefficients stable. The overall hue is adjusted using the white balance algorithm. Under different ambient lights (such as an indoor tungsten filament lamp or outdoor sunlight), the color temperature of the current ambient light is calculated and the gain coefficients of each channel are obtained through the gray world algorithm. For example, under a tungsten filament lamp the gain coefficient of the red channel is relatively small, the gain coefficient of the green channel is moderate, and the gain coefficient of the blue channel is larger, so as to correct the yellowish tone, make a white background or white clothing appear naturally white, and let skin color appear naturally healthy in the adjusted color space. For the fill-light brightness, the exposure is controlled by adjusting the brightness of each lamp in the RGB light supplementing array according to the exposure compensation algorithm. For example, when the ambient light is darker, the fill-light intensity corresponding to the aperture coefficient may be increased (e.g., by increasing the influence coefficient of the aperture-related parameter on the fill-light intensity), the fill-light intensity corresponding to the shutter speed may be appropriately increased (e.g., by increasing the influence coefficient of the shutter-related parameter), and the fill-light intensity corresponding to the ISO may be fine-tuned as required (e.g., by adjusting the influence coefficient of the ISO-related parameter) to achieve proper exposure and avoid loss of detail caused by overly dark facial shadows or excessive light.
In terms of color correction, as in the adjustment of the color matrix coefficients described above, the red and green channel values of the skin tone are changed through the formulas R' = m_11·R + m_12·G + m_13·B and G' = m_21·R + m_22·G + m_23·B, altering the hue and saturation of the skin tone so that it looks more natural and pleasing.
The application of the white balance algorithm enables the whole color of the image to be balanced, and color cast caused by the problem of color temperature of ambient light is avoided. For example, the pixel values are adjusted according to the gain factors calculated by the gray world algorithm, so that the white color and the skin color in the image can keep relatively stable color expression under different light sources.
The exposure compensation algorithm controls the brightness of the fill light and influences the overall brightness and detail rendering of the image. If the brightness compensation parameter related to the aperture is increased, more light enters the lens; with the other parameters unchanged the exposure increases according to the exposure equation, the image becomes brighter and facial details are more clearly visible, but if this parameter is increased excessively, highlight areas (such as a reflective part of the forehead) may be overexposed and lose detail.
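A tiny end-to-end sketch of this portrait pipeline, reusing the white balance, color matrix and exposure sketches given earlier; the matrix values and the brightness window of roughly 118 to 138 are illustrative assumptions:

```python
import numpy as np

def portrait_pipeline(frame_rgb):
    balanced = gray_world_white_balance(frame_rgb)      # correct tungsten / daylight color cast
    skin_matrix = np.array([[1.08, 0.0, 0.0],           # red coefficient slightly greater than 1
                            [0.0, 1.05, 0.0],           # green coefficient slightly greater than 1
                            [0.0, 0.0, 1.0]])           # blue kept stable
    toned = apply_color_matrix(balanced, skin_matrix)
    brightness = toned.mean()
    raise_fill = brightness < 118                        # too dark: raise fill intensity / EV
    lower_fill = brightness > 138                        # too bright: lower fill intensity / EV
    return toned, raise_fill, lower_fill
```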
In some examples, in a monitoring camera application scene, RGB light filling arrays are mainly used for filling light in daytime and related algorithms are combined. And calculating a channel gain coefficient according to the color temperature (about 5500-6500K) of natural sunlight by using a white balance algorithm, so as to ensure the natural color of the image. Important target colors in a scene (such as vehicle colors in traffic monitoring and pedestrian clothing colors) are optimized through a color correction algorithm, and a proper color matrix is determined so as to more accurately identify target features in subsequent video analysis.
When the night comes, the infrared lamp light supplementing mode is switched to, and the black-and-white mode shooting is started. At this time, the brightness of the infrared lamp is adjusted by an exposure compensation algorithm, and the exposure amount is controlled by changing the influence coefficients of parameters related to the aperture, shutter speed, and ISO on the compensation brightness. For example, when the illumination is low and a large range needs to be monitored, the light compensation parameters corresponding to the aperture can be increased, the light compensation parameters corresponding to the shutter speed can be properly reduced, so that enough light can enter the lens, and meanwhile, the proper ISO is maintained to reduce noise, so that the outline and the movement track of the target object can be clearly seen on the monitoring picture.
In the black-and-white mode, the sharpening intensity coefficient can be adjusted according to the image sharpening algorithm, the image is sharpened, the edge of the target object is highlighted, and the target detection and recognition algorithm can work better.
The application of the white balance and color correction algorithms keeps the colors of the monitoring image accurate, which helps operators quickly identify the various elements in a scene when manually reviewing the monitoring picture, and at the same time provides accurate data for subsequent color-feature-based video analysis. For example, adjustment of the color matrix parameters in the color correction algorithm affects how different target colors appear in the image, making features such as vehicle colors easier to distinguish. At night, the infrared fill light combined with the exposure compensation algorithm affects the brightness and clarity of the monitoring picture: if the brightness compensation parameter corresponding to the aperture is increased, the exposure increases according to the exposure equation so that more light enters the lens and targets in dark areas become more clearly visible, but if the parameter is adjusted improperly the image may become too bright or too dark, affecting target identification. In the black-and-white mode, the image sharpening algorithm sharpens the image with the sharpening intensity coefficient according to the sharpening formula above, highlighting the edges of the target object so that its outline is clearer, which reduces the misjudgment rate of the target recognition algorithm and improves the accuracy and reliability of the monitoring system.
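A compact sketch of this day/night control flow, again reusing the earlier helper sketches; the 50-lux threshold, the grayscale conversion and the sharpening strength are assumptions and not the patent's exact control logic:

```python
import numpy as np

def process_monitoring_frame(frame_rgb, ambient_lux, color_matrix, k_sharpen=0.8):
    if ambient_lux > 50:                                 # daytime: RGB fill-light path
        balanced = gray_world_white_balance(frame_rgb)   # natural color under sunlight
        return apply_color_matrix(balanced, color_matrix)
    gray = frame_rgb.astype(np.float64).mean(axis=2)     # night: infrared fill light, black-and-white mode
    return sharpen(gray, k=k_sharpen)                    # highlight object edges for detection
```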
In an embodiment of the present invention, referring to fig. 2, there is also disclosed a moving image processing apparatus including:
The light filling and calling module 100 is used for calling a preset light filling lamp array according to the current camera parameters and the ambient light parameters;
The parameter matching module 200 is configured to determine a matching relationship between an output image required to be shot and the camera parameter and the ambient light parameter according to the camera parameter and the ambient light parameter and through a relationship between the image and the optical parameter;
And the parameter adjusting module 300 is used for starting the camera and the light supplementing lamp array, and dynamically adjusting the parameters of the camera and the light supplementing lamp array according to the matching relation until the difference value between the output image parameter and the shooting requirement parameter meets the preset threshold value.
With reference to FIG. 3, in an embodiment of the present invention, the present invention also provides a computer device, the computer device 12 described above being in the form of a general purpose computing device, and the components of the computer device 12 may include, but are not limited to, one or more processors or processing units 16, a system memory 28, and a bus 18 connecting the various system components, including the system memory 28 and the processing units 16.
Bus 18 represents one or more of several types of bus 18 structures, including a memory bus 18 or memory controller, a peripheral bus 18, an accelerated graphics port, a processor, or a local bus 18 using any of a variety of bus 18 architectures. By way of example, and not limitation, such architectures include Industry Standard Architecture (ISA) bus 18, Micro Channel Architecture (MCA) bus 18, enhanced ISA bus 18, Video Electronics Standards Association (VESA) local bus 18, and Peripheral Component Interconnect (PCI) bus 18.
Computer device 12 typically includes a variety of computer system readable media. Such media can be any available media that is accessible by computer device 12 and includes both volatile and nonvolatile media, removable and non-removable media.
The system memory 28 may include computer system readable media in the form of volatile memory, such as Random Access Memory (RAM) 31 and/or cache memory 32. Computer device 12 may further include other removable/non-removable, volatile/nonvolatile computer system storage media. By way of example only, storage system 34 may be used to read from or write to non-removable, nonvolatile magnetic media (commonly referred to as a "hard disk drive"). Although not shown in fig. 3, a magnetic disk drive for reading from and writing to a removable non-volatile magnetic disk (e.g., a "floppy disk"), and an optical disk drive for reading from or writing to a removable non-volatile optical disk such as a CD-ROM, DVD-ROM, or other optical media may be provided. In such cases, each drive may be coupled to bus 18 through one or more data medium interfaces. The memory may include at least one program product having a set (e.g., at least one) of program modules 42, the program modules 42 being configured to carry out the functions of embodiments of the invention.
A program/utility 41 having a set (at least one) of program modules 42 may be stored, for example, in a memory, such program modules 42 including, but not limited to, an operating system, one or more application programs, other program modules 42, and program data, each or some combination of which may include an implementation of a network environment. Program modules 42 generally perform the functions and/or methods of the embodiments described herein.
The computer device 12 may also communicate with one or more external devices 14 (e.g., keyboard, pointing device, display 24, camera, etc.), one or more devices that enable a user to interact with the computer device 12, and/or any devices (e.g., network card, modem, etc.) that enable the computer device 12 to communicate with one or more other computing devices. Such communication may occur through an input/output (I/O) interface 22. Moreover, computer device 12 may also communicate with one or more networks such as a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network such as the Internet, through network adapter 20. As shown, network adapter 21 communicates with other modules of computer device 12 over bus 18. It should be appreciated that although not shown, other hardware and/or software modules may be used in connection with computer device 12, including, but not limited to, microcode, device drivers, redundant processing units 16, external disk drive arrays, RAID systems, tape drives, and data backup storage system 34, among others.
The processing unit 16 executes various functional applications and data processing by running programs stored in the system memory 28, for example, implementing a moving image processing method provided by an embodiment of the present invention.
The processing unit 16 is implemented when executing the program, and calls a preset light supplementing lamp array according to current camera parameters and ambient light parameters, determines a matching relation between an output image required to be shot and the camera parameters and the ambient light parameters according to the camera parameters and the ambient light parameters and through the relation between the image and the optical parameters, starts the camera and the light supplementing lamp array, and dynamically adjusts the camera parameters and the light supplementing lamp array parameters according to the matching relation until the difference value between the output image parameters and the shooting requirement parameters meets a preset threshold value.
In an embodiment of the present application, the present application also provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the moving image processing method as provided in all embodiments of the present application.
The method comprises the steps of calling a preset light supplementing lamp array according to current camera parameters and environment light parameters when a program is executed by a processor, determining a matching relation between an output image required to be shot and the camera parameters and the environment light parameters according to the camera parameters and the environment light parameters and through the relation between the image and the optical parameters, starting the camera and the light supplementing lamp array, and dynamically adjusting the camera parameters and the light supplementing lamp array parameters according to the matching relation until a difference value between the output image parameters and the shooting requirement parameters meets a preset threshold value.
Any combination of one or more computer readable media may be employed. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. The computer readable storage medium can be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples (a non-exhaustive list) of the computer-readable storage medium include an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
The computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, either in baseband or as part of a carrier wave. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Computer program code for carrying out operations of the present invention may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, smalltalk, C ++, python and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any kind of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or may be connected to an external computer (for example, through the Internet using an Internet service provider).
In this specification, each embodiment is described in a progressive manner, and each embodiment is mainly described by differences from other embodiments, and identical and similar parts between the embodiments are all enough to be referred to each other.
It will be apparent to those skilled in the art that embodiments of the present invention may be provided as a method, apparatus, or computer program product. Accordingly, embodiments of the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, embodiments of the invention may take the form of a computer program product on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, etc.) having computer-usable program code embodied therein.
Embodiments of the present invention are described with reference to flowchart illustrations and/or block diagrams of methods, terminal devices (systems), and computer program products according to embodiments of the invention. It will be understood that each flowchart and/or block of the flowchart illustrations and/or block diagrams, and combinations of flowcharts and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing terminal device to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing terminal device, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
While preferred embodiments of the present invention have been described, additional variations and modifications in those embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. It is therefore intended that the following claims be interpreted as including the preferred embodiment and all such alterations and modifications as fall within the scope of the embodiments of the invention.
Finally, it is further noted that relational terms such as first and second are used herein solely to distinguish one entity or action from another and do not necessarily require or imply any actual such relationship or order between such entities or actions. Moreover, the terms "comprises", "comprising" or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article or terminal device that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article or terminal device. Without further limitation, an element preceded by the phrase "comprising a ..." does not exclude the presence of other like elements in the process, method, article or terminal device comprising the element.
The above description of the dynamic image processing method, apparatus, electronic device and storage medium provided by the present invention has been provided in detail, and specific examples are used herein to illustrate the principles and embodiments of the present invention, and the description of the above examples is only for aiding in understanding the method and core concept of the present invention, and meanwhile, for those skilled in the art, according to the concept of the present invention, there are variations in the specific embodiments and application ranges, so the disclosure should not be construed as limiting the present invention.
Claims (10)
1. A moving image processing method, characterized by comprising:
S1, calling a preset light supplementing lamp array according to current camera parameters and ambient light parameters;
S2, determining a matching relation between an output image required to be shot and the camera parameter and the ambient light parameter according to the camera parameter and the ambient light parameter and through the relation between the image and the optical parameter;
and S3, starting the camera and the light supplementing lamp array, and dynamically adjusting the camera parameters and the light supplementing lamp array parameters according to the matching relation until the difference value between the output image parameters and the shooting requirement parameters meets a preset threshold value.
2. The method according to claim 1, wherein before the calling the preset light compensating lamp array according to the current camera parameter and the ambient light parameter, the method further comprises:
Camera parameters and ambient light parameters are acquired.
3. The moving image processing method according to claim 2, wherein the acquiring camera parameters and ambient light parameters includes:
the method comprises the steps of obtaining camera parameters including an aperture coefficient f, a shutter speed t, a sensitivity ISO, a color correction parameter M and a sharpening parameter L, and obtaining an ambient light parameter including a color temperature parameter and an ambient illuminance parameter in the current environment.
4. The method according to claim 1, wherein the calling a preset light supplement lamp array according to the current camera parameters and the ambient light parameters comprises:
Determining a camera parameter value range according to the sensor and lens parameters of the current camera;
And calling the light supplementing lamp array according to the parameter value range of the camera, and determining the environment light parameter range of the combination of the environment light supplementing lamp array and the natural light.
5. The method according to claim 1, wherein the determining the matching relationship between the output image required for photographing and the camera parameter and the ambient light parameter according to the camera parameter and the ambient light parameter and by the relationship between the image and the optical parameter comprises:
determining a matching relationship between the output image and the current camera parameters, namely the aperture coefficient f, the shutter speed t, the sensitivity ISO, the color correction parameter M and the sharpening parameter L, together with the ambient light parameters including the color temperature and the ambient illuminance.
6. The moving image processing method according to claim 5, wherein the determining of the matching relationship between the current camera parameters (aperture coefficient f, shutter speed t, sensitivity ISO, color correction parameter M and sharpening parameter L), the ambient light parameters (color temperature and ambient illuminance) and the output image includes:
determining the matching relation of exposure compensation, color and white balance existing between the current camera parameters and the current ambient light parameters through an exposure compensation algorithm, a color correction algorithm and a white balance algorithm;
wherein, the white balance algorithm is as follows:
Ī_R = (1/N)·Σ_(x,y) I_R(x,y), Ī_G = (1/N)·Σ_(x,y) I_G(x,y), Ī_B = (1/N)·Σ_(x,y) I_B(x,y), K = (Ī_R + Ī_G + Ī_B)/3, g_R = K/Ī_R, g_G = K/Ī_G, g_B = K/Ī_B, I'_R(x,y) = g_R·I_R(x,y), I'_G(x,y) = g_G·I_G(x,y), I'_B(x,y) = g_B·I_B(x,y); in the formula, I(x,y) represents the pixel with coordinates (x,y) in the image, the subscripts R, G, B of I denote the pixel values of the red, green and blue channel information respectively, Ī_R, Ī_G, Ī_B are respectively the average pixel values of the red, green and blue channels in the image, N is the total number of pixels in the image, K is the gray value, and g_R, g_G, g_B are the gain coefficients of the red, green and blue channels respectively;
the color correction algorithm formula is as follows:
[R', G', B']^T = M·[R, G, B]^T, M = [m_ij], where T represents the transpose, M is the color correction matrix, and m_ij is the element in the i-th row and j-th column of the matrix M; [R, G, B]^T is the color vector of a pixel of the original image, and [R', G', B']^T is the color-corrected pixel color vector;
the formula of the exposure compensation algorithm is as follows:
E = (t·ISO)/f², E' = 2^EV·E; in the formula, E is the exposure quantity, f the aperture coefficient, t the shutter speed, ISO the sensitivity, EV the exposure compensation value, and E' the exposure after compensation.
7. The moving image processing method according to claim 5, wherein the determining of the matching relationship between the current camera parameters (aperture coefficient f, shutter speed t, sensitivity ISO, color correction parameter M and sharpening parameter L), the ambient light parameters (color temperature and ambient illuminance) and the output image further includes:
According to the current camera parameters and the ambient light parameters, determining that a noise reduction matching relationship and a sharpening matching relationship exist between the current camera parameters and the output image through a noise reduction algorithm and an image sharpening algorithm;
Wherein, the noise reduction algorithm formula:
I'(x,y) = (1/((2n+1)·(2m+1)))·Σ_((i,j)∈S) I(i,j),
In the formula, I(x,y) represents the pixel value with coordinates (x,y) in the image, I'(x,y) represents the pixel value with coordinates (x,y) in the image after mean filtering, n and m are respectively used to define the radius of the filtering window in the x and y directions, and S represents the set of pixels in the window;
The image sharpening algorithm has the specific formula:
$$I'(x,y) = I(x,y) + k\,(L * I)(x,y)$$
in the formula, I(x, y) represents the pixel value with coordinates (x, y) in the image, I'(x, y) represents the pixel value after sharpening, L is the Laplacian operator matrix, k is the sharpening intensity coefficient, and L * I represents the convolution of the Laplacian operator L with the image I.
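As a sketch of the formulas in claim 7, the following NumPy code implements the (2n+1)×(2m+1) mean filter and the Laplacian sharpening I' = I + k·(L∗I) on a single-channel float image. The choice of the 4-neighbour Laplacian kernel and the edge-padding mode are assumptions the claim does not fix.

```python
import numpy as np

def mean_filter(img, n=1, m=1):
    """Mean filter over a (2n+1) x (2m+1) window S centred on each pixel."""
    H, W = img.shape
    padded = np.pad(img, ((n, n), (m, m)), mode="edge")
    out = np.zeros_like(img, dtype=float)
    for dy in range(-n, n + 1):
        for dx in range(-m, m + 1):
            out += padded[n + dy:n + dy + H, m + dx:m + dx + W]   # shifted copies of the window
    return out / ((2 * n + 1) * (2 * m + 1))

def laplacian_sharpen(img, k=0.5):
    """Sharpen: I' = I + k * (L * I), using the 4-neighbour Laplacian kernel."""
    H, W = img.shape
    L = np.array([[0, -1, 0],
                  [-1, 4, -1],
                  [0, -1, 0]], dtype=float)
    padded = np.pad(img, 1, mode="edge")
    lap = np.zeros((H, W), dtype=float)
    for dy in range(3):                     # L is symmetric, so correlation equals convolution
        for dx in range(3):
            lap += L[dy, dx] * padded[dy:dy + H, dx:dx + W]
    return np.clip(img + k * lap, 0.0, 1.0)

if __name__ == "__main__":
    gray = np.random.rand(8, 8)
    print(laplacian_sharpen(mean_filter(gray, n=1, m=1), k=0.8).shape)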
8. A moving image processing apparatus, comprising:
the light supplementing and calling module is used for calling a preset light supplementing lamp array according to the current camera parameters and the environment light parameters;
the parameter matching module is used for determining the matching relation between the output image required to be shot and the camera parameter and the ambient light parameter according to the camera parameter and the ambient light parameter and through the relation between the image and the optical parameter;
And the parameter adjusting module is used for starting the camera and the light supplementing lamp array, and dynamically adjusting the parameters of the camera and the light supplementing lamp array according to the matching relation until the difference value between the output image parameter and the shooting requirement parameter meets a preset threshold value.
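A minimal sketch of the control loop implied by the parameter adjusting module of claim 8, assuming hypothetical `camera` and `fill_light_array` objects, metric names and update methods; the patent does not specify these interfaces.

```python
def adjust_until_matched(camera, fill_light_array, target_params, thresholds, max_iters=50):
    """Iteratively adjust camera and fill-light parameters until every measured
    output-image parameter is within its threshold of the shooting requirement.
    All object and method names below are illustrative placeholders."""
    camera.start()
    fill_light_array.start()
    for _ in range(max_iters):
        measured = camera.capture_and_measure()            # e.g. brightness, colour temperature
        diff = {k: abs(measured[k] - target_params[k]) for k in target_params}
        if all(diff[k] <= thresholds[k] for k in diff):
            break                                          # shooting requirement met
        camera.update_parameters(diff)                     # e.g. aperture, shutter, ISO
        fill_light_array.update_parameters(diff)           # e.g. per-lamp brightness, colour temperature
    return camera.capture()
```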
9. An electronic device, comprising:
one or more processors; and
one or more machine readable media having instructions stored thereon that, when executed by the one or more processors, cause the electronic device to perform the dynamic image processing method of any one of claims 1 to 7.
10. A computer-readable storage medium, characterized in that a computer program is stored therein which, when executed, causes a processor to perform the moving image processing method according to any one of claims 1 to 7.
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN202411986650.8A CN119767129A (en) | 2024-12-31 | 2024-12-31 | A dynamic image processing method, device, electronic device and storage medium |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| CN119767129A true CN119767129A (en) | 2025-04-04 |
Family
ID=95183211
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CN202411986650.8A (Pending, CN119767129A) | A dynamic image processing method, device, electronic device and storage medium | 2024-12-31 | 2024-12-31 |
Country Status (1)
| Country | Link |
|---|---|
| CN (1) | CN119767129A (en) |
Citations (7)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN107846554A (en) * | 2017-10-31 | 2018-03-27 | 努比亚技术有限公司 | A kind of image processing method, terminal and computer-readable recording medium |
| CN108174172A (en) * | 2017-12-25 | 2018-06-15 | 广东欧珀移动通信有限公司 | Photographing method and apparatus, computer-readable storage medium, and computer device |
| CN109151257A (en) * | 2018-09-20 | 2019-01-04 | 浙江大华技术股份有限公司 | A kind of method and video camera of image procossing |
| CN110809120A (en) * | 2019-11-01 | 2020-02-18 | 深圳创维-Rgb电子有限公司 | Light supplementing method for shot picture, smart television and computer readable storage medium |
| CN113411507A (en) * | 2021-05-10 | 2021-09-17 | 深圳数联天下智能科技有限公司 | Skin measurement image acquisition method, device, equipment and storage medium |
| WO2022174653A1 (en) * | 2021-02-22 | 2022-08-25 | 华为技术有限公司 | Light adjusting method and apparatus |
| CN118890553A (en) * | 2024-08-20 | 2024-11-01 | 王赳骅 | A control system for camera fill light parameters |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | PB01 | Publication | |
| | SE01 | Entry into force of request for substantive examination | |