
CN115953332A - Dynamic image fusion brightness adjustment method and system and electronic equipment - Google Patents

Dynamic image fusion brightness adjustment method and system and electronic equipment

Info

Publication number
CN115953332A
CN115953332A (application CN202310246425.XA)
Authority
CN
China
Prior art keywords
img2
brightness
images img1
points
feature points
Prior art date
Legal status
Granted
Application number
CN202310246425.XA
Other languages
Chinese (zh)
Other versions
CN115953332B
Inventor
宋小民
刘征
姜春桐
Current Assignee
Sichuan Guochuang Innovation Vision Ultra HD Video Technology Co.,Ltd.
Original Assignee
Sichuan Xinshi Chuangwei Ultra High Definition Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Sichuan Xinshi Chuangwei Ultra High Definition Technology Co ltd filed Critical Sichuan Xinshi Chuangwei Ultra High Definition Technology Co ltd
Priority to CN202310246425.XA priority Critical patent/CN115953332B/en
Publication of CN115953332A publication Critical patent/CN115953332A/en
Application granted granted Critical
Publication of CN115953332B publication Critical patent/CN115953332B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Abstract

The invention provides a brightness adjustment method, system and electronic device for dynamic image fusion. Images img1 and img2, which are the same size and contain an overlapping area, are preprocessed; key feature points are then extracted from the preprocessed images; and the brightness of image img2 is adjusted according to the average brightness ratio of the key feature points in images img1 and img2. In this scheme, the brightness at a feature point's position is obtained by bilinear interpolation of the brightnesses of its four adjacent pixels, and the brightness of a sequence of images is unified by calculating the brightness ratio at the feature points, improving the accuracy of the computed ratio. Meanwhile, a two-stage preprocessing scheme of image graying followed by filtering with a large kernel reduces the amount of computation, while an innovation in the filtering process eliminates the influence of outlier points, preserving the preprocessing quality and striking a better balance between effect and efficiency.

Description

Dynamic image fusion brightness adjustment method and system and electronic equipment
Technical Field
The present invention relates to the field of image processing technologies, and in particular, to a brightness adjustment method, system, and electronic device for dynamic image fusion.
Background
As a core technology of video and image processing, image stitching connects two or more small-field-of-view images with overlapping areas into a single large-field-of-view image. It is widely applied in motion tracking and analysis, large-scene monitoring, virtual reality, driver assistance, broadcasting and other fields. Image registration and image fusion are the two key technologies of image stitching. During image fusion, shooting conditions or external factors may cause a large brightness difference between the two images to be fused. A common method is to sum the brightness of all pixels in the overlapping (to-be-fused) part of each image, take the ratio of the brightness sums of the reference image and the observed image, and then adjust the brightness of the observed image by this ratio. For a moving scene, however, if a moving object lies just outside the overlapping region of the reference image but inside the overlapping region of the observed image, the fused parts of the two images differ, the brightness ratio computed by the conventional method is inaccurate, and the image fusion effect suffers.
To address the instability of the image-fusion brightness effect of the traditional method in a moving scene, this patent proposes an improved brightness adjustment method for image fusion in moving scenes: the brightness at a feature point's position is obtained by bilinear interpolation of the brightnesses of its four adjacent pixels, and the brightness of a sequence of images is unified by calculating the brightness ratio at the feature points, improving the accuracy of the brightness ratio and yielding an optimal image stitching result.
Therefore, it is necessary to provide a method, a system and an electronic device for adjusting brightness in dynamic image fusion to solve the above technical problems.
Disclosure of Invention
In order to solve the above technical problems, the present invention provides a brightness adjustment method, system and electronic device for dynamic image fusion.
The invention provides a brightness adjustment method for dynamic image fusion, which comprises the following steps:
preprocessing images img1 and img2, wherein the images img1 and img2 are the same in size and comprise overlapping areas;
extracting key characteristic points in the preprocessed images img1 and img2;
and adjusting the brightness of the image img2 according to the average brightness ratio of the key feature points in the images img1 and img2.
Preferably, the preprocessing of the images img1 and img2 includes:
graying the images img1 and img2 to obtain grayscale images img1 and img2;
the grayscale images img1 and img2 are noise-reduced.
Preferably, the extracting key feature points in the preprocessed images img1 and img2 includes:
extracting feature point information of the preprocessed images img1 and img2, wherein the feature point information comprises positions, scales and directions;
carrying out feature description on the feature points;
and matching key feature points in the images img1 and img2, and removing the feature points with wrong matching.
Preferably, the extracting the feature point information of the preprocessed images img1 and img2 includes:
establishing a Gaussian pyramid L(x, y, σ) and a difference-of-Gaussian pyramid D(x, y, σ) using a Gaussian function G(x, y, σ);
Searching local extreme points through a Gaussian difference pyramid;
fitting discrete characteristic points;
the principal direction of each feature point is calculated.
Preferably, the characterizing the feature points includes:
the feature description includes feature points and surrounding pixel points that contribute to them.
Preferably, the matching key feature points in the images img1 and img2 include:
calculating Euclidean distances of feature vectors of feature points of the images img1 and img2 in 128-dimensional direction;
judging the similarity of the corresponding feature points of the images img1 and img2 according to the Euclidean distance;
and matching the characteristic points meeting the Euclidean distance condition.
Preferably, the mismatched feature points are removed by a robust parameter estimation method.
Preferably, the adjusting the brightness of the image img2 according to the average brightness ratio of the key feature points in the images img1 and img2 includes:
obtaining the positions and the brightness of four adjacent pixel points according to the positions of the characteristic points;
screening the brightness of four adjacent pixel points;
calculating the position brightness of the feature points according to the screening result;
calculating the average value of the brightness ratios of all the characteristic points of the images img1 and img2;
the brightness of the image img2 is adjusted using the average value.
A dynamic image fused brightness adjustment system, comprising:
the preprocessing module is used for preprocessing the images img1 and img2;
the feature extraction module is used for extracting key feature points in the preprocessed images img1 and img2;
and the brightness adjusting module is used for adjusting the brightness of the image img2 according to the average brightness ratio of the key feature points in the images img1 and img2.
An electronic device comprising a memory and a processor, wherein the memory stores a computer program operable on the processor, and wherein the processor executes the computer program to implement the steps of the method.
Compared with the related art, the brightness adjusting method, the system and the electronic equipment for dynamic image fusion provided by the invention have the following beneficial effects:
1. The images img1 and img2, which are the same size and include an overlapping region, are preprocessed; key feature points are then extracted from the preprocessed images; and finally the brightness of image img2 is adjusted according to the average brightness ratio of the key feature points in images img1 and img2. In this scheme, the brightness at a feature point's position is obtained by bilinear interpolation of the brightnesses of its four adjacent pixels, the brightness of a sequence of images is unified by calculating the brightness ratio at the feature points, the accuracy of the brightness ratio is improved, and an optimal image stitching result is obtained.
2. The invention adopts a two-stage preprocessing scheme: image graying and filtering with a large kernel reduce the amount of computation, while the innovation in the filtering process eliminates the influence of outlier points. This preserves the preprocessing quality, strikes a better balance between effect and efficiency, and overcomes the shortcoming of prior-art color-image preprocessing, which cannot reconcile preprocessing quality with computational complexity.
Drawings
FIG. 1 is a schematic flow chart of a brightness adjustment method for dynamic image fusion according to the present invention;
FIG. 2 is a schematic overall flow chart of a brightness adjustment method for dynamic image fusion according to the present invention;
FIG. 3 is a schematic diagram of a preprocessing flow of a brightness adjustment method for dynamic image fusion according to the present invention;
FIG. 4 is a schematic diagram of a feature extraction flow of a brightness adjustment method for dynamic image fusion according to the present invention;
FIG. 5 is a schematic diagram illustrating a process of adjusting image brightness according to the brightness adjustment method for dynamic image fusion disclosed in the present invention;
FIG. 6 is a schematic diagram illustrating distribution of feature points and neighboring pixels in a brightness adjustment method for dynamic image fusion according to the present invention;
FIG. 7 is a schematic structural diagram of a system for adjusting brightness in dynamic image fusion according to the present invention;
fig. 8 is a schematic structural diagram of an electronic device disclosed in the present invention.
Detailed Description
The invention is further described below with reference to the drawings and the embodiments.
In this embodiment, as shown in fig. 1 and fig. 2, the method for adjusting brightness of dynamic image fusion includes:
step S100: the images img1 and img2 are preprocessed by a preprocessing module, the images img1 and img2 being the same size and including an overlapping area.
Specifically, for example, both images img1 and img2 are set to be 8000 (W) × 6000 (H) 8K color digital images with RGB color channels, i.e. img1(x, y, z) and img2(x, y, z), where x = 8000, y = 6000 and z = 3.
The steps of pre-processing the images img1 and img2 are shown in fig. 3 and comprise the following operating steps:
step S101: graying the images img1 and img2 to obtain grayscale images img1 and img2;
specifically, in consideration of the independence of the method on color information and the reduction of the amount of computation, it is necessary to convert the color images img1 (x, y, z) and img2 (x, y, z) into the grayscale images img1 (x, y) and img2 (x, y).
Step S102: the grayscale images img1 and img2 are noise-reduced.
Specifically, to further improve the accuracy of the calculation result, the grayscale images obtained in step S101 are denoised: mean filtering is performed with a 7 × 7 filter kernel, and to avoid the influence of abnormal data (such as dead pixels and isolated points), the largest and smallest 10% of the samples in each window are discarded before averaging to form the filtering result.
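The trimmed mean filtering described above might be sketched as follows; the edge-padding mode and the rounding of the 10% trim count (4 of the 49 samples in a 7 × 7 window) are assumptions not stated in the patent:

```python
import numpy as np

def trimmed_mean_filter(gray, k=7, trim=0.10):
    """Mean-filter `gray` with a k x k kernel, but first drop the largest
    and smallest `trim` fraction of samples in each window so isolated
    bad/hot pixels cannot skew the average.
    """
    pad = k // 2
    padded = np.pad(gray.astype(np.float64), pad, mode="edge")
    # All k x k windows, one per output pixel.
    win = np.lib.stride_tricks.sliding_window_view(padded, (k, k))
    flat = np.sort(win.reshape(*gray.shape, k * k), axis=-1)
    n_trim = int(k * k * trim)          # 4 of the 49 samples for k = 7
    kept = flat[..., n_trim:k * k - n_trim]
    return kept.mean(axis=-1)
```

A single outlier in an otherwise smooth neighbourhood falls into the trimmed tail and leaves the filter output unchanged, which is the stated purpose of the trimming step.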
step S200: extracting key feature points in the preprocessed images img1 and img2 by using a feature extraction module;
specifically, the steps are as shown in fig. 4, and include the following steps:
step S201: extracting feature point information of the preprocessed images img1 and img2, wherein the feature point information comprises a position, a scale and a direction;
specifically, the method for extracting the feature points includes the following steps:
Firstly, a Gaussian pyramid L(x, y, σ) and a difference-of-Gaussian pyramid D(x, y, σ) are established using a Gaussian function G(x, y, σ):

L(x, y, σ) = G(x, y, σ) * I(x, y)

D(x, y, σ) = (G(x, y, kσ) − G(x, y, σ)) * I(x, y) = L(x, y, kσ) − L(x, y, σ)

G(x, y, σ) = (1 / (2πσ²)) · exp(−((x − m/2)² + (y − n/2)²) / (2σ²))

where m and n are the dimensions of the Gaussian template, (x, y) is the position of an image pixel, σ is the scale-space factor, k is the scale multiplier between adjacent levels, I(x, y) is the input image, and * denotes convolution.
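A single-octave sketch of the Gaussian and difference-of-Gaussian pyramids, assuming a separable truncated Gaussian kernel and edge padding (the patent does not specify the octave structure or kernel truncation):

```python
import numpy as np

def gaussian_kernel1d(sigma, radius=None):
    """Normalized 1-D Gaussian kernel, truncated at ~3 sigma (assumed)."""
    if radius is None:
        radius = int(3 * sigma + 0.5)
    x = np.arange(-radius, radius + 1)
    k = np.exp(-x**2 / (2 * sigma**2))
    return k / k.sum()

def gaussian_blur(img, sigma):
    """Separable 2-D Gaussian blur with edge padding: L = G * I."""
    k = gaussian_kernel1d(sigma)
    pad = len(k) // 2
    blur_1d = lambda v: np.convolve(np.pad(v, pad, mode="edge"), k, mode="valid")
    tmp = np.apply_along_axis(blur_1d, 1, img.astype(np.float64))  # rows
    return np.apply_along_axis(blur_1d, 0, tmp)                    # columns

def dog_levels(img, sigma0=1.6, k=2**0.5, n=4):
    """One octave: n Gaussian levels L(sigma0 * k^i) and their n-1
    differences D = L(k*sigma) - L(sigma)."""
    L = [gaussian_blur(img, sigma0 * k**i) for i in range(n)]
    return [L[i + 1] - L[i] for i in range(n - 1)]
```

On a constant image every Gaussian level is unchanged and every DoG level is identically zero, a quick sanity check for the normalization of the kernel.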
Secondly, local extreme points of D(x, y, σ) are searched using the difference-of-Gaussian pyramid: each sampling point in the image is compared with its surrounding pixels, and if the sampling point is an extremum, it is a feature point of the image at that scale. The surrounding pixels comprise the eight adjacent pixels at the same scale and the nine pixels at the corresponding position in each of the two adjacent scales.
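The 26-neighbour extremum test described above can be sketched as follows; requiring a strict inequality against the second-best value (so that flat regions are rejected) is an assumed detail:

```python
import numpy as np

def is_local_extremum(dog, s, y, x):
    """Check whether dog[s][y, x] is a strict extremum among its 26
    neighbours: 8 in its own DoG level plus 9 in each adjacent level.
    `dog` is a list of same-sized 2-D DoG arrays; border points and the
    first/last scale are not candidates.
    """
    v = dog[s][y, x]
    # 3 x 3 x 3 cube of the point and its 26 neighbours.
    cube = np.stack([d[y - 1:y + 2, x - 1:x + 2] for d in dog[s - 1:s + 2]])
    second_hi = np.partition(cube.ravel(), -2)[-2]
    second_lo = np.partition(cube.ravel(), 1)[1]
    return (v == cube.max() and v > second_hi) or \
           (v == cube.min() and v < second_lo)
```

A point equal to all its neighbours fails both branches, so uniform regions produce no candidates.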
And then, three-dimensional quadratic function fitting is applied to the discrete feature points to obtain the accurate positions and scales of the feature points, a 2 × 2 Hessian matrix is calculated to obtain the principal curvatures at the peak point in the edge direction, and edge responses are removed, with T taken as 1.2:

H = [ D_xx  D_xy ; D_xy  D_yy ],  Tr(H)² / Det(H) < (T + 1)² / T

where D_xx denotes twice-successive derivation in the x direction of the image at a given scale in the difference-of-Gaussian pyramid, D_yy denotes twice-successive derivation in the y direction, and D_xy denotes one derivation in each of the x and y directions.
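The edge-response test can be sketched with central finite differences for D_xx, D_yy and D_xy; the discretisation is an assumed detail, since the patent gives only the second-derivative terms and T = 1.2:

```python
import numpy as np

def passes_edge_test(D, y, x, T=1.2):
    """Reject edge-like DoG responses via the 2x2 Hessian: keep the point
    only if Det(H) > 0 and Tr(H)^2 / Det(H) < (T + 1)^2 / T.
    """
    dxx = D[y, x + 1] - 2 * D[y, x] + D[y, x - 1]
    dyy = D[y + 1, x] - 2 * D[y, x] + D[y - 1, x]
    dxy = (D[y + 1, x + 1] - D[y + 1, x - 1]
           - D[y - 1, x + 1] + D[y - 1, x - 1]) / 4.0
    tr, det = dxx + dyy, dxx * dyy - dxy * dxy
    return det > 0 and tr * tr / det < (T + 1) ** 2 / T
```

An isotropic blob has equal principal curvatures and passes the test, while a ridge (large curvature in one direction only) has Det(H) close to zero and is rejected.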
And finally, a direction is calculated for each feature point: the gradients and orientations of the pixels in the feature point's neighborhood are counted with a histogram, and the maximum of the histogram is taken as the feature point's principal direction.
Step S202: carrying out feature description on the feature points;
Specifically, the feature points of the images img1 and img2 are described: each feature point is described by a group of vectors that do not change with viewing angle and the like.
In addition, the feature description covers the feature point and the surrounding pixels that contribute to it: gradient information in eight directions is computed over a 4 × 4 grid within the feature point's scale space, giving a 4 × 4 × 8 = 128-dimensional vector representation.
Step S203: matching key feature points in the images img1 and img2, and eliminating feature points with wrong matching, specifically comprising the following operation steps:
First, the Euclidean distance between the 128-dimensional feature vectors of feature points in the images img1 and img2 is calculated:

d = sqrt( Σ_{i=1}^{j} (r_i − s_i)² ),  j = 128

where j is the dimension, r_i and s_i are the i-th components of the two feature descriptors, and d is the Euclidean distance.
Secondly, the similarity of two feature points is judged by the Euclidean distance: for a feature point A in img1, find the feature point B nearest to it in img2 and the second-nearest feature point C; if the ratio of the two Euclidean distances is less than 0.6, A and B are matched successfully.
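The nearest/second-nearest ratio test with the 0.6 threshold might look like this as a brute-force sketch (a real implementation would typically use a k-d tree or similar index rather than exhaustive search):

```python
import numpy as np

def ratio_test_matches(desc1, desc2, ratio=0.6):
    """Match descriptors between two images: for each row of desc1, find
    its nearest and second-nearest neighbours in desc2 by Euclidean
    distance and accept the pair only if d_nearest / d_second < ratio.
    Returns a list of (index_in_desc1, index_in_desc2) pairs.
    """
    matches = []
    for i, a in enumerate(desc1):
        d = np.linalg.norm(desc2 - a, axis=1)  # distances to all of desc2
        j, j2 = np.argsort(d)[:2]
        if d[j] < ratio * d[j2]:
            matches.append((i, int(j)))
    return matches
```

Ambiguous points whose two best candidates are nearly equidistant fail the ratio test and are discarded, which is exactly what the 0.6 threshold enforces.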
And finally, feature points mismatched under the Euclidean distance criterion are screened out by a robust estimation method.
Step S300: and adjusting the brightness of the image img2 according to the average brightness ratio of the key feature points in the images img1 and img2.
Specifically, as shown in fig. 5, the method includes the following steps:
step S301: obtaining the positions and the brightness of four adjacent pixel points according to the positions of the characteristic points;
Specifically, as shown in fig. 6, the positions and brightnesses of the four adjacent pixels are obtained from the position of the feature point. Because feature-point accuracy must be preserved during matching, feature-point coordinates are usually non-integer, so the brightness of a feature point cannot be read directly. In this scheme, the brightnesses f(Q11), f(Q12), f(Q21) and f(Q22) of the four pixels Q11(x1, y1), Q12(x1, y2), Q21(x2, y1) and Q22(x2, y2) adjacent to the feature point P(x, y) are used to calculate the brightness of the feature point.
Step S302: screening the brightness of four adjacent pixel points;
Specifically, the brightnesses f(Q11), f(Q12), f(Q21) and f(Q22) of the adjacent pixels are screened: the difference between each brightness and the average of the four is examined, and any point whose brightness deviates from the average by more than 15% is regarded as a point with large fluctuation.
Step S303: calculating the position brightness of the feature points according to the screening result;
Specifically, the brightness of the feature point is calculated according to the screening result: if there is exactly one deviating point and it is the pixel closest to the feature point, the brightness of the feature point is taken from that point;
in any other case, the brightness at the feature point's position is calculated by bilinear interpolation:

f(P) = [ (x2 − x)(y2 − y)·f(Q11) + (x − x1)(y2 − y)·f(Q21) + (x2 − x)(y − y1)·f(Q12) + (x − x1)(y − y1)·f(Q22) ] / [ (x2 − x1)(y2 − y1) ]
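Steps S301 to S303 (reading the four neighbours, screening by the 15% rule, and interpolating) can be sketched together as follows; treating every case other than the single-nearest-outlier case with plain bilinear interpolation follows the rule stated above:

```python
import numpy as np

def feature_luminance(gray, px, py):
    """Luminance at a sub-pixel feature point P(px, py) from its four
    neighbours Q11..Q22.  Screening rule: if exactly one neighbour
    deviates more than 15% from the four-point mean AND it is the
    neighbour closest to P, its value is used directly; otherwise
    standard bilinear interpolation is applied.
    """
    x1, y1 = int(np.floor(px)), int(np.floor(py))
    x2, y2 = x1 + 1, y1 + 1
    pts = [(x1, y1), (x1, y2), (x2, y1), (x2, y2)]   # Q11, Q12, Q21, Q22
    f = np.array([gray[y, x] for x, y in pts], dtype=np.float64)
    mean = f.mean()
    outliers = [i for i in range(4) if abs(f[i] - mean) > 0.15 * mean]
    nearest = int(np.argmin([(x - px) ** 2 + (y - py) ** 2 for x, y in pts]))
    if len(outliers) == 1 and outliers[0] == nearest:
        return f[nearest]
    wx, wy = px - x1, py - y1                         # bilinear weights
    return ((1 - wx) * (1 - wy) * f[0] + (1 - wx) * wy * f[1]
            + wx * (1 - wy) * f[2] + wx * wy * f[3])
```

The indexing convention gray[row, column] = gray[y, x] is assumed; callers must ensure x2 and y2 stay inside the image.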
Step S304: calculating the average value of the brightness ratios of all the characteristic points of the images img1 and img2;
Specifically, the brightness ratio f1(Pi) / f2(Pi) is calculated at every matched feature point Pi of the images img1 and img2; the ratios are sorted, the largest and smallest 10% are removed, and the mean of the remaining values is taken as the adjusted brightness ratio R:

R = (1 / N) Σ_{i=1}^{N} f1(Pi) / f2(Pi)

where f1 denotes the brightness of img1, f2 denotes the brightness of img2, and N is the number of ratios retained after trimming.
Step S305: adjusting the brightness of the image img2 by using the average value;
Specifically, the brightness of img2 is adjusted using the obtained average brightness ratio R:

img2′(x, y) = R · img2(x, y)
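Steps S304 and S305, the trimmed average of the per-point ratios and the final scaling of img2, might be sketched as follows; clipping the scaled result to the valid pixel range is an assumed detail:

```python
import numpy as np

def brightness_ratio(f1, f2, trim=0.10):
    """Average the per-feature-point luminance ratios f1_i / f2_i after
    sorting and discarding the largest and smallest `trim` fraction,
    returning the adjusted ratio R.
    """
    r = np.sort(np.asarray(f1, dtype=np.float64)
                / np.asarray(f2, dtype=np.float64))
    n_trim = int(len(r) * trim)
    kept = r[n_trim:len(r) - n_trim] if n_trim else r
    return kept.mean()

def adjust_brightness(img2, R, max_val=255):
    """Scale img2 by R; clipping to [0, max_val] is an assumed detail."""
    return np.clip(img2.astype(np.float64) * R, 0, max_val)
```

A single wildly wrong match (for example a ratio of 100 among ratios near 2) falls in the trimmed tail and does not perturb R, mirroring the outlier handling in the preprocessing stage.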
The method is simple, efficient and general: it is suitable for various moving and static scenes, makes no distinction between video and still images, and is resolution-independent (compatible with 8K/4K/2K/1080P and so on). Compared with the traditional method of obtaining the brightness ratio from the brightness sums of the fused parts of the two images, comparing the brightness ratio at the feature points yields a more accurate result.
The following table shows the brightness ratios calculated by the traditional scheme and by this scheme, each normalized by the ratio calculated when no object is moving. The results show that the larger the moving object, the more accurate this method is by comparison.

Moving object size   | Traditional scheme (normalized) | This scheme (normalized)
No moving object     | 1     | 1
Small moving object  | 0.993 | 0.997
Medium moving object | 0.972 | 1.001
Large moving object  | 0.953 | 0.996
A brightness adjustment system for dynamic image fusion, as shown in fig. 7, includes:
the preprocessing module is used for preprocessing the images img1 and img2;
Specifically, the preprocessing module is used for graying the images img1 and img2 to obtain the grayscale images img1 and img2, and for denoising the grayscale images img1 and img2.
The feature extraction module is used for extracting key feature points in the preprocessed images img1 and img2;
specifically, the feature extraction module is configured to extract feature point information of the preprocessed images img1 and img2, where the feature point information includes a position, a scale, and a direction;
furthermore, the feature extraction module is used for establishing a Gaussian pyramid and a Gaussian difference pyramid by utilizing a Gaussian function; searching local extreme points through a Gaussian difference pyramid; fitting discrete characteristic points; calculating the main direction of each feature point;
specifically, the feature extraction module is configured to perform feature description on the feature points;
specifically, the feature extraction module is used for matching key feature points in the images img1 and img2 and eliminating feature points with wrong matching.
Furthermore, the feature extraction module is used for calculating the Euclidean distances of the 128-dimensional feature vectors of feature points of the images img1 and img2; judging the similarity of the corresponding feature points of the images img1 and img2 according to the Euclidean distance; and matching the feature points meeting the Euclidean distance condition.
The brightness adjusting module is used for adjusting the brightness of the image img2 according to the average brightness ratio of the key feature points in the images img1 and img2;
specifically, the brightness adjusting module is used for obtaining the positions and the brightness of four adjacent pixel points according to the positions of the feature points; screening the brightness of four adjacent pixel points; calculating the position brightness of the feature points according to the screening result; calculating the average value of the brightness ratios of all the characteristic points of the images img1 and img2; the brightness of the image img2 is adjusted using the average value.
The brightness adjustment system for dynamic image fusion disclosed in this embodiment is implemented based on the brightness adjustment method for dynamic image fusion disclosed in the above embodiment, and is not described herein again.
In the brightness adjustment system for dynamic image fusion disclosed in this embodiment, the images img1 and img2, which are the same size and include an overlapping region, are preprocessed; key feature points are then extracted from the preprocessed images; and finally the brightness of image img2 is adjusted according to the average brightness ratio of the key feature points in images img1 and img2. In this scheme, the brightness at a feature point's position is obtained by bilinear interpolation of the brightnesses of its four adjacent pixels, and the brightness of a sequence of images is unified by calculating the brightness ratio at the feature points, improving the accuracy of the computed ratio. Meanwhile, a two-stage preprocessing scheme of image graying followed by filtering with a large kernel reduces the amount of computation, while the innovation in the filtering process eliminates the influence of outlier points, preserving the preprocessing quality and striking a better balance between effect and efficiency.
The embodiment discloses an electronic device, as shown in fig. 8, including a memory and a processor, wherein the memory stores a computer program operable on the processor, and the processor executes the computer program to implement the brightness adjustment method for dynamic image fusion, including:
wherein the processor is used for preprocessing the images img1 and img2, and the images img1 and img2 are the same in size and comprise an overlapping area; extracting key characteristic points in the preprocessed images img1 and img2;
and adjusting the brightness of the image img2 according to the average brightness ratio of the key feature points in the images img1 and img2.
The memory is used for storing programs for the processor to execute the processing procedures.
The electronic device disclosed in this embodiment is implemented based on the brightness adjustment method for dynamic image fusion disclosed in the above embodiment, and details are not repeated here.
In the electronic device disclosed in this embodiment, the images img1 and img2, which are the same size and include an overlapping area, are preprocessed; key feature points are then extracted from the preprocessed images; and finally the brightness of image img2 is adjusted according to the average brightness ratio of the key feature points in images img1 and img2. In this scheme, the brightness at a feature point's position is obtained by bilinear interpolation of the brightnesses of its four adjacent pixels, and the brightness of a sequence of images is unified by calculating the brightness ratio at the feature points, improving the accuracy of the computed ratio. Meanwhile, a two-stage preprocessing scheme of image graying followed by filtering with a large kernel reduces the amount of computation, while the innovation in the filtering process eliminates the influence of outlier points, preserving the preprocessing quality and striking a better balance between effect and efficiency.
Those of skill would further appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, computer software, or combinations of both, and that the various illustrative components and steps have been described above generally in terms of their functionality in order to clearly illustrate this interchangeability of hardware and software. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the technical solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
The steps of a method or algorithm described in connection with the embodiments disclosed herein may be embodied directly in hardware, in a software module executed by a processor, or in a combination of the two. A software module may reside in Random Access Memory (RAM), memory, read-only memory (ROM), electrically programmable ROM, electrically erasable programmable ROM, registers, hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the art.
The previous description of the disclosed embodiments is provided to enable any person skilled in the art to make or use the present application. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the application. Thus, the present application is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.

Claims (10)

1. A brightness adjustment method for dynamic image fusion is characterized by comprising the following steps:
preprocessing images img1 and img2, the images img1 and img2 being the same size and including an overlapping region;
extracting key characteristic points in the preprocessed images img1 and img2;
and adjusting the brightness of the image img2 according to the average brightness ratio of the key feature points in the images img1 and img2.
2. The brightness adjustment method for dynamic image fusion according to claim 1, wherein the preprocessing the images img1 and img2 comprises:
graying the images img1 and img2 to obtain grayscale images img1 and img2;
the grayscale images img1 and img2 are noise-reduced.
3. The brightness adjustment method for dynamic image fusion according to claim 1, wherein the extracting key feature points in the preprocessed images img1 and img2 comprises:
extracting feature point information of the preprocessed images img1 and img2, wherein the feature point information comprises positions, scales and directions;
carrying out feature description on the feature points;
and matching key feature points in the images img1 and img2, and removing the feature points with wrong matching.
4. The method as claimed in claim 3, wherein the extracting feature point information of the preprocessed images img1 and img2 includes:
establishing a Gaussian pyramid L(x, y, σ) and a difference-of-Gaussian pyramid D(x, y, σ) using a Gaussian function G(x, y, σ);
Searching local extreme points through a Gaussian difference pyramid;
fitting discrete characteristic points;
the principal direction of each feature point is calculated.
5. The method according to claim 3, wherein the characterizing the feature points comprises:
the feature description includes feature points and surrounding pixel points contributing thereto.
6. The brightness adjustment method for dynamic image fusion according to claim 3, wherein matching the key feature points in the images img1 and img2 comprises:
calculating Euclidean distances of feature vectors of feature points of the images img1 and img2 in 128-dimensional direction;
judging the similarity of the corresponding feature points of the images img1 and img2 according to the Euclidean distance;
and matching the characteristic points meeting the Euclidean distance condition.
7. The brightness adjustment method for dynamic image fusion according to claim 3, wherein the mismatched feature points are removed by a robust parameter estimation method.
8. The method as claimed in claim 1, wherein the adjusting the brightness of the image img2 according to the average brightness ratio of the key feature points in the images img1 and img2 comprises:
obtaining the positions and brightness values of the four pixel points adjacent to each feature point position;
screening the brightness values of the four adjacent pixel points;
calculating the brightness at the feature point position from the screening result;
calculating the average of the brightness ratios over all feature points of the images img1 and img2;
and adjusting the brightness of the image img2 using the average.
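The steps of claim 8 can be sketched as below: the brightness at each sub-pixel feature position is interpolated from its four neighbouring pixels, the per-point ratios img1/img2 are averaged, and img2 is rescaled by that average. The bilinear weighting, the near-zero screen, and the clamp at 255 are illustrative assumptions; the claim does not specify how the four neighbours are combined or screened.

```python
# Sketch of the claim-8 brightness adjustment. Bilinear interpolation and
# the near-zero "screening" rule are assumptions standing in for the
# claim's unspecified details.

def bilinear(img, x, y):
    """Brightness at sub-pixel (x, y) from the four surrounding pixels.

    Assumes 0 <= x < W-1 and 0 <= y < H-1."""
    x0, y0 = int(x), int(y)
    fx, fy = x - x0, y - y0
    p00, p10 = img[y0][x0], img[y0][x0 + 1]
    p01, p11 = img[y0 + 1][x0], img[y0 + 1][x0 + 1]
    top = p00 * (1 - fx) + p10 * fx
    bot = p01 * (1 - fx) + p11 * fx
    return top * (1 - fy) + bot * fy

def adjust(img1, img2, matches):
    """Scale img2 by the mean brightness ratio over matched feature points.

    matches: list of ((x1, y1), (x2, y2)) sub-pixel positions in img1/img2."""
    ratios = []
    for (x1, y1), (x2, y2) in matches:
        b1, b2 = bilinear(img1, x1, y1), bilinear(img2, x2, y2)
        if b2 > 1e-6:                      # screen out unusable points
            ratios.append(b1 / b2)
    scale = sum(ratios) / len(ratios) if ratios else 1.0
    # Clamp to the usual 8-bit range (an assumption; the claim is silent).
    return [[min(255.0, v * scale) for v in row] for row in img2]
```

With no usable matches the scale defaults to 1.0, leaving img2 untouched, which keeps the adjustment well-defined in degenerate cases.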
9. A system for adjusting brightness in dynamic image fusion, comprising:
the preprocessing module is used for preprocessing the images img1 and img2;
the feature extraction module is used for extracting key feature points in the preprocessed images img1 and img2;
and the brightness adjusting module is used for adjusting the brightness of the image img2 according to the average brightness ratio of the key feature points in the images img1 and img2.
10. An electronic device comprising a memory and a processor, wherein the memory stores a computer program operable on the processor, and wherein the processor, when executing the computer program, performs the steps of the method according to any one of claims 1 to 8.
CN202310246425.XA 2023-03-15 2023-03-15 Dynamic image fusion brightness adjustment method, system and electronic equipment Active CN115953332B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310246425.XA CN115953332B (en) 2023-03-15 2023-03-15 Dynamic image fusion brightness adjustment method, system and electronic equipment


Publications (2)

Publication Number Publication Date
CN115953332A true CN115953332A (en) 2023-04-11
CN115953332B CN115953332B (en) 2023-08-18

Family

ID=87286395

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310246425.XA Active CN115953332B (en) 2023-03-15 2023-03-15 Dynamic image fusion brightness adjustment method, system and electronic equipment

Country Status (1)

Country Link
CN (1) CN115953332B (en)

Citations (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2011013862A1 (en) * 2009-07-28 2011-02-03 주식회사 유진로봇 Control method for localization and navigation of mobile robot and mobile robot using same
CN103279939A (en) * 2013-04-27 2013-09-04 北京工业大学 Image stitching processing system
US20170068315A1 (en) * 2015-09-07 2017-03-09 Samsung Electronics Co., Ltd. Method and apparatus for eye tracking
CN107220955A (en) * 2017-04-24 2017-09-29 东北大学 A kind of brightness of image equalization methods based on overlapping region characteristic point pair
CN108416732A (en) * 2018-02-02 2018-08-17 重庆邮电大学 A Panoramic Image Stitching Method Based on Image Registration and Multi-resolution Fusion
CN108460724A (en) * 2018-02-05 2018-08-28 湖北工业大学 The Adaptive image fusion method and system differentiated based on mahalanobis distance
US20180268239A1 (en) * 2011-11-14 2018-09-20 San Diego State University Research Foundation Method and System of Image-Based Change Detection
CN108986174A (en) * 2018-06-06 2018-12-11 链家网(北京)科技有限公司 A kind of global tone mapping method and system for high dynamic range across picture
JP2019101997A (en) * 2017-12-07 2019-06-24 キヤノン株式会社 Image processing apparatus and image processing method reducing noise by composing plural captured images
CN111260543A (en) * 2020-01-19 2020-06-09 浙江大学 An underwater image stitching method based on multi-scale image fusion and SIFT features
CN113255696A (en) * 2021-05-25 2021-08-13 深圳市亚辉龙生物科技股份有限公司 Image recognition method and device, computer equipment and storage medium
CN113284063A (en) * 2021-05-24 2021-08-20 维沃移动通信有限公司 Image processing method, image processing apparatus, electronic device, and readable storage medium
WO2021237732A1 (en) * 2020-05-29 2021-12-02 北京小米移动软件有限公司南京分公司 Image alignment method and apparatus, electronic device, and storage medium
CN114913071A (en) * 2022-05-16 2022-08-16 扬州大学 Underwater image splicing method integrating feature point matching of brightness region information
CN115471682A (en) * 2022-09-13 2022-12-13 杭州电子科技大学 Image matching method based on SIFT fusion ResNet50


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
LI H et al.: "Multi-Exposure Fusion with CNN Features", IEEE International Conference on Image Processing, pages 1722 - 1727 *
ZHANG YUHE: "Research on High Dynamic Range Image Synthesis Algorithms for Dynamic Scenes", China Master's Theses Full-Text Database, Engineering Science and Technology I, no. 2, pages 138 - 1460 *

Also Published As

Publication number Publication date
CN115953332B (en) 2023-08-18

Similar Documents

Publication Publication Date Title
CN109785291B (en) Lane line self-adaptive detection method
US8401333B2 (en) Image processing method and apparatus for multi-resolution feature based image registration
US12175691B1 (en) Method and device for mapping three-dimensional (3D) point cloud model based on deep learning
CN110992263B (en) Image stitching method and system
CN110827200A An image super-resolution reconstruction method, an image super-resolution reconstruction device and a mobile terminal
CN107945111B (en) An Image Mosaic Method Based on SURF Feature Extraction and CS-LBP Descriptor
TWI639136B (en) Real-time video stitching method
CN102713938A (en) Scale space normalization technique for improved feature detection in uniform and non-uniform illumination changes
US20090226097A1 (en) Image processing apparatus
Lee et al. Accurate registration using adaptive block processing for multispectral images
CN118674723B (en) Method for detecting virtual edges of coated ceramic area based on deep learning
CN113723399A (en) License plate image correction method, license plate image correction device and storage medium
CN111553927B (en) Checkerboard corner detection method, detection system, computer device and storage medium
CN113793372A (en) Optimal registration method and system for different-source images
CN112435283A (en) Image registration method, electronic device and computer-readable storage medium
US20210398256A1 (en) Method and apparatus for enhanced anti-aliasing filtering on a gpu
CN113470056B (en) Sub-pixel edge point detection method based on Gaussian model convolution
CN119741196B (en) High-robustness image stitching method
CN115953332B (en) Dynamic image fusion brightness adjustment method, system and electronic equipment
US20120038785A1 (en) Method for producing high resolution image
CN119090926B (en) Difference graph registration method based on feature point matching
JP5928465B2 (en) Degradation restoration system, degradation restoration method and program
CN113077390B (en) Image rectification method based on deep learning
CN117934568A (en) Cross-modal image registration method based on multi-scale local salient principal direction features
CN110363723B (en) Image processing method and device for improving image boundary effect

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CP03 Change of name, title or address

Address after: No. 401, 4th Floor, Building 2, No. 88 Shengtong Street, Chengdu High tech Zone, China (Sichuan) Pilot Free Trade Zone, Chengdu City, Sichuan Province 610095

Patentee after: Sichuan Guochuang Innovation Vision Ultra HD Video Technology Co.,Ltd.

Country or region after: China

Address before: No. 2, Xinyuan south 2nd Road, Chengdu, Sichuan 610000

Patentee before: Sichuan Xinshi Chuangwei ultra high definition Technology Co.,Ltd.

Country or region before: China
