
CN116563126B - A visible light infrared image fusion method based on least squares filtering - Google Patents

A visible light infrared image fusion method based on least squares filtering

Info

Publication number
CN116563126B
Authority
CN
China
Prior art keywords
image
visible light
vis
infrared
layer
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202210109068.8A
Other languages
Chinese (zh)
Other versions
CN116563126A (en)
Inventor
王佳佳
孙长燕
廉黎
朱亮
苏子航
饶志涛
潘少鹏
张艳辉
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Huahang Radio Measurement Research Institute
Original Assignee
Beijing Huahang Radio Measurement Research Institute
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Huahang Radio Measurement Research Institute filed Critical Beijing Huahang Radio Measurement Research Institute
Priority to CN202210109068.8A priority Critical patent/CN116563126B/en
Publication of CN116563126A publication Critical patent/CN116563126A/en
Application granted granted Critical
Publication of CN116563126B publication Critical patent/CN116563126B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Classifications

    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/50 Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/20 Image enhancement or restoration using local operators
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/70 Denoising; Smoothing
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10048 Infrared image
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20212 Image combination
    • G06T2207/20221 Image fusion; Image merging

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)

Abstract


The present invention relates to a visible light infrared image fusion method based on least squares filtering, belonging to the field of image processing. It solves the problem that images fused by existing visible light infrared fusion methods lose detail parts with obvious contrast but low pixel intensity, and have blurred edges. The method comprises the following steps: obtaining the detail layer image of the visible light gray-scale image after least squares filtering; obtaining a detail layer average value from the visible light least squares detail layer and the fusion result detail layer; and fusing the fusion result background layer with the detail layer average value to obtain the final fusion result image. The invention retains detail information in image areas with obvious contrast but low pixel intensity values; by combining the least squares method with guided filtering it better preserves edges, so that the fused image obtains more effective information.

Description

Visible light infrared image fusion method based on least squares filtering
Technical Field
The invention belongs to the field of image processing, and particularly relates to a visible light infrared image fusion method based on least squares filtering.
Background
Infrared imaging is used more and more widely in the imaging field because it can easily obtain temperature information of a target and can work around the clock. However, infrared imaging has its limitations: when temperature differences between target and background are small, it is difficult to distinguish the target from the background.
In contrast, visible light intensity images carry more image detail and texture information, but they are susceptible to external factors such as environment and weather, and target information is not obvious under low contrast or against a cluttered background.
In recent years, in order to fully exploit the complementary advantages of infrared and visible light images, visible light infrared image fusion has become an important branch of multiband image fusion. It combines the effective information of the visible light image and the infrared image, which is of great significance for human visual perception, target detection and recognition.
Existing visible light infrared image fusion methods include image fusion based on pixel saliency and linear fusion; in the images they produce, detail parts with obvious contrast but low pixel intensity are missing, and the image edges are blurred.
Disclosure of Invention
In view of the above analysis, the invention aims to provide a visible light infrared image fusion method based on least squares filtering, to solve the problems that, in images fused by existing visible light infrared image fusion methods, detail parts with obvious contrast but low pixel intensity are missing and the image edges are blurred.
The aim of the invention is mainly realized by the following technical scheme:
A visible light infrared image fusion method based on least squares filtering comprises the following steps:
step 1, obtaining the detail layer image of a visible light gray-scale image after least squares filtering;
step 2, obtaining high horizontal gradient and low vertical gradient maps for the input visible light and infrared gray-scale images;
step 3, comparing the high horizontal gradient and low vertical gradient maps of the visible light and infrared gray-scale images to obtain a contrast saliency image I_Statistical;
step 4, obtaining guide map images P_Vis and P_IR of the visible light image and the infrared image based on the contrast saliency image;
step 5, performing mean filtering on the input visible light and infrared gray-scale images to obtain detail layers and background layers;
step 6, performing guided filtering with different filters and different blur coefficients on the input visible light and infrared gray-scale images according to the obtained guide map images P_Vis and P_IR, to obtain the visible light and infrared detail layer weight coefficients Wd_Vis, Wd_IR and background layer weight coefficients Wb_Vis, Wb_IR;
step 7, weighting the detail layer weight coefficients Wd_Vis, Wd_IR and background layer weight coefficients Wb_Vis, Wb_IR obtained in step 6 with the visible light and infrared detail layers and background layers, to obtain the detail layer FUSION_D and background layer FUSION_B of the fused image;
step 8, obtaining a detail layer average value from the visible light least squares detail layer obtained in step 1 and the fusion result detail layer obtained in step 7;
step 9, fusing the background layer obtained in step 7 and the detail layer average value obtained in step 8 to obtain the final fusion result image.
Further, in step 1, the least squares filter is first used to obtain the image VIS_WLS of the visible light gray-scale image after least squares filtering, and then the least squares filtered detail layer image VIS_WLS-D of the visible light gray-scale image is obtained.
Further, in step 2, the high horizontal gradient and low vertical gradient maps of the visible light and infrared gray-scale images are obtained by subtracting the y-direction gradient map from the x-direction gradient map.
Further, in step 2, the operator used to obtain the gradient maps is the Sobel operator.
Further, in step 3, each pixel of the contrast saliency image takes the maximum value of the visible light and infrared high horizontal gradient and low vertical gradient map matrices, giving the contrast saliency image I_Statistical.
Further, in step 4, the guide map images P_Vis and P_IR of the visible light image and the infrared image are obtained by comparing the contrast saliency image with the high horizontal gradient and low vertical gradient maps of visible light and infrared respectively; at pixel positions where the values are equal the guide map value is 1, and where they are unequal the guide map value is 0.
Further, in step 5, mean filtering is performed on the input visible light and infrared gray-scale images respectively to obtain their detail layers and background layers; the background layer is heavily smoothed, a 31×31 mean filter is used to obtain the background layer, and the detail layer is calculated from the obtained background layer:
Vis_mean-d = Vis - Vis_mean-b (6)
IR_mean-d = IR - IR_mean-b (7)
wherein Vis_mean-d and IR_mean-d are respectively the detail layers of the mean-filtered visible light and infrared gray-scale images, Vis is the visible light gray-scale image, IR is the infrared gray-scale image, and Vis_mean-b and IR_mean-b are respectively the background layers of the mean-filtered visible light and infrared gray-scale images.
Further, in step 6, the detail layer weight coefficient Wd_Vis of the visible light image is obtained with the guided filter function as in formula (8), and the background layer weight coefficient Wb_Vis of the visible light image as in formula (9):
Wd_Vis = guiderfiler(Vis, P_Vis, k_d, eps_d) (8)
Wb_Vis = guiderfiler(Vis, P_Vis, k_b, eps_b) (9)
The detail layer weight coefficient Wd_IR of the infrared image is obtained with the guided filter function as in formula (10), and the background layer weight coefficient Wb_IR of the infrared image as in formula (11):
Wd_IR = guiderfiler(IR, P_IR, k_d, eps_d) (10)
Wb_IR = guiderfiler(IR, P_IR, k_b, eps_b) (11)
wherein guiderfiler() denotes the guided filter function, Vis is the visible light gray-scale image, IR is the infrared gray-scale image, P_Vis and P_IR are the guide map images of the visible light and infrared images obtained in step 4, k_b and k_d are filters of different sizes, and eps_b and eps_d are blur coefficients of different sizes.
Further, in step 7, the weight coefficients are normalized during the addition, giving the detail layer FUSION_D and background layer FUSION_B of the fused image.
Further, in step 8, the fusion method is:
DETAIL_D = (VIS_WLS-D + FUSION_D)/2
the invention can realize the following beneficial effects:
(1) The visible light infrared image fusion method based on least squares filtering of the invention introduces the concept of image contrast saliency, so that detail information with obvious contrast but low pixel intensity values in image areas is retained; the invention combines the least squares method with guided filtering to better preserve edge characteristics, so that the fused image obtains more effective information. The invention is of great significance for human visual perception, target detection and recognition.
In the invention, the technical schemes can be mutually combined to realize more preferable combination schemes. Additional features and advantages of the invention will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by practice of the invention. The objectives and other advantages of the invention may be realized and attained by the structure particularly pointed out in the written description and drawings.
Drawings
The drawings are only for purposes of illustrating particular embodiments and are not to be construed as limiting the invention, like reference numerals being used to refer to like parts throughout the several views.
FIG. 1 is a flow chart of a method for fusing visible infrared images according to an embodiment of the present invention;
FIG. 2 is a Sobel operator template diagram of an embodiment of the present invention;
FIG. 3 is the visible light original gray-scale image of an embodiment of the present invention;
FIG. 4 is the infrared original gray-scale image of an embodiment of the present invention;
FIG. 5 is a high horizontal gradient and low vertical gradient image of visible light according to an embodiment of the present invention;
FIG. 6 is an infrared high horizontal gradient and low vertical gradient image of an embodiment of the present invention;
FIG. 7 is a contrast saliency image of an embodiment of the present invention;
FIG. 8 is a fused image using an image fusion method according to an embodiment of the present invention;
FIG. 9 is a fused image using a prior art pixel saliency-based guided filtered image fusion method;
fig. 10 is a prior art linear fused image.
Detailed Description
The following detailed description of preferred embodiments of the invention is made in connection with the accompanying drawings, which form a part hereof, and together with the description of the embodiments of the invention, are used to explain the principles of the invention and are not intended to limit the scope of the invention.
In describing embodiments of the present invention, it should be noted that, unless explicitly stated and limited otherwise, the term "coupled" should be interpreted broadly, for example, as being fixedly coupled, detachably or integrally coupled, mechanically or electrically coupled, directly coupled, or indirectly coupled via an intermediary. The specific meaning of the above terms in the present invention can be understood by those of ordinary skill in the art according to the specific circumstances.
The terms "top," "bottom," "above," "below," and "above" are used throughout the description to refer to relative positions of components of the device, such as the relative positions of the top and bottom substrates inside the device. It will be appreciated that the devices are versatile, irrespective of their orientation in space.
In one embodiment of the present invention, as shown in fig. 1 to 10, a visible light infrared image fusion method based on least squares filtering is disclosed, wherein the visible light and infrared light simultaneously transmit through a spherical primary mirror to form a visible light image and an infrared image; the method comprises the following steps:
Step 1, obtaining the detail layer image of the visible light gray-scale image after least squares filtering:
First, the least squares filter is used to obtain the image VIS_WLS of the visible light gray-scale image after least squares filtering, and then the least squares filtered detail layer image of the visible light gray-scale image is obtained:
VIS_WLS-D = Vis - VIS_WLS (1)
wherein VIS_WLS-D denotes the detail layer of the visible light gray-scale image after least squares filtering, and Vis denotes the visible light gray-scale image.
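By way of illustration (the patent contains no code; the following is an editor's sketch with illustrative names, using a Farbman-style weighted least squares smoother as one common realization of the least squares filter), step 1 could be implemented as:

```python
import cv2
import numpy as np
from scipy import sparse
from scipy.sparse.linalg import spsolve

# Placeholder file name; any visible-light gray image will do.
vis = cv2.imread("vis.png", cv2.IMREAD_GRAYSCALE).astype(np.float32) / 255.0

def wls_filter(img, lam=1.0, alpha=1.2, eps=1e-4):
    """Edge-preserving weighted least squares smoothing (Farbman-style sketch).

    img: gray image as float in [0, 1]; returns the smoothed base layer.
    """
    r, c = img.shape
    n = r * c
    logL = np.log(img + eps)

    # Smoothness weights to the lower / right neighbour of each pixel;
    # large image gradients get small weights, which preserves edges.
    wy = lam / (np.abs(np.diff(logL, axis=0)) ** alpha + eps)   # (r-1, c)
    wy = np.pad(wy, ((0, 1), (0, 0))).ravel()                   # pad last row
    wx = lam / (np.abs(np.diff(logL, axis=1)) ** alpha + eps)   # (r, c-1)
    wx = np.pad(wx, ((0, 0), (0, 1))).ravel()                   # pad last col

    # Assemble the sparse system (I + L) u = img, L a weighted Laplacian.
    wx_l = np.concatenate(([0.0], wx[:-1]))        # weight to left neighbour
    wy_u = np.concatenate((np.zeros(c), wy[:-c]))  # weight to upper neighbour
    diag = 1.0 + wx + wy + wx_l + wy_u
    A = sparse.diags(
        [-wy[:n - c], -wx[:n - 1], diag, -wx[:n - 1], -wy[:n - c]],
        [-c, -1, 0, 1, c], format="csc")
    # Direct sparse solve; fine for moderate image sizes.
    return spsolve(A, img.ravel()).reshape(r, c)

vis_wls = wls_filter(vis)     # least squares filtered image VIS_WLS
vis_wls_d = vis - vis_wls     # formula (1): detail layer VIS_WLS-D
```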
Step 2, obtaining high horizontal gradient and low vertical gradient maps for the input visible light and infrared gray-scale images:
The visible light gray-scale image and the infrared gray-scale image are acquired simultaneously through a visible light infrared composite system based on spherical concentric mirrors, which ensures the consistency of the acquired visible light and infrared images.
The high horizontal gradient and low vertical gradient maps of the visible light and infrared gray-scale images are obtained by subtracting the y-direction gradient map from the x-direction gradient map (as shown in fig. 5 and fig. 6). The operator used to obtain the gradient maps is the Sobel operator; the Sobel operator templates are shown in fig. 2. The input visible light and infrared images are gray-scale images.
gradient_Vis = gradX_Vis - gradY_Vis (2)
gradient_IR = gradX_IR - gradY_IR (3)
wherein gradient_Vis denotes the high horizontal gradient and low vertical gradient map of the visible light gray-scale image, gradient_IR denotes that of the infrared gray-scale image, gradX_Vis and gradY_Vis denote the gradient maps of the visible light gray-scale image in the X and Y directions, and gradX_IR and gradY_IR denote the gradient maps of the infrared gray-scale image in the X and Y directions.
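Formulas (2) and (3) map directly onto OpenCV's Sobel operator. A minimal sketch, with a placeholder file name for the registered infrared counterpart of the image loaded above:

```python
import cv2
import numpy as np

# Registered infrared gray image (placeholder file name); vis as above.
ir = cv2.imread("ir.png", cv2.IMREAD_GRAYSCALE).astype(np.float32) / 255.0

def horiz_minus_vert_gradient(gray):
    # 3x3 Sobel kernels (Fig. 2): x-direction gradient minus y-direction gradient.
    grad_x = cv2.Sobel(gray, cv2.CV_32F, 1, 0, ksize=3)
    grad_y = cv2.Sobel(gray, cv2.CV_32F, 0, 1, ksize=3)
    return grad_x - grad_y

gradient_vis = horiz_minus_vert_gradient(vis)   # formula (2)
gradient_ir = horiz_minus_vert_gradient(ir)     # formula (3)
```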
Step 3, comparing the high horizontal gradient and low vertical gradient maps of the visible light and infrared gray-scale images to obtain the contrast saliency image I_Statistical:
Each pixel of the contrast saliency image takes the maximum value of the visible light and infrared high horizontal gradient and low vertical gradient map matrices, giving the contrast saliency image I_Statistical shown in fig. 7:
I_Statistical,i,j = max(gradient_Vis,i,j, gradient_IR,i,j) (4)
where I_Statistical,i,j denotes the contrast saliency value at pixel position (i, j), gradient_Vis,i,j denotes the value of the visible light high horizontal gradient and low vertical gradient map at pixel position (i, j), and gradient_IR,i,j denotes the value of the infrared high horizontal gradient and low vertical gradient map at pixel position (i, j).
Step 4, obtaining the guide map images P_Vis and P_IR of the visible light image and the infrared image based on the contrast saliency image:
The contrast saliency image is compared with the high horizontal gradient and low vertical gradient maps of visible light and infrared respectively; at pixel positions where the values are equal the guide map value is 1, and where they are unequal the guide map value is 0:
P_Vis,i,j = 1 if gradient_Vis,i,j = I_Statistical,i,j, otherwise 0 (5)
P_IR,i,j = 1 if gradient_IR,i,j = I_Statistical,i,j, otherwise 0 (6)
where P_Vis,i,j is the visible light guide map value at pixel position (i, j), and P_IR,i,j is the infrared guide map value at pixel position (i, j).
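Steps 3 and 4 (formulas (4) to (6)) are element-wise operations; a sketch continuing the variables from the previous snippets:

```python
# Formula (4): per-pixel maximum of the two gradient maps.
i_statistical = np.maximum(gradient_vis, gradient_ir)

# Formulas (5) and (6): guide map is 1 where a gradient map attains the
# maximum value, 0 elsewhere.
p_vis = (gradient_vis == i_statistical).astype(np.float32)
p_ir = (gradient_ir == i_statistical).astype(np.float32)
```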
Step 5, performing mean filtering on the input visible light and infrared gray-scale images to obtain detail layers and background layers:
Mean filtering is performed on the input visible light and infrared gray-scale images respectively to obtain their detail layers and background layers. The background layer is heavily smoothed; a 31×31 mean filter is used to obtain the background layer, and the detail layer is calculated from the obtained background layer:
Vis_mean-d = Vis - Vis_mean-b (7)
IR_mean-d = IR - IR_mean-b (8)
wherein Vis_mean-d and IR_mean-d are respectively the detail layers of the mean-filtered visible light and infrared gray-scale images, Vis is the visible light gray-scale image, IR is the infrared gray-scale image, and Vis_mean-b and IR_mean-b are respectively the background layers of the mean-filtered visible light and infrared gray-scale images.
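The layer decomposition of step 5 is likewise two lines per image; a sketch continuing the previous variables:

```python
# 31x31 mean filter gives the heavily smoothed background layer;
# formulas (7) and (8): detail layer = image minus background layer.
vis_mean_b = cv2.blur(vis, (31, 31))
ir_mean_b = cv2.blur(ir, (31, 31))
vis_mean_d = vis - vis_mean_b
ir_mean_d = ir - ir_mean_b
```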
Step 6, performing guided filtering with different filters and different blur coefficients on the input visible light and infrared gray-scale images according to the obtained guide map images P_Vis and P_IR, to obtain the visible light and infrared detail layer weight coefficients Wd_Vis, Wd_IR and background layer weight coefficients Wb_Vis, Wb_IR:
The detail layer weight coefficient Wd_Vis of the visible light image obtained with the guided filter function can be expressed as formula (9), and the background layer weight coefficient Wb_Vis of the visible light image as formula (10):
Wd_Vis = guiderfiler(Vis, P_Vis, k_d, eps_d) (9)
Wb_Vis = guiderfiler(Vis, P_Vis, k_b, eps_b) (10)
The detail layer weight coefficient Wd_IR of the infrared image obtained with the guided filter function can be expressed as formula (11), and the background layer weight coefficient Wb_IR of the infrared image as formula (12):
Wd_IR = guiderfiler(IR, P_IR, k_d, eps_d) (11)
Wb_IR = guiderfiler(IR, P_IR, k_b, eps_b) (12)
wherein guiderfiler() denotes the guided filter function, Vis is the visible light gray-scale image, IR is the infrared gray-scale image, and P_Vis and P_IR are the guide map images of the visible light and infrared images obtained in step 4. A larger filter k_b and larger blur coefficient eps_b are used when calculating the background layer, and a smaller filter k_d and smaller blur coefficient eps_d when calculating the detail layer; the filter k_b used for the background layer is twice the filter k_d used for the detail layer, and the blur coefficient eps_b of the background layer differs from the blur coefficient eps_d of the detail layer by one order of magnitude.
Further, the principle of guided filtering is as follows. Within each local window ω_k centered at pixel k, the output q is assumed to be a linear transform of the guide image I:
q_i = a_k * I_i + b_k, for all i ∈ ω_k
wherein I_i is a guide image pixel, q_i is an output image pixel, and (a_k, b_k) are linear coefficients assumed constant in ω_k; therefore, all pixels belonging to the window ω_k can be calculated by transforming the corresponding pixels of the guide image with the coefficients (a_k, b_k).
The loss function within the filter window can be written as:
E(a_k, b_k) = Σ_{i∈ω_k} ((a_k * I_i + b_k - p_i)^2 + ε * a_k^2)
wherein p_i is a pixel of the input image, and the regularization parameter ε is introduced to avoid an excessively large a_k.
Solving the above equation yields:
a_k = ((1/|ω|) Σ_{i∈ω_k} I_i * p_i - μ_k * p̄_k) / (σ_k^2 + ε)
b_k = p̄_k - a_k * μ_k
wherein the upper dash represents the average of all such values calculated in the window, μ_k is the mean of the guide image in window ω_k, σ_k^2 is the variance of the guide image in the window, and |ω| is the number of pixels in the window.
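A compact sketch of exactly these equations, using box filters for the window averages (an editor's illustration, not the patent's own implementation):

```python
def guided_filter(I, p, radius, eps):
    """Guided filter of He et al.: q = mean(a)*I + mean(b)."""
    ksize = (2 * radius + 1, 2 * radius + 1)
    mean = lambda x: cv2.blur(x, ksize)     # window average (the "upper dash")
    mu_I, mu_p = mean(I), mean(p)
    var_I = mean(I * I) - mu_I * mu_I       # sigma_k^2
    cov_Ip = mean(I * p) - mu_I * mu_p      # (1/|w|) sum(I*p) - mu_k * pbar_k
    a = cov_Ip / (var_I + eps)              # a_k
    b = mu_p - a * mu_I                     # b_k = pbar_k - a_k * mu_k
    return mean(a) * I + mean(b)            # q_i averaged over all windows
```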
Illustratively, in this embodiment, the window radius of the filter used when calculating the background layer weight map is 8, and the blur coefficient is 0.3×0.3; when calculating the detail layer weight map, the window radius of the filter is 4, and the blur coefficient is 0.03×0.03.
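With these embodiment parameters, the four weight maps of step 6 can be sketched with the guided_filter function above. The argument order, source gray image as guide and binary guide map as filter input, follows the common guided-filter fusion convention and is one interpretation of the patent's guiderfiler(Vis, P_Vis, k, eps) notation:

```python
kd, eps_d = 4, 0.03 * 0.03   # detail layer: small filter, small blur coefficient
kb, eps_b = 8, 0.3 * 0.3     # background layer: larger filter and coefficient

wd_vis = guided_filter(vis, p_vis, kd, eps_d)   # formula (9)
wb_vis = guided_filter(vis, p_vis, kb, eps_b)   # formula (10)
wd_ir = guided_filter(ir, p_ir, kd, eps_d)      # formula (11)
wb_ir = guided_filter(ir, p_ir, kb, eps_b)      # formula (12)
```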
Step 7, weighting the detail layer weight coefficients Wd_Vis, Wd_IR and background layer weight coefficients Wb_Vis, Wb_IR obtained in step 6 with the visible light and infrared detail layers and background layers to obtain the detail layer FUSION_D and background layer FUSION_B of the fused image.
During the addition, the weight coefficients are normalized, giving the detail layer FUSION_D and background layer FUSION_B of the fused image as shown in formulas (13) and (14):
FUSION_D = Vis_mean-d * Wd_Vis/(Wd_Vis + Wd_IR) + IR_mean-d * Wd_IR/(Wd_Vis + Wd_IR) (13)
FUSION_B = Vis_mean-b * Wb_Vis/(Wb_Vis + Wb_IR) + IR_mean-b * Wb_IR/(Wb_Vis + Wb_IR) (14)
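In array form, formulas (13) and (14) become the following sketch; the small constant guarding against a zero weight sum is an editorial addition, not in the patent:

```python
tiny = 1e-12  # avoid division by zero where both weights vanish (assumption)
fusion_d = (vis_mean_d * wd_vis + ir_mean_d * wd_ir) / (wd_vis + wd_ir + tiny)
fusion_b = (vis_mean_b * wb_vis + ir_mean_b * wb_ir) / (wb_vis + wb_ir + tiny)
```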
Step 8, obtaining a detail layer average value from the visible light least squares detail layer obtained in step 1 and the fusion result detail layer obtained in step 7, the fusion method being shown in the following formula:
DETAIL_D = (VIS_WLS-D + FUSION_D)/2 (15)
Step 9, fusing the fusion result background layer obtained in step 7 and the detail layer average value obtained in step 8 to obtain the final fusion result image, the fusion method being shown in the following formula:
FUSION = DETAIL_D + FUSION_B (16)
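Steps 8 and 9 then reduce to two lines, using formula (15) as reconstructed above together with formula (16):

```python
detail_d = 0.5 * (vis_wls_d + fusion_d)   # formula (15): detail layer average
fusion = detail_d + fusion_b              # formula (16): final fused image
fusion = np.clip(fusion, 0.0, 1.0)        # keep the result displayable
```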
As shown in the figures, compared with the image fused by the prior art guided filtering fusion method based on pixel saliency (as shown in fig. 9) and the prior art linearly fused image (as shown in fig. 10), the image fused by the image fusion method of the present invention (as shown in fig. 8) retains the detail parts with obvious contrast but low pixel intensity values, the image edges are clearer, and the fused image obtains more effective information.
The present invention is not limited to the above-mentioned embodiments, and any changes or substitutions that can be easily understood by those skilled in the art within the technical scope of the present invention are intended to be included in the scope of the present invention.

Claims (10)

1. A visible light infrared image fusion method based on least squares filtering, characterized by comprising the following steps:
step 1, obtaining the detail layer image of a visible light gray-scale image after least squares filtering;
step 2, obtaining high horizontal gradient and low vertical gradient maps for the input visible light and infrared gray-scale images;
step 3, comparing the high horizontal gradient and low vertical gradient maps of the visible light and infrared gray-scale images to obtain a contrast saliency image I_Statistical;
step 4, obtaining guide map images P_Vis and P_IR of the visible light image and the infrared image based on the contrast saliency image;
step 5, performing mean filtering on the input visible light and infrared gray-scale images to obtain detail layers and background layers;
step 6, performing guided filtering with different filters and different blur coefficients on the input visible light and infrared gray-scale images according to the obtained guide map images P_Vis and P_IR, to obtain the visible light and infrared detail layer weight coefficients Wd_Vis, Wd_IR and background layer weight coefficients Wb_Vis, Wb_IR;
step 7, weighting the detail layer weight coefficients Wd_Vis, Wd_IR and background layer weight coefficients Wb_Vis, Wb_IR obtained in step 6 with the visible light and infrared detail layers and background layers, to obtain the detail layer FUSION_D and background layer FUSION_B of the fused image;
step 8, obtaining a detail layer average value from the visible light least squares detail layer obtained in step 1 and the fusion result detail layer obtained in step 7;
step 9, fusing the fusion result background layer obtained in step 7 and the detail layer average value obtained in step 8 to obtain the final fusion result image.
2. The visible light infrared image fusion method based on least squares filtering according to claim 1, characterized in that in step 1, a least squares filter is first used to obtain the image VIS_WLS of the visible light gray-scale image after least squares filtering, and then the least squares filtered detail layer image VIS_WLS-D of the visible light gray-scale image is obtained.
3. The visible light infrared image fusion method based on least squares filtering according to claim 1, characterized in that in step 2, the high horizontal gradient and low vertical gradient maps of the visible light and infrared gray-scale images are respectively obtained by subtracting the y-direction gradient map from the x-direction gradient map.
4. The visible light infrared image fusion method based on least squares filtering according to claim 3, characterized in that in step 2, the operator used to obtain the gradient maps is the Sobel operator.
5. The visible light infrared image fusion method based on least squares filtering according to claim 2, characterized in that in step 3, each pixel of the contrast saliency image takes the maximum value of the visible light and infrared high horizontal gradient and low vertical gradient map matrices, giving the contrast saliency image I_Statistical.
6. The visible light infrared image fusion method based on least squares filtering according to claim 5, characterized in that in step 4, the guide map images P_Vis and P_IR of the visible light image and the infrared image are obtained by comparing the contrast saliency image with the high horizontal gradient and low vertical gradient maps of visible light and infrared respectively; at pixel positions where the values are equal the guide map value is 1, and where the values are unequal the guide map value is 0.
7. The visible light infrared image fusion method based on least squares filtering according to claim 6, characterized in that in step 5, mean filtering is performed on the input visible light and infrared gray-scale images respectively to obtain their detail layers and background layers, wherein the background layer is heavily smoothed and a 31×31 mean filter is used to obtain it, and the detail layer is calculated from the obtained background layer:
Vis_mean-d = Vis - Vis_mean-b (6)
IR_mean-d = IR - IR_mean-b (7)
wherein Vis_mean-d and IR_mean-d are respectively the detail layers of the mean-filtered visible light and infrared gray-scale images, Vis is the visible light gray-scale image, IR is the infrared gray-scale image, and Vis_mean-b and IR_mean-b are respectively the background layers of the mean-filtered visible light and infrared gray-scale images.
8. The visible light infrared image fusion method based on least squares filtering according to claim 7, characterized in that in step 6, the detail layer weight coefficient Wd_Vis of the visible light image is obtained with the guided filter function as in formula (8), and the background layer weight coefficient Wb_Vis of the visible light image as in formula (9):
Wd_Vis = guiderfiler(Vis, P_Vis, k_d, eps_d) (8)
Wb_Vis = guiderfiler(Vis, P_Vis, k_b, eps_b) (9)
the detail layer weight coefficient Wd_IR of the infrared image is obtained with the guided filter function as in formula (10), and the background layer weight coefficient Wb_IR of the infrared image as in formula (11):
Wd_IR = guiderfiler(IR, P_IR, k_d, eps_d) (10)
Wb_IR = guiderfiler(IR, P_IR, k_b, eps_b) (11)
wherein guiderfiler() denotes the guided filter function, Vis is the visible light gray-scale image, IR is the infrared gray-scale image, P_Vis and P_IR are the guide map images of the visible light and infrared images obtained in step 4, k_b and k_d are filters of different sizes, and eps_b and eps_d are blur coefficients of different sizes.
9. The visible light infrared image fusion method based on least squares filtering according to claim 8, characterized in that in step 7, the weight coefficients are normalized during the addition, giving the detail layer FUSION_D and background layer FUSION_B of the fused image.
10. The visible light infrared image fusion method based on least squares filtering according to claim 9, characterized in that in step 8, the fusion method is:
DETAIL_D = (VIS_WLS-D + FUSION_D)/2
CN202210109068.8A 2022-01-28 2022-01-28 A visible light infrared image fusion method based on least squares filtering Active CN116563126B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210109068.8A CN116563126B (en) 2022-01-28 2022-01-28 A visible light infrared image fusion method based on least squares filtering

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210109068.8A CN116563126B (en) 2022-01-28 2022-01-28 A visible light infrared image fusion method based on least squares filtering

Publications (2)

Publication Number Publication Date
CN116563126A CN116563126A (en) 2023-08-08
CN116563126B 2025-10-28

Family

ID=87502387

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210109068.8A Active CN116563126B (en) 2022-01-28 2022-01-28 A visible light infrared image fusion method based on least squares filtering

Country Status (1)

Country Link
CN (1) CN116563126B (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112419212A (en) * 2020-10-15 2021-02-26 卡乐微视科技(云南)有限公司 Infrared and visible light image fusion method based on side window guide filtering
CN112884690A (en) * 2021-02-26 2021-06-01 中国科学院西安光学精密机械研究所 Infrared and visible light image fusion method based on three-scale decomposition

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111062905B (en) * 2019-12-17 2022-01-04 大连理工大学 An infrared and visible light fusion method based on saliency map enhancement


Also Published As

Publication number Publication date
CN116563126A (en) 2023-08-08

Similar Documents

Publication Publication Date Title
Wang et al. AIPNet: Image-to-image single image dehazing with atmospheric illumination prior
Singh et al. Comprehensive survey on haze removal techniques
CN112184604B (en) Color image enhancement method based on image fusion
CN107844750A (en) A kind of water surface panoramic picture target detection recognition methods
CN107169947B (en) Image fusion experimental method based on feature point positioning and edge detection
CN111582074A (en) Monitoring video leaf occlusion detection method based on scene depth information perception
CN109410161B (en) Fusion method of infrared polarization images based on YUV and multi-feature separation
CN109166089A (en) The method that a kind of pair of multispectral image and full-colour image are merged
CN115689960A (en) A fusion method of infrared and visible light images based on adaptive illumination in nighttime scenes
CN109410160B (en) Infrared polarization image fusion method based on multi-feature and feature difference driving
CN111383352B (en) An automatic coloring and abstraction method for three-level Rubik's Cube
CN116563126B (en) A visible light infrared image fusion method based on least squares filtering
CN109377468A (en) The pseudo-colours fusion method of infra-red radiation and polarization image based on multiple features
CN112734679A (en) Fusion defogging method for medical operation video images
CN116563125B (en) A visible-infrared image fusion method based on contrast saliency
CN116342446A (en) Multi-focus image fusion method and device, electronic equipment, storage medium
CN113920455B (en) Night video coloring method based on deep neural network
Petrovic Multilevel image fusion
CN113592849A (en) External insulation equipment fault diagnosis method based on convolutional neural network and ultraviolet image
CN116563127A (en) Visible light infrared image fusion device based on contrast saliency
CN113191991A (en) Multi-modal image fusion method, system, device and medium based on information bottleneck
CN107301625A (en) Image defogging algorithm based on brightness UNE
Pavethra et al. A cross layer graphical neural network based convolutional neural network framework for image dehazing
Lebbad et al. A Bayesian algorithm for vision based navigation of autonomous surface vehicles
CN118644418A (en) A road condition detection method for reducing ambient light interference

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant