
CN107169942A - A kind of underwater picture Enhancement Method based on fish retinal mechanisms - Google Patents

A kind of underwater picture Enhancement Method based on fish retinal mechanisms Download PDF

Info

Publication number
CN107169942A
CN107169942A
Authority
CN
China
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201710573257.XA
Other languages
Chinese (zh)
Other versions
CN107169942B (en)
Inventor
李永杰
张明
赵乾
高绍兵
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
University of Electronic Science and Technology of China
Original Assignee
University of Electronic Science and Technology of China
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by University of Electronic Science and Technology of China filed Critical University of Electronic Science and Technology of China
Priority to CN201710573257.XA priority Critical patent/CN107169942B/en
Publication of CN107169942A publication Critical patent/CN107169942A/en
Application granted granted Critical
Publication of CN107169942B publication Critical patent/CN107169942B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 5/00 Image enhancement or restoration
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 5/00 Image enhancement or restoration
    • G06T 5/73 Deblurring; Sharpening
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 9/00 Details of colour television systems
    • H04N 9/64 Circuits for processing colour signals
    • H04N 9/646 Circuits for processing colour signals for image enhancement, e.g. vertical detail restoration, cross-colour elimination, contour correction, chrominance trapping filters
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10024 Color image

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Processing Of Color Television Signals (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)

Abstract

The invention discloses an underwater image enhancement method based on the fish retinal mechanism. The method simulates the feedback relationship between horizontal cells and cone cells in the fish retina to remove the color cast of underwater images, and simulates the center-surround antagonism of fish retinal bipolar cells to remove their blur. Throughout the simulation, the lateral inhibition that horizontal cells exert on bipolar cells is modeled by a difference-of-two-Gaussians filter over the bipolar-cell receptive field; a sigmoid curve models how interplexiform cells continuously release dopamine in the dark to regulate horizontal-cell activity, making the processed image conform more closely to the visual mechanism of fish; finally, a gamma transform models the nonlinear processing of luminance information by amacrine cells, which forms the center input of the color bipolar cells.

Description

An underwater image enhancement method based on the fish retinal mechanism

Technical Field

The invention belongs to the technical field of image processing and relates to color image enhancement, in particular to an underwater image enhancement method based on the fish retinal mechanism.

Background Art

With the continuous growth of human exploration capabilities, more and more underwater images are being widely disseminated and used. However, backscattering and forward scattering by particles suspended in the water blur the image, and because light waves of different wavelengths attenuate at different rates once light enters the water, underwater images carry a blue-green color cast. Both blur and color cast leave the captured underwater image insufficiently clear. How to remove the effects of blur and color cast and obtain high-contrast underwater images has therefore become an important problem.

Existing image deblurring methods are mainly built on the dark channel prior assumption and, in general, on the atmospheric scattering physical model. A representative method was proposed by Chiang et al. in 2012; see: Chiang J. Y., Chen Y. C. Underwater image enhancement by wavelength compensation and dehazing [J]. IEEE Transactions on Image Processing, 2012, 21(4): 1756-1769. All such methods require the dark channel prior to hold in order to achieve a good deblurring effect.

Color constancy methods are either learning-based or static, and mainly restore the true colors of objects by estimating the color of the scene illuminant. However, they target chiefly terrestrial scenes and ignore several characteristics of underwater images. Learning-based methods remain difficult to apply to underwater image processing because no underwater image database with a standard illuminant currently exists. Static methods mostly rest on some gray-world-style assumption, but the red band of a typical underwater image is markedly weaker than the other bands, so the assumption is violated and the corrected images usually come out reddish; even improved static methods still require the dark channel prior model to take effect. The fish retina can solve the color cast and blur problems simultaneously, yet no existing method simulates the fish retina to handle both at once.

Summary of the Invention

In view of the above problems in the prior art, the present invention proposes an underwater image enhancement method based on the fish retinal mechanism.

The technical scheme of the present invention is an underwater image enhancement method based on the fish retinal mechanism, comprising the following steps:

S1. Extract the color components and the luminance component: for each pixel of the underwater image, extract the red component I_R, green component I_G, and blue component I_B, and compute the average luminance component I:

I = (I_R + I_G + I_B)/3
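Step S1 can be sketched in a few lines of NumPy (function and variable names here are illustrative, not from the patent):

```python
import numpy as np

def extract_components(img):
    """Split an RGB image of shape (H, W, 3), values in [0, 1], into its
    red, green, and blue components and the average luminance of step S1."""
    I_R, I_G, I_B = img[..., 0], img[..., 1], img[..., 2]
    I = (I_R + I_G + I_B) / 3.0
    return I_R, I_G, I_B, I
```

On the embodiment's two example pixels, (0.659, 0.718, 0.463) and (0.275, 0.373, 0.212), this yields luminances of about 0.613 and 0.287 (the embodiment truncates the latter to 0.286).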

S2. Compute the adjusted means of the three RGB channels: compute the mean M_r of the brightest pixels of the red channel whose values exceed the first threshold, as the adjusted mean of the red channel, and compute the ordinary means M_g, M_b of the green and blue channels;
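A sketch of step S2, folding in the refinements stated later in the text (brightest 50% above the threshold for the red channel, and the M_r = min(M_r, M_g) cap); the helper name is mine:

```python
import numpy as np

def adjusted_means(I_R, I_G, I_B, thresh=0.1):
    """Step S2: red-channel mean over the brightest 50% of pixels whose
    value exceeds `thresh`; ordinary means for green and blue; the red
    mean is then capped by the green mean."""
    bright = np.sort(I_R[I_R > thresh], axis=None)
    M_r = bright[bright.size // 2:].mean()  # brightest half
    M_g = I_G.mean()
    M_b = I_B.mean()
    return min(M_r, M_g), M_g, M_b
```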

S3. Correct the color cast of the image: divide each pixel of the R, G, B channels by the corresponding mean to obtain the updated values I′_R, I′_G, I′_B of each channel:

I′_R = I_R / M_r
I′_G = I_G / M_g
I′_B = I_B / M_b

Then stretch the updated values back to the original image brightness:

I″_R = mean(I) * I′_R / mean(I′)
I″_G = mean(I) * I′_G / mean(I′)
I″_B = mean(I) * I′_B / mean(I′)

where I′ denotes the image composed of I′_R, I′_G, I′_B, and mean(·) denotes the mean over the image.
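The two formulas of step S3 translate directly into NumPy (a sketch with illustrative names):

```python
import numpy as np

def correct_color_cast(I_R, I_G, I_B, M_r, M_g, M_b):
    """Step S3: divide each channel by its adjusted mean, then stretch
    the result back to the mean brightness of the original image."""
    I = (I_R + I_G + I_B) / 3.0
    Ip = np.stack([I_R / M_r, I_G / M_g, I_B / M_b], axis=-1)
    Ipp = I.mean() * Ip / Ip.mean()
    return Ipp[..., 0], Ipp[..., 1], Ipp[..., 2]
```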

S4. Compute the receptive-field surround inputs of the color and luminance channels: filter the luminance component I obtained in step S1 and the updated three-channel values (I″_R, I″_G, I″_B) obtained in step S3, respectively, to obtain the surround inputs f_sI, f_sR, f_sG, f_sB of the four channels;
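Step S4's mean filtering can be sketched as a plain NumPy box filter (any standard box/uniform filter gives the same surround input):

```python
import numpy as np

def mean_filter(img, k=9):
    """k*k box (mean) filter with replicated edges, as the step-S4
    surround input; k = 9 matches the embodiment's window width."""
    pad = k // 2
    p = np.pad(img, pad, mode='edge')
    h, w = img.shape
    out = np.zeros((h, w), dtype=float)
    for dy in range(k):
        for dx in range(k):
            out += p[dy:dy + h, dx:dx + w]
    return out / (k * k)
```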

S5. Compute the receptive-field center input of the luminance channel:

Compute the mean M of the luminance channel I obtained in step S1. If M is smaller than the second threshold, the center input f_cI of the luminance-channel receptive field is obtained by adjusting I with a sigmoid function, and (I″_R, I″_G, I″_B) from step S3 are likewise updated again with the sigmoid; otherwise let f_cI = I and leave (I″_R, I″_G, I″_B) unchanged;

S6. Compute the weight of the receptive-field surround for the color and luminance channels: let k denote the surround weight of the RGB and luminance channels, and its calculation formula is:

where λ ranges over the R, G, B channels and A_λ is the maximum value of the corresponding channel. I″_λ(x, y) is the pixel value of I″_R, I″_G, I″_B at position (x, y) after the processing of step S5, and k_MAX is the upper limit of k.

S7. Compute the receptive-field response of the luminance channel: substitute the center and surround inputs f_cI and f_sI computed in steps S5 and S4 into the difference-of-two-Gaussians function to obtain the receptive-field response value of the luminance channel:

rodB_p(x, y) = f_cI(x, y) ⊗ g(m, n; σ_c) − k · f_sI(x, y) ⊗ g(m, n; σ_s)

where ⊗ denotes convolution, f_cI(x, y) and f_sI(x, y) are the receptive-field center and surround inputs at image point (x, y), g(m, n; σ_c) and g(m, n; σ_s) are two-dimensional Gaussian functions of size m*n, and rodB_p is the receptive-field output of the luminance channel.
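The double-Gaussian-difference formula appears only as an image in the source, so the sketch below reconstructs it from the surrounding description (center input under the narrow Gaussian minus k times the surround input under the wide Gaussian); treat the exact combination as an inferred reading rather than the patent's literal equation:

```python
import numpy as np

def gauss_kernel(m, n, sigma):
    """m*n two-dimensional Gaussian kernel, normalized to unit sum."""
    y = np.arange(m) - (m - 1) / 2.0
    x = np.arange(n) - (n - 1) / 2.0
    Y, X = np.meshgrid(y, x, indexing='ij')
    g = np.exp(-(X ** 2 + Y ** 2) / (2.0 * sigma ** 2))
    return g / g.sum()

def conv_same(img, ker):
    """'Same'-size 2-D convolution with replicated edges (NumPy sketch)."""
    kh, kw = ker.shape
    p = np.pad(img, ((kh // 2, kh // 2), (kw // 2, kw // 2)), mode='edge')
    h, w = img.shape
    out = np.zeros((h, w), dtype=float)
    for i in range(kh):
        for j in range(kw):
            out += ker[i, j] * p[i:i + h, j:j + w]
    return out

def dog_response(f_c, f_s, k, sigma_c=0.5, sigma_s=1.5, m=9, n=9):
    """Receptive-field response of steps S7/S9: center input filtered by
    the narrow Gaussian minus k times the surround input filtered by the
    wide Gaussian (reconstructed form). Defaults follow the embodiment."""
    return (conv_same(f_c, gauss_kernel(m, n, sigma_c))
            - k * conv_same(f_s, gauss_kernel(m, n, sigma_s)))
```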

S8. Compute the center inputs of the RGB three-channel receptive fields: apply a gamma transform to the luminance-channel receptive-field output rodB_p obtained in step S7 to get rodB_p^γ, and multiply it with the I″_R, I″_G, I″_B obtained in step S5 to form the receptive-field center inputs f_c of the R, G, B channels, specifically:

f_cR = I″_R * rodB_p^γ

f_cG = I″_G * rodB_p^γ

f_cB = I″_B * rodB_p^γ

where * denotes multiplication;
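Step S8 then reduces to a one-line modulation (illustrative names):

```python
def color_center_input(I_R2, I_G2, I_B2, rodBp, gamma=0.5):
    """Step S8: the gamma-compressed luminance response rodBp**gamma
    modulates each updated color channel to give the color-channel
    receptive-field center inputs f_cR, f_cG, f_cB."""
    mod = rodBp ** gamma
    return I_R2 * mod, I_G2 * mod, I_B2 * mod
```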

S9. Compute and output the RGB three-channel receptive-field responses: as in step S7, substitute the center inputs f_cR, f_cG, f_cB computed in step S8 and the surround inputs f_sR, f_sG, f_sB computed in step S4 into the difference-of-two-Gaussians function to obtain the receptive-field responses of the R, G, B channels:

B_pλ(x, y) = f_cλ(x, y) ⊗ g(m, n; σ_c) − k · f_sλ(x, y) ⊗ g(m, n; σ_s), λ ∈ {R, G, B}

The receptive-field responses B_pR, B_pG, B_pB of the R, G, B channels are the enhanced, dehazed images of the three channels; recombine the three channels into one RGB image as the final output.

Further, the first threshold in step S2 is 0.1, and the second threshold in step S5 is 0.5.

Further, the brightest portion of pixels in step S2 is specifically the brightest 50% of the pixels.

Further, the filtering in step S4 is specifically mean filtering.

Further, in step S2, to prevent the adjusted mean of the red channel from being too high, when the adjusted red-channel mean M_r is greater than the green-channel mean M_g, the green-channel mean M_g is used as the final adjusted mean of the red channel, i.e.:

M_r = min(M_r, M_g).

Further, in step S5 the center input f_cI of the luminance-channel receptive field is adjusted with a sigmoid function as follows:

Furthermore, the window width of the mean filter in step S4 is any size greater than 3*3 and smaller than 15*15, for example 7*7 or 9*9.

Further, the Gaussian functions of the receptive-field center and surround in steps S7 and S9 are specifically of the standard two-dimensional form:

g(x, y; σ) = (1/(2πσ²)) · exp(−(x² + y²)/(2σ²)), σ ∈ {σ_c, σ_s}

where σ_c specifically ranges from 0.2 to 0.8, σ_s is 3 times σ_c, and m, n are specifically integers ranging from 5 to 15.

Further, the value of γ in step S8 specifically ranges from 0.4 to 0.6.

The beneficial effects of the present invention are as follows. The method simulates the feedback relationship between horizontal cells and cone cells in the fish retina to remove the color cast of underwater images, and simulates the center-surround antagonism of fish retinal bipolar cells to remove their blur. Throughout the simulation, the lateral inhibition that horizontal cells exert on bipolar cells is modeled by a difference-of-two-Gaussians filter over the bipolar-cell receptive field; a sigmoid curve models how interplexiform cells continuously release dopamine in the dark to regulate horizontal-cell activity, making the processed image conform more closely to the visual mechanism of fish; finally, a gamma transform models the nonlinear processing of luminance information by amacrine cells, which forms the center input of the color bipolar cells. The algorithm of the invention can be embedded inside a camera as an underwater mode to handle the color cast and blur of underwater images.

Brief Description of the Drawings

Fig. 1 is a flowchart of underwater image processing according to an embodiment of the present invention.

Fig. 2 is an original image taken underwater with color cast and blur problems.

Fig. 3 is the corresponding result after the color cast is removed from the original image.

Fig. 4 is the corresponding image after the original image has been updated twice.

Fig. 5 is the response image of the luminance-channel receptive field.

Fig. 6 is the final output image with color cast and blur removed.

Detailed Description

The embodiments of the present invention are further described below with reference to the accompanying drawings.

Fish adapt remarkably well to the color cast and blur of underwater scenes, and studying the image processing performed by the fish visual system helps solve the color cast and blur problems in underwater images captured by cameras. On this basis, the present invention proposes an underwater image enhancement method based on the fish retinal mechanism, as shown in Fig. 1, comprising the following steps:

For an underwater image of size 768*1024 with color cast and blur problems (as shown in Fig. 2), the detailed steps of the present invention are as follows:

S1. Extract the color components and the luminance component: for each pixel of the underwater image, extract the red component I_R, green component I_G, and blue component I_B, and compute the average luminance component I.

Take example point 1 with pixel values (0.659, 0.718, 0.463) and example point 2 with pixel values (0.275, 0.373, 0.212) in the original input image (Fig. 2). Their average luminance components I are (0.659+0.718+0.463)/3 = 0.613 and (0.275+0.373+0.212)/3 = 0.286, respectively.

S2. Compute the adjusted means of the three RGB channels: compute the mean of each channel; for the R channel, use the brightest 50% of the pixels whose values exceed 0.1. In the original image (Fig. 2), the mean of the top 50% of R-channel pixels with values greater than 0.1 is 0.4231, so M_r = 0.4231; the G-channel mean is 0.5407, so M_g = 0.5407; the B-channel mean is 0.3367, so M_b = 0.3367. Since the R-channel mean is smaller than the G-channel mean, M_r is left unchanged at 0.4231.

Note that, unlike the R-channel mean, the G- and B-channel means are computed in the ordinary way.

S3. Correct the color cast of the image: divide each pixel of the R, G, B channels by the corresponding mean to obtain the updated values I′_R, I′_G, I′_B of each channel; this update removes the color cast.

Dividing the two example pixels' channel values by the means gives the updated values I′_R, I′_G, I′_B as (0.659/0.4231, 0.718/0.5407, 0.463/0.3367) = (1.5576, 1.3279, 1.3751) and (0.275/0.4231, 0.373/0.5407, 0.212/0.3367) = (0.6500, 0.6898, 0.6296).

At this point, the mean mean(I′) of the image I′ composed of I′_R, I′_G, I′_B is 0.9985, and the mean mean(I) of the original image luminance is 0.4170, so the values I′_R, I′_G, I′_B of example point 1, stretched to the original image brightness, become:

I″_R = 0.4170 * 1.5576 / 0.9985 = 0.6505
I″_G = 0.4170 * 1.3279 / 0.9985 = 0.5546
I″_B = 0.4170 * 1.3751 / 0.9985 = 0.5743

Similarly, example point 2 stretched to the original image brightness gives I″_R, I″_G, I″_B of (0.2714, 0.2881, 0.2630). Through this update, the color cast of the original image is removed; Fig. 3 shows the corresponding image after the color cast is removed, and the green cast can be seen to be effectively eliminated.
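The arithmetic of this worked example can be replayed directly with the embodiment's numbers:

```python
# replay step S3 for example point 1 using the embodiment's values
M = (0.4231, 0.5407, 0.3367)           # M_r, M_g, M_b
p1 = (0.659, 0.718, 0.463)
Ip1 = [v / m for v, m in zip(p1, M)]
print([round(v, 4) for v in Ip1])      # [1.5576, 1.3279, 1.3751]

# stretch back to the original brightness, mean(I) = 0.4170, mean(I') = 0.9985
Ipp1 = [0.4170 * v / 0.9985 for v in Ip1]
print([round(v, 4) for v in Ipp1])     # [0.6505, 0.5546, 0.5743]
```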

S4. Compute the receptive-field surround inputs of the color and luminance channels: apply mean filtering to the luminance channel I obtained in S1 and to the updated three-channel values I″_R, I″_G, I″_B obtained in S3, respectively, to obtain the surround inputs f_sI, f_sR, f_sG, f_sB of the four channels.

In this embodiment, taking a mean filter with a 9*9 window as an example, mean-filtering the luminance map I from S1 gives f_sI values of 0.7839 and 0.1327 at the positions of the two example pixels. Mean-filtering the updated RGB images I″_R, I″_G, I″_B from S3 gives f_sR, f_sG, f_sB at the two example points of (0.6597, 0.7313, 0.7448) and (0.2930, 0.1926, 0.1304), respectively.

S5. Compute the receptive-field center input of the luminance channel: compute the mean of the luminance channel. The mean M of the luminance channel I of the original input image (Fig. 2) is 0.4170; since this is less than 0.5, the sigmoid function is used:

to compute the center input f_cI of the luminance-channel receptive field. Substituting the luminances 0.613 and 0.286 of the two example points computed in S1 into the above formula gives luminance-channel receptive-field center inputs f_cI of 0.7559 and 0.1053, respectively.
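The sigmoid formula itself is an image in the source, so its parameters are not stated here. A curve of the form 1/(1 + e^(-a(I - c))) with a = 10 and c = 0.5 reproduces both worked numbers above to within rounding, so the sketch below uses those fitted values; treat them as a reconstruction, not the patent's authoritative formula:

```python
import math

def sigmoid_adjust(v, a=10.0, c=0.5):
    """Candidate step-S5 sigmoid. Parameters a = 10 and c = 0.5 are
    fitted to the worked example (0.613 -> ~0.7559, 0.286 -> 0.1053);
    they are an inference, not values quoted from the patent."""
    return 1.0 / (1.0 + math.exp(-a * (v - c)))

print(round(sigmoid_adjust(0.286), 4))  # 0.1053
```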

Meanwhile, the values (0.6505, 0.5546, 0.5743) of I″_R, I″_G, I″_B computed in S3 are processed with the same function; substitution gives the re-updated values of I″_R, I″_G, I″_B for the two example pixels as (0.8183, 0.6332, 0.6777) and (0.0924, 0.1073, 0.0855), respectively.

Fig. 4 shows the corresponding images after I″_R, I″_G, I″_B are updated again.

S6. Compute the weight of the receptive-field surround for the color and luminance channels: let k denote the surround weight of the RGB and luminance channels, and its calculation formula is:

Here, to avoid over-enhancing the image, a reasonable upper limit should be set on k; in this embodiment the limit is set to 0.4, i.e. k_MAX = 0.4. λ ranges over the R, G, B channels, A_λ is the maximum value of the corresponding channel, and I″_λ(x, y) is the pixel value of I″_R, I″_G, I″_B at position (x, y) after the processing of step S5. Here, the maxima A_λ of the three RGB channels are 0.9465, 0.8778, 0.9333, and the computed results for the three channels are 0.2103, 0.2066, 0.1432, giving 0.2151; since 0.2151 < 0.4, k takes the value 0.2151.

S7. Compute the receptive-field response of the luminance channel: substitute the luminance-channel center and surround inputs f_cI and f_sI computed in steps S5 and S4 into the difference-of-two-Gaussians function to obtain the receptive-field response value of the luminance channel.

In this embodiment, taking σ_c = 0.5, σ_s = 1.5, m = n = 9 as an example, the receptive-field response values rodB_p of the luminance channel computed at the two example points are 0.4745 and 0.2608, respectively.

Fig. 5 shows the response map of the luminance-channel receptive field.

S8. Compute the center inputs of the RGB three-channel receptive fields: apply a gamma transform to the luminance-channel receptive-field output rodB_p from S7, and multiply it with the I″_R, I″_G, I″_B obtained in step S5 to form the receptive-field center inputs f_c of the R, G, B channels.

Taking γ = 0.5 in the gamma transform of this embodiment, the center inputs f_cR, f_cG, f_cB of the RGB three-channel receptive fields are computed. The computation and results for example point 1 are:

f_cR = I″_R * rodB_p^γ = 0.8183 * 0.4745^0.5 = 0.5637

f_cG = I″_G * rodB_p^γ = 0.6332 * 0.4745^0.5 = 0.4363

f_cB = I″_B * rodB_p^γ = 0.6777 * 0.4745^0.5 = 0.4669

Similarly, the receptive-field center inputs f_cR, f_cG, f_cB for example point 2 compute to (0.0637, 0.0739, 0.0589).
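Replaying step S8 for example point 1 from the rounded inputs reproduces the values above to within about 1e-4 (the embodiment evidently carried unrounded intermediates):

```python
# step S8 for example point 1, gamma = 0.5
I2 = (0.8183, 0.6332, 0.6777)           # I''_R, I''_G, I''_B after step S5
mod = 0.4745 ** 0.5                     # rodBp ** gamma
f_c = tuple(c * mod for c in I2)        # close to (0.5637, 0.4363, 0.4669)
print(tuple(round(v, 3) for v in f_c))  # (0.564, 0.436, 0.467)
```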

S9. Compute and output the RGB three-channel receptive-field responses: as in step S7, this embodiment uses the difference-of-two-Gaussians function to compute the receptive-field responses of the R, G, B channels.

Substituting f_cR, f_cG, f_cB computed in S8 and k computed in S6 into the above formula, with σ_c and σ_s taking the same values as in step S7 (σ_c = 0.5, σ_s = 1.5, m = n = 9), gives the three-channel receptive-field responses B_pR, B_pG, B_pB. At the positions of the two example points, B_pR, B_pG, B_pB compute to (1, 0.778, 0.865) and (0.022, 0.0184, 0.0376), respectively. Finally, the computed results are output.

Fig. 6 shows the final output image; compared with the original image (Fig. 2), the color cast and blur are effectively removed.

The simple example above is explained mainly in terms of single pixel values; the actual computation runs over all pixels of the whole image. Through this simple example, the full process by which the invention simulates the fish retina to remove color cast and blur has been illustrated.

The embodiments described here are intended to help the reader understand the principles of the present invention, and it should be understood that the protection scope of the invention is not limited to these particular statements and embodiments. Those of ordinary skill in the art can make various other specific modifications and combinations that do not depart from the essence of the invention based on the technical teachings disclosed herein, and such modifications and combinations remain within the protection scope of the invention.

Claims (9)

1. An underwater image enhancement method based on a fish retina mechanism, comprising the following steps:
S1, extracting color components and a brightness component: respectively extracting, for each pixel of the underwater image, the red component I_R, the green component I_G, and the blue component I_B, and calculating the average luminance component I:
I = (I_R + I_G + I_B)/3
S2, calculating the adjusted means of the RGB three channels: calculating the mean M_r of the brightest pixels of the red channel whose values are greater than a first threshold, as the adjusted mean of the red channel, and calculating the means M_g, M_b of the green channel and the blue channel;
S3, correcting the color cast of the image: dividing each pixel of the R, G, B channels by its corresponding mean to obtain the updated values I′_R, I′_G, I′_B of each channel, by the following formulas:
I′_R = I_R / M_r
I′_G = I_G / M_g
I′_B = I_B / M_b
and then stretching the updated values to the brightness of the original image, by the following formulas:
I″_R = mean(I) * I′_R / mean(I′)
I″_G = mean(I) * I′_G / mean(I′)
I″_B = mean(I) * I′_B / mean(I′)
where I' denotes the image formed by I'_R, I'_G, I'_B, and mean(·) denotes the mean value of an image;
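Step S3's division and brightness stretch can be sketched as below (a hedged illustration; `correct_color_cast` is a hypothetical name, and M_r, M_g, M_b are assumed to come from step S2):

```python
import numpy as np

def correct_color_cast(I_R, I_G, I_B, M_r, M_g, M_b):
    """Divide each channel by its adjusted mean (I'), then rescale so the
    result keeps the original image's mean luminance (I'')."""
    Ip_R, Ip_G, Ip_B = I_R / M_r, I_G / M_g, I_B / M_b
    scale = np.mean((I_R + I_G + I_B) / 3.0) / np.mean((Ip_R + Ip_G + Ip_B) / 3.0)
    return Ip_R * scale, Ip_G * scale, Ip_B * scale
```

By construction the stretched result has the same mean luminance as the input, which is the point of the second formula group.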
S4, calculating the surround inputs of the color and brightness channels: filter the luminance component I obtained in step S1 and the updated values of the R, G, B channels (I''_R, I''_G, I''_B) obtained in step S3, respectively, to obtain the receptive-field surround (peripheral) inputs f_sI, f_sR, f_sG, f_sB of the four channels;
S5, calculating the center input of the receptive field of the brightness channel:
calculate the mean value M of the brightness channel I obtained in step S1; if M is smaller than a second threshold, adjust the center input f_cI of the brightness-channel receptive field with a sigmoid function and update the (I''_R, I''_G, I''_B) obtained in step S3 with the same sigmoid; otherwise, set f_cI = I and leave (I''_R, I''_G, I''_B) unchanged;
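The gating in step S5 might look like the following sketch; the default threshold (0.5) is taken from claim 2, the sigmoid is the one given in claim 6, and the returned flag indicating whether to re-map the color channels is an implementation assumption:

```python
import numpy as np

def rod_center_input(I, second_threshold=0.5):
    """Return the rod-pathway (brightness) center input f_cI, plus a flag
    saying whether the color channels should be remapped with the same
    sigmoid (only done for dim scenes)."""
    sigmoid = lambda x: 1.0 / (1.0 + np.exp(-10.0 * (x - 0.5)))
    if I.mean() < second_threshold:   # dim scene: compress with the sigmoid
        return sigmoid(I), True
    return I, False                   # bright scene: pass through unchanged
```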
S6, calculating the receptive-field surround weight shared by the color and brightness channels: let k denote the surround weight of the R, G, B and brightness channels, calculated as:
k = min(k_MAX, mean(min_λ (I''_λ(x, y) / A_λ)))
where λ ranges over the R, G, B channels, A_λ is the maximum value of channel λ, I''_λ(x, y) is the pixel value at position (x, y) of the I''_R, I''_G, I''_B processed in step S5, and k_MAX is the upper limit of k;
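Step S6's surround weight can be sketched as below; `k_max` (the upper limit k_MAX) is left as a free parameter, and the per-channel maximum stands in for A_λ:

```python
import numpy as np

def surround_weight(Ipp_R, Ipp_G, Ipp_B, k_max=1.0):
    """k = min(k_MAX, mean over pixels of min over channels of I''/A)."""
    chans = np.stack([Ipp_R, Ipp_G, Ipp_B])                # (3, H, W)
    A = chans.reshape(3, -1).max(axis=1).reshape(3, 1, 1)  # A_lambda per channel
    ratio = (chans / A).min(axis=0)                        # min over lambda at each pixel
    return min(k_max, float(ratio.mean()))
```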
S7, calculating the receptive-field response of the brightness channel: substitute the center and surround inputs f_cI and f_sI calculated in steps S5 and S4 into the difference-of-Gaussians function to obtain the receptive-field response of the brightness channel:
rodB_p = max[0, f_cI(x, y) ⊗ g(m, n; σ_c) - k · f_sI(x, y) ⊗ g(m, n; σ_s)]
where ⊗ denotes convolution, f_cI(x, y) and f_sI(x, y) are the receptive-field center and surround inputs at point (x, y) of the image, g(m, n; σ_c) and g(m, n; σ_s) are two-dimensional Gaussian functions of size m × n, and rodB_p is the receptive-field output of the brightness channel.
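Steps S7 and S9 share the same difference-of-Gaussians operation; a self-contained NumPy sketch follows (kernel size and sigma defaults are illustrative, and the edge padding and kernel normalization are assumptions):

```python
import numpy as np

def gaussian_kernel(size, sigma):
    """Square 2-D Gaussian kernel, normalized so flat regions pass through."""
    ax = np.arange(size) - size // 2
    m, n = np.meshgrid(ax, ax, indexing="ij")
    g = np.exp(-(m**2 + n**2) / (2 * sigma**2))
    return g / g.sum()

def dog_response(f_c, f_s, k, sigma_c=0.5, sigma_s=2.0, size=7):
    """max[0, f_c * G(sigma_c) - k * f_s * G(sigma_s)] with 'same' output size."""
    pad = size // 2
    def conv(img, ker):
        padded = np.pad(img, pad, mode="edge")
        out = np.zeros_like(img, dtype=float)
        for i in range(img.shape[0]):
            for j in range(img.shape[1]):
                out[i, j] = (padded[i:i + size, j:j + size] * ker).sum()
        return out
    center = conv(f_c, gaussian_kernel(size, sigma_c))
    surround = conv(f_s, gaussian_kernel(size, sigma_s))
    return np.maximum(0.0, center - k * surround)   # half-wave rectification
```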
S8, calculating the center inputs of the R, G, B receptive fields: apply a gamma transform to the brightness-channel response rodB_p obtained in step S7 to obtain rodB_p^γ, and multiply it with the I''_R, I''_G, I''_B processed in step S5 to form the center inputs f_c of the R, G, B receptive fields, specifically:
f_cR = I''_R * rodB_p^γ
f_cG = I''_G * rodB_p^γ
f_cB = I''_B * rodB_p^γ
where * denotes pixel-wise multiplication;
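Step S8 is a pixel-wise modulation; it is sketched below with γ = 0.5 taken from claim 9's 0.4-0.6 range (the function name is hypothetical):

```python
import numpy as np

def color_center_inputs(Ipp_R, Ipp_G, Ipp_B, rodB_p, gamma=0.5):
    """Modulate each color channel by the gamma-transformed rod response
    to form the cone-pathway center inputs f_cR, f_cG, f_cB."""
    rod = np.power(rodB_p, gamma)
    return Ipp_R * rod, Ipp_G * rod, Ipp_B * rod
```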
S9, calculating and outputting the R, G, B receptive-field responses: in the same way as step S7, substitute the center inputs f_cR, f_cG, f_cB obtained in step S8 and the surround inputs f_sR, f_sG, f_sB obtained in step S4 into the difference-of-Gaussians function to calculate the receptive-field responses of the R, G, B channels:
B_pR = max[0, f_cR(x, y) ⊗ g(m, n; σ_c) - k · f_sR(x, y) ⊗ g(m, n; σ_s)]
B_pG = max[0, f_cG(x, y) ⊗ g(m, n; σ_c) - k · f_sG(x, y) ⊗ g(m, n; σ_s)]
B_pB = max[0, f_cB(x, y) ⊗ g(m, n; σ_c) - k · f_sB(x, y) ⊗ g(m, n; σ_s)]
the receptive-field responses B_pR, B_pG, B_pB of the R, G, B channels are the enhanced, defogged images of the three channels; recombine them into an RGB image as the final output.
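The final recombination in step S9 simply stacks the three rectified responses back into an RGB image; the clip to [0, 1] for display is an assumption not stated in the claim:

```python
import numpy as np

def recombine(B_pR, B_pG, B_pB):
    """Stack the three channel responses into an (H, W, 3) RGB image."""
    return np.clip(np.stack([B_pR, B_pG, B_pB], axis=-1), 0.0, 1.0)
```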
2. The underwater image enhancement method based on fish retina mechanism of claim 1, wherein said first threshold of step S2 is 0.1, and said second threshold of step S5 is 0.5.
3. The underwater image enhancement method based on the fish retina mechanism of claim 1, wherein the brightest portion of the pixels in step S2 is specifically the brightest 50% of the pixels.
4. The underwater image enhancement method based on the fish retina mechanism of claim 1, wherein the filtering of step S4 is mean filtering.
5. The underwater image enhancement method based on a fish retina mechanism as claimed in claim 1, wherein in step S2, to avoid an excessively high adjusted red-channel mean, when the adjusted red-channel mean M_r is greater than the green-channel mean M_g, the green-channel mean M_g is used as the final adjusted mean of the red channel, namely:
M_r = min(M_r, M_g).
6. The underwater image enhancement method based on a fish retina mechanism of claim 1, wherein in step S5 the center input f_cI of the brightness-channel receptive field is adjusted with the sigmoid function as follows:
f_cI = 1 / (1 + e^(-10(I - 0.5))).
7. the underwater image enhancement method based on the fish retina mechanism of claim 4, wherein the window width of the mean filter of step S4 is any size greater than 3 × 3 and less than 15 × 15.
8. The underwater image enhancement method based on a fish retina mechanism of claim 1, wherein the Gaussian functions of the receptive-field center and surround in steps S7 and S9 are specifically:
g(m, n; σ_c) = (1 / (2πσ_c)) · e^(-(m² + n²) / (2σ_c²))
g(m, n; σ_s) = (1 / (2πσ_s)) · e^(-(m² + n²) / (2σ_s²))
where σ_c takes values in the range 0.2-0.8, σ_s is taken as a multiple of σ_c, and m and n are integers in the range 5-15.
9. The underwater image enhancement method based on a fish retina mechanism according to claim 1, wherein the value of γ in step S8 ranges from 0.4 to 0.6.
CN201710573257.XA 2017-07-10 2017-07-10 An underwater image enhancement method based on fish retina mechanism Active CN107169942B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710573257.XA CN107169942B (en) 2017-07-10 2017-07-10 An underwater image enhancement method based on fish retina mechanism


Publications (2)

Publication Number Publication Date
CN107169942A true CN107169942A (en) 2017-09-15
CN107169942B CN107169942B (en) 2020-07-07

Family

ID=59818650

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710573257.XA Active CN107169942B (en) 2017-07-10 2017-07-10 An underwater image enhancement method based on fish retina mechanism

Country Status (1)

Country Link
CN (1) CN107169942B (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107909617A (en) * 2017-11-13 2018-04-13 四川大学 A kind of light source colour method of estimation based on non-linear contrast's weighting
CN108537852A (en) * 2018-04-17 2018-09-14 四川大学 A kind of adaptive color shape constancy method based on Image Warping
CN109919873A (en) * 2019-03-07 2019-06-21 电子科技大学 A Fundus Image Enhancement Method Based on Image Decomposition
CN111639588A (en) * 2020-05-28 2020-09-08 深圳壹账通智能科技有限公司 Image effect adjusting method, device, computer system and readable storage medium
CN112348904A (en) * 2020-10-23 2021-02-09 影石创新科技股份有限公司 Underwater image and underwater video color restoration method and device

Citations (5)

Publication number Priority date Publication date Assignee Title
CN103955900A (en) * 2014-05-07 2014-07-30 电子科技大学 Image defogging method based on biological vision mechanism
US20150063718A1 (en) * 2013-08-30 2015-03-05 Qualcomm Incorporated Techniques for enhancing low-light images
CN105825483A (en) * 2016-03-21 2016-08-03 电子科技大学 Haze and dust removing method for image
CN106127823A (en) * 2016-06-24 2016-11-16 电子科技大学 A kind of coloured image dynamic range compression method
CN106600547A (en) * 2016-11-17 2017-04-26 天津大学 Underwater image restoration method

Non-Patent Citations (2)

Title
CHEN-JUI CHUNG; WEI-YAO CHOU; CHIA-WEN LIN: "Under-exposed image enhancement using exposure compensation", 2013 13TH INTERNATIONAL CONFERENCE ON ITS TELECOMMUNICATIONS (ITST) *
SHI DAN; LI QINGWU; FAN XINNAN; HUO GUANYING: "Underwater Image Enhancement Algorithm Based on Contourlet Transform and Multi-scale Retinex", LASER & OPTOELECTRONICS PROGRESS *

Also Published As

Publication number Publication date
CN107169942B (en) 2020-07-07

Similar Documents

Publication Publication Date Title
US11625815B2 (en) Image processor and method
CN112785534B (en) A method for removing ghosting and multi-exposure image fusion in dynamic scenes
CN102129673B (en) Color digital image enhancing and denoising method under random illumination
CN107169942B (en) An underwater image enhancement method based on fish retina mechanism
CN104618700B (en) Enhanced display method for color high dynamic range image
CN105205794B (en) A kind of synchronous enhancing denoising method of low-light (level) image
CN106875352A (en) A kind of enhancement method of low-illumination image
CN107045715A (en) A kind of method that single width low dynamic range echograms generates high dynamic range images
CN103530848A (en) Double exposure implementation method for inhomogeneous illumination image
CN108022223B (en) A Tone Mapping Method Based on Logarithmic Mapping Function Block Processing and Fusion
CN106204662B (en) A kind of color of image constancy method under multiple light courcess environment
Lv et al. Low-light image enhancement via deep Retinex decomposition and bilateral learning
TWI520101B (en) Method for making up skin tone of a human body in an image, device for making up skin tone of a human body in an image, method for adjusting skin tone luminance of a human body in an image, and device for adjusting skin tone luminance of a human body in
CN106169182A (en) A kind of method synthesizing several different exposure images
CN110335221A (en) A multi-exposure image fusion method based on unsupervised learning
CN107492075A (en) A kind of method of individual LDR image exposure correction based on details enhancing
Tang et al. A local flatness based variational approach to retinex
Liu et al. Color enhancement using global parameters and local features learning
CN109035155A (en) A kind of more exposure image fusion methods of halation removal
CN118781001B (en) A fast low-light image sharpening method based on depthwise separable convolution
CN111462002A (en) Underwater image enhancement and restoration method based on convolutional neural network
CN103955900B (en) Image defogging method based on biological vision mechanism
CN109671044B (en) A kind of more exposure image fusion methods decomposed based on variable image
CN106157305A (en) High-dynamics image rapid generation based on local characteristics
CN106127823B (en) A kind of color image dynamic range compression method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant