
CN104217436B - SAR image segmentation method based on multiple features combining sparse graph - Google Patents

SAR image segmentation method based on multiple features combining sparse graph

Info

Publication number
CN104217436B
CN104217436B · CN201410472497.7A · CN104217436A
Authority
CN
China
Prior art keywords
superpixels
superpixel
feature
sar image
histogram
Prior art date
Legal status
Active
Application number
CN201410472497.7A
Other languages
Chinese (zh)
Other versions
CN104217436A (en)
Inventor
焦李成
古晶
马文萍
杨淑媛
刘红英
熊涛
侯彪
王爽
霍丽娜
Current Assignee
Xidian University
Original Assignee
Xidian University
Priority date
Filing date
Publication date
Application filed by Xidian University filed Critical Xidian University
Priority to CN201410472497.7A priority Critical patent/CN104217436B/en
Publication of CN104217436A publication Critical patent/CN104217436A/en
Application granted granted Critical
Publication of CN104217436B publication Critical patent/CN104217436B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Radar Systems Or Details Thereof (AREA)

Abstract

The invention discloses a SAR image segmentation method based on a multi-feature joint sparse graph, which mainly addresses the problems that existing methods adversely affect the segmentation result and that their computational complexity grows as the image scale increases. The implementation steps are: 1) input the image to be segmented; 2) over-segment the image to obtain superpixels; 3) extract four feature sets from the superpixels; 4) jointly sparse-represent the four feature sets to obtain four sparse representation coefficients; 5) fuse the four sparse representation coefficients into one global sparse representation coefficient; 6) compute the local spatial neighborhood correlation of the superpixels; 7) combine the global sparse representation coefficient with the local spatial neighborhood correlation to generate a joint sparse graph; 8) segment the vertices of the joint sparse graph with a spectral clustering algorithm to obtain the final segmentation result. The invention segments SAR images effectively, is robust to noise, and can be used for automatic target recognition in SAR images.

Description

SAR image segmentation method based on a multi-feature joint sparse graph

Technical Field

The invention belongs to the technical field of image processing and in particular relates to SAR image segmentation for use in SAR image target recognition.

Background Art

Synthetic aperture radar (SAR) is a high-resolution imaging radar that can penetrate clouds and vegetation, is almost unaffected by weather conditions, and can operate day and night in all weather, so it is widely used in ground monitoring and recognition. Because of the characteristics of the SAR system, SAR images are not as intuitive as optical remote sensing images, so their subsequent understanding and interpretation are critical. SAR image segmentation, as one of the key links in SAR image interpretation, has attracted extensive attention in recent years. Many methods have been used for SAR image segmentation, such as thresholding, clustering, support vector machines, and Markov random fields.

Segmentation methods based on support vector machines and Markov random fields require a small number of labeled samples, which are nevertheless difficult to obtain accurately for SAR images. For threshold-based segmentation, it is usually difficult to choose an appropriate threshold. Many mature clustering algorithms have been applied to SAR image segmentation, but most of them operate at the pixel level; the complexity of such methods grows sharply as the image scale increases, which limits their wider application.

In addition, since SAR images are formed by the interference of radar echoes, they inevitably contain coherent speckle noise. It is well known that suppressing speckle noise is critical in SAR image interpretation. To avoid the influence of noise on the segmentation result, most existing SAR image segmentation algorithms first preprocess the SAR image with a filtering algorithm and then segment it with other techniques. However, while filtering reduces noise, it also blurs the edges and textures in the image, causing an irreversible loss of detail that may make the final segmentation inaccurate.

Summary of the Invention

The purpose of the present invention is to address the deficiencies of the prior art described above by proposing a SAR image segmentation method based on a joint sparse graph, so as to avoid the influence of coherent speckle noise and parameter selection and to obtain better SAR image segmentation results.

To achieve the above object, the technical solution of the present invention comprises the following steps:

(1) Input the SAR image to be segmented and determine the number of classes k ≥ 2 into which the image is to be divided;

(2) Over-segment the input SAR image with a region growing algorithm to generate a superpixel set Y consisting of multiple homogeneous regions;

(3) For each superpixel in the superpixel set Y, extract its gray-level histogram feature, local binary pattern histogram feature, Gabor filter bank feature, and gray-level co-occurrence matrix statistical feature, forming the gray-level histogram feature set X1, the local binary pattern histogram feature set X2, the Gabor filter bank feature set X3, and the gray-level co-occurrence matrix statistical feature set X4;

(4) Jointly sparse-represent the above four feature sets X1, X2, X3, X4 of all superpixels to obtain the sparse representation coefficients Z1, Z2, Z3, Z4 of the four feature sets, each of size n×n, where n is the number of superpixels;

(5) For the four n×n sparse representation coefficients Z1, Z2, Z3, Z4, take the elements [Z1]ij, [Z2]ij, [Z3]ij, [Z4]ij at any same position, where 1 ≤ i ≤ n, 1 ≤ j ≤ n, and fuse these four elements to obtain the element at the same position of the global sparse representation coefficient; fusing every element of the four sparse representation coefficients in this way gives the global sparse representation coefficient S;

(6) Construct the local spatial neighborhood correlation C of the superpixels from their adjacency relationships. C is a square matrix whose number of rows and columns equals the number of superpixels; for any two superpixels Yi and Yj, the j-th element Cij of the i-th row of C is 1 when Yi and Yj are adjacent, and 0 otherwise;

(7) Combine the global sparse representation coefficient S with the local spatial neighborhood correlation C to generate the adjacency matrix of the joint sparse graph, G = (1/2)(P + P^T), where P = S·exp(C/2σ²), T denotes transpose, and σ is a regularization parameter set to 1.4;

(8) Use the normalized cut (Ncuts) algorithm of spectral clustering to partition the vertices of the joint sparse graph into k classes, obtaining the final segmentation result.
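
As an illustration of how steps (1)–(8) fit together, the following driver strings the stages into one function. It assumes the helper sketches given in the detailed description below; all helper names and signatures are illustrative, not part of the patent, and the joint sparse representation of step (4) is passed in as a callable because its ADMM solver is not sketched in full here.

```python
def segment_sar_image(image, k, sparse_solver, th=45, sigma=1.4):
    """Illustrative end-to-end driver for summary steps (1)-(8).

    `sparse_solver` is a callable implementing the joint sparse
    representation of step (4); all helper names are assumptions that
    match the sketches given in the detailed description below.
    """
    labels = region_growing_superpixels(image, th)        # step (2)
    features = [gray_histogram_features(image, labels),   # step (3)
                lbp_histogram_features(image, labels),
                gabor_features(image, labels),
                glcm_features(image, labels)]
    Z1, Z2, Z3, Z4 = sparse_solver(features)              # step (4)
    S = fuse_coefficients([Z1, Z2, Z3, Z4])               # step (5)
    C = neighborhood_matrix(labels)                       # step (6)
    G = joint_sparse_graph(S, C, sigma)                   # step (7)
    classes = ncuts_segmentation(G, k)                    # step (8)
    return classes[labels - 1]                            # per-pixel class map
```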

Compared with the prior art, the present invention has the following advantages:

1) The present invention does not need two separate methods to reduce noise and to mine the global structure of the image; a single joint sparse graph built on multiple superpixel features achieves both purposes. Therefore, without increasing the computational complexity, it can handle a larger image scale and avoids the influence of filtering algorithms on the segmentation result.

2) By exploiting the local spatial neighborhood information of the image, the present invention enhances the classification ability of the joint sparse graph and thus obtains better image segmentation results.

Brief Description of the Drawings

Fig. 1 is the overall implementation flowchart of the present invention;

Fig. 2 compares the segmentation results of the present invention and existing methods on a two-class SAR image;

Fig. 3 compares the segmentation results of the present invention and existing methods on a three-class SAR image.

Detailed Description

Referring to Fig. 1, the specific implementation steps of the present invention are as follows:

Step 1. Input the SAR image to be segmented, judge from the image content which main targets and background need to be recognized, and determine the number of segmentation classes k; in this example k takes the values 2 and 3.

Step 2. Generate superpixels with a region growing algorithm.

(2a) For the SAR image to be segmented, in which no pixel has yet been merged into any region, let the first pixel in the upper-left corner of the image be (x0, y0);

(2b) Compare the pixel value of any pixel (x, y) in the 8-neighborhood centered at pixel (x0, y0) with the threshold Th = 45; if the pixel value of (x, y) is less than Th, merge (x, y) and (x0, y0) into one region and push (x, y) onto the stack;

(2c) Pop a pixel from the stack, take it as (x0, y0), and return to step (2b);

(2d) When the stack is empty, scan the image in order, find the first pixel in the SAR image to be segmented that has not yet been merged into any region, set this pixel as (x0, y0), and return to step (2b);

(2e) Repeat steps (2b) to (2d) until every pixel in the SAR image to be segmented has been merged into some region, obtaining multiple regions;

(2f) Treat each region as a superpixel; all the regions together form the superpixel set Y.
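
Steps (2a)–(2f) amount to a stack-based flood fill; a minimal Python sketch is given below. It follows the text literally (a neighbour is merged when its gray value is below Th = 45); the function name and the use of NumPy are choices of this sketch, not part of the patent.

```python
import numpy as np

def region_growing_superpixels(image, th=45):
    """Stack-based region growing as described in steps (2a)-(2f).

    Returns an integer label map in which each grown region (treated as
    one superpixel) gets its own label, starting at 1.
    Note: following the text literally, a neighbour is merged when its
    gray value is below `th`; practical variants often threshold the
    gray-level difference to the seed instead.
    """
    h, w = image.shape
    labels = np.zeros((h, w), dtype=np.int32)        # 0 = not yet merged
    current = 0
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, -1),
               (0, 1), (1, -1), (1, 0), (1, 1)]      # 8-neighbourhood

    for y0 in range(h):
        for x0 in range(w):
            if labels[y0, x0]:
                continue                             # step (2d): next unmerged pixel
            current += 1
            labels[y0, x0] = current                 # a new seed starts a new region
            stack = [(y0, x0)]
            while stack:                             # steps (2b)-(2c)
                cy, cx = stack.pop()
                for dy, dx in offsets:
                    ny, nx = cy + dy, cx + dx
                    if 0 <= ny < h and 0 <= nx < w and not labels[ny, nx] \
                            and image[ny, nx] < th:
                        labels[ny, nx] = current
                        stack.append((ny, nx))
    return labels
```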

Step 3. Extract the four kinds of features of all superpixels to form four feature sets.

SAR images contain rich amplitude and texture information. To give each superpixel a more accurate label, texture analysis is performed on the superpixel set Y before it is partitioned, extracting four kinds of features from each superpixel. The specific steps are as follows, and a code sketch of the four extractors is given after step (3d):

(3a) Compute the histogram of each superpixel, and arrange the histogram feature of any superpixel Yj into a column to obtain the histogram feature column vector x1,j of superpixel Yj; following the order of the superpixels in the set Y, arrange the histogram feature column vectors of all superpixels in turn to form the histogram feature set X1 = (x1,1, x1,2, …, x1,j, …, x1,n), where 1 ≤ j ≤ n and n is the number of superpixels;

(3b) For each pixel of the SAR image to be segmented, compute the local binary pattern within its 8-neighborhood and compute the local binary pattern histogram of each superpixel; arrange the local binary pattern histogram feature of any superpixel Yj into a column to obtain the local binary pattern histogram feature column vector x2,j of superpixel Yj; following the order of the superpixels in the set Y, arrange the local binary pattern histogram feature column vectors of all superpixels in turn to form the local binary pattern histogram feature set X2 = (x2,1, x2,2, …, x2,j, …, x2,n), where 1 ≤ j ≤ n;

(3c) Compute 40 Gabor filters over 5 scales and 8 orientations to form a Gabor filter bank; filter the SAR image to be segmented with the filter bank and compute, for each superpixel, the mean of each filter's response; arrange the 40 Gabor response means of any superpixel Yj into a column to obtain the Gabor filter bank feature column vector x3,j of superpixel Yj; following the order of the superpixels in the set Y, arrange the Gabor filter bank feature column vectors of all superpixels in turn to form the Gabor filter bank feature set X3 = (x3,1, x3,2, …, x3,j, …, x3,n), where 1 ≤ j ≤ n;

(3d) Compute the gray-level co-occurrence matrix of each superpixel and compute its 14 statistical features: angular second moment, contrast, correlation, sum of squares, inverse difference moment, sum average, sum variance, entropy, sum entropy, variation, variance of entropy, information correlation in the X direction, information correlation in the Y direction, and maximum correlation coefficient; arrange the 14 statistical features of any superpixel Yj into a column to obtain the gray-level co-occurrence matrix statistical feature column vector x4,j of superpixel Yj; following the order of the superpixels in the set Y, arrange the gray-level co-occurrence matrix statistical feature column vectors of all superpixels in turn to form the gray-level co-occurrence matrix statistical feature set X4 = (x4,1, x4,2, …, x4,j, …, x4,n), where 1 ≤ j ≤ n.
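
The four per-superpixel feature extractors of steps (3a)–(3d) can be sketched with NumPy and scikit-image as follows. The histogram bin count, the 'uniform' LBP mapping, the Gabor frequencies, and the GLCM distances and angles are assumptions (the patent does not fix them), and scikit-image exposes only six of the fourteen GLCM statistics listed in step (3d), so the remaining eight would need custom code.

```python
import numpy as np
from scipy.ndimage import convolve
from skimage.feature import local_binary_pattern, graycomatrix, graycoprops
from skimage.filters import gabor_kernel

def gray_histogram_features(image, labels, bins=256):
    """Step (3a): per-superpixel gray-level histograms, one column per superpixel."""
    n = labels.max()
    X1 = np.zeros((bins, n))
    for j in range(1, n + 1):
        X1[:, j - 1], _ = np.histogram(image[labels == j], bins=bins,
                                       range=(0, 256), density=True)
    return X1

def lbp_histogram_features(image, labels, P=8, R=1):
    """Step (3b): per-superpixel histograms of 8-neighbourhood local binary patterns."""
    lbp = local_binary_pattern(image, P, R, method="uniform")
    n_bins = P + 2                                   # number of 'uniform' codes
    n = labels.max()
    X2 = np.zeros((n_bins, n))
    for j in range(1, n + 1):
        X2[:, j - 1], _ = np.histogram(lbp[labels == j], bins=n_bins,
                                       range=(0, n_bins), density=True)
    return X2

def gabor_features(image, labels, frequencies=(0.05, 0.1, 0.2, 0.3, 0.4)):
    """Step (3c): mean magnitude of 5 scales x 8 orientations = 40 Gabor responses."""
    img = image.astype(float)
    responses = []
    for f in frequencies:                            # 5 scales (assumed frequencies)
        for o in range(8):                           # 8 orientations
            kern = gabor_kernel(f, theta=o * np.pi / 8)
            real = convolve(img, np.real(kern), mode="reflect")
            imag = convolve(img, np.imag(kern), mode="reflect")
            responses.append(np.hypot(real, imag))   # response magnitude
    n = labels.max()
    X3 = np.zeros((len(responses), n))
    for j in range(1, n + 1):
        mask = labels == j
        X3[:, j - 1] = [r[mask].mean() for r in responses]
    return X3

def glcm_features(image, labels,
                  props=("ASM", "contrast", "correlation",
                         "energy", "homogeneity", "dissimilarity")):
    """Step (3d): GLCM statistics per superpixel (subset available in scikit-image)."""
    n = labels.max()
    X4 = np.zeros((len(props), n))
    for j in range(1, n + 1):
        ys, xs = np.nonzero(labels == j)
        patch = image[ys.min():ys.max() + 1, xs.min():xs.max() + 1]  # bounding box
        glcm = graycomatrix(patch.astype(np.uint8), distances=[1],
                            angles=[0, np.pi / 2], levels=256,
                            symmetric=True, normed=True)
        # nan_to_num guards degenerate (e.g. single-pixel) superpixels
        X4[:, j - 1] = np.nan_to_num([graycoprops(glcm, p).mean() for p in props])
    return X4
```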

Step 4. Perform joint sparse representation on the four feature sets of all superpixels.

To combine the four kinds of features effectively, the present invention exploits the cross information among them so that the different features of the same superpixel share a consistent sparsity pattern. The above four feature sets X1, X2, X3, X4 of all superpixels are jointly sparse-represented to obtain four sparse representation coefficients Z1, Z2, Z3, Z4, which are obtained by solving an optimization problem, referred to below as formula <1>, whose objective combines a Frobenius-norm data-fidelity term for each feature set, the column-wise ℓ1 penalty ||Zf||col,1 on each Zf, and the joint-sparsity penalty ||Z||1,2, weighted by the parameters α and β, subject to

diag(Zf) = 0, f = 1, …, m,

where Xf = [xf,1, xf,2, …, xf,j, …, xf,n] denotes the f-th feature set, xf,j is the feature of the j-th superpixel in the superpixel set Y, and n is the number of superpixels; α > 0 and β > 0 are two parameters that balance the contributions of the terms, set here to α = 0.5, β = 0.3, with m = 4; ||·||F is the Frobenius norm of a matrix, used for the data-fidelity term; ||Zf||col,1 is the sum of the ℓ1 norms of the columns of Zf. When f = 1 the sparse representation coefficient of feature set X1 is Z1, when f = 2 that of X2 is Z2, when f = 3 that of X3 is Z3, and when f = 4 that of X4 is Z4. Z is the joint sparse representation coefficient formed from Z1, Z2, Z3, Z4; as used in step (5f) below, its f-th column is the n²×1 column-wise vectorization of Zf, so Z is an n²×m matrix.

Here ||Z||1,2 denotes the ℓ1,2 norm of the joint sparse representation coefficient Z, where Zij is the j-th element of the i-th row of Z.
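
The defining formula of ||Z||1,2 is not reproduced on this page. One common reading, consistent with the row-wise update of step (5d) below, is the sum of the ℓ2 norms of the rows of Z; treating that definition as an assumption, it can be computed as:

```python
import numpy as np

def l12_norm(Z):
    """Assumed definition of ||Z||_{1,2}: the sum of the l2 norms of the rows of Z."""
    return np.sqrt((Z ** 2).sum(axis=1)).sum()
```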

Step 5. Compute the four sparse representation coefficients with the alternating direction method of multipliers.

To compute the four sparse representation coefficients Z1, Z2, Z3, Z4, formula <1> is solved with the alternating direction method of multipliers (ADMM). Formula <1> is first converted into an equivalent form that introduces auxiliary variables Hf and Jf under the constraints

Zf = Hf, diag(Hf) = 0, Zf = Jf, f = 1, …, m.

This equivalent form is then transformed into a Lagrangian function, so that solving formula <1> becomes minimizing that Lagrangian, in which H1, …, Hm are the first auxiliary variables, J1, …, Jm are the second auxiliary variables, U1, …, Um are the first Lagrange multipliers, V1, …, Vm are the second Lagrange multipliers, and μ > 0 is a penalty parameter. The specific solution steps are as follows:

(5a) Initialize the parameters: set the convergence threshold η = 10⁻⁵, the scaling factor θ = 1.1, the penalty parameter μ = 10⁻⁶, and Zf = Hf = Jf = Uf = Vf = 0;

(5b) Update each column of the first auxiliary variables H1, …, Hm with the soft-threshold method. Let hf,e = (hf,1e, hf,2e, …, hf,ne)^T be the e-th column of Hf (f = 1, …, m). To satisfy the constraint diag(Hf) = 0, remove the e-th element of hf,e to obtain an auxiliary column vector; apply the soft-threshold update to this auxiliary column vector, using the column vector obtained by removing the e-th element of the e-th column of the sparse representation coefficient Zf and the column vector obtained by removing the e-th element of the e-th column of the first Lagrange multiplier Uf; then insert a 0 at the e-th position of the updated auxiliary column vector to obtain the e-th column of the updated first auxiliary variable Hf, hf,e = (hf,1e, hf,2e, …, hf,(e−1)e, 0, hf,(e+1)e, …, hf,ne)^T;

(5c) Update each of the second auxiliary variables J1, …, Jm (f = 1, …, m) according to its update formula, which involves the n×n identity matrix I;

(5d) Update all rows of the joint sparse representation coefficient Z. Let zi be the i-th row of Z; zi is updated by a row-wise shrinkage of qi, where ||qi||2 is the ℓ2 norm of qi, qi is the i-th row of the intermediate variable Q, and Q is the n²×m matrix whose f-th column is the column-wise vectorization of Yf = (Hf + Jf − (Uf + Vf)/μ)/2, f = 1, …, m;

(5e) Update the first Lagrange multipliers U1, …, Um, the second Lagrange multipliers V1, …, Vm, and the penalty parameter μ as follows:

Uf = Uf + μ(Zf − Hf), f = 1, …, m,

Vf = Vf + μ(Zf − Jf), f = 1, …, m,

μ = min(θμ, 10¹⁰);

(5f) Convert each n²×1 column vector of the joint sparse representation coefficient Z into an n×n matrix: the first column of Z becomes the matrix Z1, the second column becomes Z2, the third column becomes Z3, and the fourth column becomes Z4, thereby obtaining the four sparse representation coefficients Z1, Z2, Z3, Z4;

(5g) Compare the infinity norms ||Zf − Hf||∞ and ||Zf − Jf||∞ with the convergence threshold η; if both are smaller than η, the four sparse representation coefficients Z1, Z2, Z3, Z4 are obtained; otherwise return to step (5b).
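
The exact update formulas of steps (5b) and (5d) are not reproduced on this page; what they rely on are the two standard shrinkage operations sketched below — the element-wise soft threshold (for the column-wise ℓ1 penalty on Hf) and the row-wise group shrinkage (for the ℓ1,2 penalty on Z). The threshold arguments are left generic, so this is an illustrative sketch rather than the patented update itself.

```python
import numpy as np

def soft_threshold(v, tau):
    """Element-wise soft threshold used for l1-type penalties, e.g. step (5b)."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def row_shrink(Q, tau):
    """Row-wise (group) shrinkage used for the l1,2 penalty, e.g. step (5d):
    each row q_i is scaled by max(0, 1 - tau / ||q_i||_2)."""
    norms = np.sqrt((Q ** 2).sum(axis=1, keepdims=True))
    scale = np.maximum(1.0 - tau / np.maximum(norms, 1e-12), 0.0)
    return Q * scale
```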

Step 6. Fuse the four sparse representation coefficients into one global sparse representation coefficient.

To obtain a unified global sparse representation coefficient, the sparse representation coefficients Z1, Z2, Z3, Z4 of the four features are fused. From the four n×n sparse representation coefficients Z1, Z2, Z3, Z4, take the elements [Z1]ij, [Z2]ij, [Z3]ij, [Z4]ij at any same position, where 1 ≤ i ≤ n, 1 ≤ j ≤ n, and fuse these four elements to obtain the element at the same position of the global sparse representation coefficient. Fusing every element of the four sparse representation coefficients in this way yields the global sparse representation coefficient S.
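
The element-wise fusion rule itself is given by a formula that this page does not reproduce. Purely as an illustrative stand-in, the sketch below averages the magnitudes of the four coefficients at each position; the averaging rule is an assumption, not the patent's formula.

```python
import numpy as np

def fuse_coefficients(Z_list):
    """Fuse the four n x n sparse representation coefficients into one global
    coefficient S (step 6). The element-wise mean of magnitudes used here is an
    assumed stand-in for the fusion formula of the patent."""
    return np.mean([np.abs(Z) for Z in Z_list], axis=0)
```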

Step 7. Compute the local spatial neighborhood correlation of the superpixels.

The global sparse representation coefficient S only reflects the global similarity of the superpixels. To explore the relationships between superpixels more fully, the present invention also mines their local spatial neighborhood information, constructing the local spatial neighborhood correlation C of the superpixels in a simple way: C is a square matrix whose number of rows and columns equals the number of superpixels; for any two superpixels Yi and Yj, the j-th element Cij of the i-th row of C is 1 when Yi and Yj are adjacent, and 0 otherwise.
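
Assuming the superpixel label map produced by the region-growing sketch in step 2, C can be built by marking the label pairs of horizontally or vertically touching pixels (extending the check to diagonal neighbours is straightforward); a minimal sketch:

```python
import numpy as np

def neighborhood_matrix(labels):
    """Binary local spatial neighborhood correlation C of the superpixels:
    C[i, j] = 1 if superpixels i+1 and j+1 share a boundary, 0 otherwise."""
    n = labels.max()
    C = np.zeros((n, n), dtype=float)
    # horizontally and vertically adjacent pixel pairs with different labels
    for a, b in [(labels[:, :-1], labels[:, 1:]), (labels[:-1, :], labels[1:, :])]:
        diff = a != b
        i, j = a[diff] - 1, b[diff] - 1
        C[i, j] = 1.0
        C[j, i] = 1.0
    return C
```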

Step 8. Generate the joint sparse graph.

The present invention combines the global sparse representation coefficient S with the local spatial neighborhood correlation C according to the following formula:

P = S·exp(C/2σ²),

where σ is a regularization parameter and P is the adjacency matrix of a directed graph. To keep the propagation of information between the vertices of the graph stable, P is converted into a joint sparse graph whose adjacency matrix is G = (1/2)(P + P^T), thereby obtaining a joint sparse graph that reflects both the global structure and the local structure.
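
This construction is a direct element-wise computation; a short sketch, assuming S and C from the previous steps and σ = 1.4 as stated:

```python
import numpy as np

def joint_sparse_graph(S, C, sigma=1.4):
    """Adjacency matrix of the joint sparse graph:
    P = S * exp(C / (2 sigma^2)),  G = (P + P.T) / 2."""
    P = S * np.exp(C / (2.0 * sigma ** 2))
    return 0.5 * (P + P.T)
```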

Step 9. Segment the joint sparse graph with the Ncuts algorithm of spectral clustering.

Spectral clustering is based on spectral graph theory and converts the problem of partitioning the superpixel set Y into the problem of optimally partitioning the joint sparse graph that corresponds to Y. The present invention uses the normalized cut (Ncuts) algorithm to divide the elements of the superpixel set Y into multiple classes. Taking the two-class case as an example, the superpixel set Y is divided into two disjoint subsets A and B by minimizing

Ncut(A, B) = cut(A, B)/assoc(A, Y) + cut(A, B)/assoc(B, Y),

where cut(A, B) = Σ_{a∈A, b∈B} Gab, assoc(A, Y) = Σ_{a∈A, y∈Y} Gay, and assoc(B, Y) = Σ_{b∈B, y∈Y} Gby; Gab is the b-th element of the a-th row of the joint sparse graph adjacency matrix G, Gay is the y-th element of the a-th row of G, and Gby is the y-th element of the b-th row of G.
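
The partition can be approximated with the usual spectral relaxation of normalized cuts: form the symmetric normalized Laplacian of G, take its k smallest eigenvectors, and cluster the rows with k-means. This is a common Ncuts implementation, not necessarily the exact procedure used by the inventors; a sketch:

```python
import numpy as np
from scipy.linalg import eigh
from sklearn.cluster import KMeans

def ncuts_segmentation(G, k):
    """Partition the joint sparse graph into k classes (step 9) via the
    standard spectral relaxation of normalized cuts."""
    d = G.sum(axis=1)
    d_inv_sqrt = 1.0 / np.sqrt(np.maximum(d, 1e-12))
    L_sym = np.eye(len(G)) - (d_inv_sqrt[:, None] * G) * d_inv_sqrt[None, :]
    # k eigenvectors with the smallest eigenvalues of the normalized Laplacian
    _, vecs = eigh(L_sym, subset_by_index=[0, k - 1])
    rows = vecs / np.maximum(np.linalg.norm(vecs, axis=1, keepdims=True), 1e-12)
    return KMeans(n_clusters=k, n_init=10).fit_predict(rows)
```

Each superpixel then takes the class of its graph vertex, and mapping these labels back through the superpixel label map gives the pixel-level segmentation.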

The effect of the present invention can be verified by the following simulation experiments.

1. Experimental settings

In the present invention, the number of SAR image classes k takes the values 2 and 3, and the method is verified on two SAR images respectively.

When k = 2, the two-class SAR image to be segmented was acquired by the RADARSAT satellite in August 1997 and reflects surface changes caused by the rainy season in the Ottawa area of Canada. Its size is 313×352 and it contains two parts, land and water. Because the land areas differ in height, their gray values in the SAR image are very similar, so it is difficult for general segmentation methods to divide the two regions accurately.

When k = 3, the three-class SAR image to be segmented is a Ku-band SAR image of the area near the Rio Grande in the United States, with a size of 200×180. It contains water, cultivated land, and vegetation; because the contrast between the cultivated land and the vegetation is small, the vegetation is difficult to distinguish.

The over-segmentation threshold used in the present invention is Th = 45, the parameters of formula <1> are α = 0.5, β = 0.3, m = 4, and the regularization parameter of the joint sparse graph is σ = 1.4.

2. Simulation content and results

A) Simulation experiments on the two-class SAR image were carried out with the existing low-rank graph method, the L1-graph method, and the method of the present invention; the segmentation results are shown in Fig. 2. Fig. 2(a) is the two-class SAR image, Fig. 2(b) is the segmentation result based on the low-rank graph, Fig. 2(c) is the segmentation result based on the L1 graph, and Fig. 2(d) is the segmentation result of the present invention.

As can be seen from Fig. 2(b), although the low-rank-graph segmentation roughly outlines the water area, the land part contains many scattered noise points.

As can be seen from Fig. 2(c), the L1-graph segmentation has better regional consistency than the low-rank-graph result, but there are many misclassifications in the land area around the water.

As can be seen from Fig. 2(d), because the present invention combines the global structure with the local structure, its two-class segmentation clearly separates the water and land parts, and each part has good regional consistency, which meets the needs of ground monitoring.

B) Simulation experiments on the three-class SAR image were carried out with the existing low-rank graph method, the L1-graph method, and the present invention; the segmentation results are shown in Fig. 3. Fig. 3(a) is the three-class SAR image, Fig. 3(b) is the segmentation result based on the low-rank graph, Fig. 3(c) is the segmentation result based on the L1 graph, and Fig. 3(d) is the segmentation result of the present invention.

As can be seen from Fig. 3(b), although the low-rank-graph segmentation picks out a small amount of vegetation, it fails to distinguish the large areas of water and cultivated land.

As can be seen from Fig. 3(c), compared with the low-rank-graph result, the L1-graph segmentation can basically separate the water, vegetation, and cultivated land, but the vegetation around the water is partly misclassified.

As can be seen from Fig. 3(d), the three-class segmentation result of the present invention divides the water, vegetation, and cultivated land fairly accurately.

Claims (5)

1. A SAR image segmentation method based on a multi-feature joint sparse graph, comprising the following steps:
(1) inputting the SAR image to be segmented and determining the number of classes k ≥ 2 into which the image is to be divided;
(2) over-segmenting the input SAR image with a region growing algorithm to generate a superpixel set Y consisting of multiple homogeneous regions;
(3) for each superpixel in the superpixel set Y, extracting its gray-level histogram feature, local binary pattern histogram feature, Gabor filter bank feature, and gray-level co-occurrence matrix statistical feature, forming the gray-level histogram feature set X1, the local binary pattern histogram feature set X2, the Gabor filter bank feature set X3, and the gray-level co-occurrence matrix statistical feature set X4;
(4) jointly sparse-representing the above four feature sets X1, X2, X3, X4 of all superpixels to obtain the sparse representation coefficients Z1, Z2, Z3, Z4 of the four feature sets, each of size n×n, where n is the number of superpixels;
(5) taking, from the four n×n sparse representation coefficients Z1, Z2, Z3, Z4, the elements [Z1]ij, [Z2]ij, [Z3]ij, [Z4]ij at any same position, where 1 ≤ i ≤ n, 1 ≤ j ≤ n, fusing these four elements to obtain the element at the same position of the global sparse representation coefficient, and fusing every element of the four sparse representation coefficients in this way to obtain the global sparse representation coefficient S;
(6) constructing the local spatial neighborhood correlation C of the superpixels from their adjacency relationships, C being a square matrix whose number of rows and columns equals the number of superpixels, where for any two superpixels Yi and Yj the j-th element Cij of the i-th row of C is 1 when Yi and Yj are adjacent and 0 otherwise;
(7) combining the global sparse representation coefficient S with the local spatial neighborhood correlation C to generate the adjacency matrix of the joint sparse graph, G = (1/2)(P + P^T), where P = S·exp(C/2σ²), T denotes transpose, and σ is a regularization parameter set to 1.4;
(8) using the normalized cut (Ncuts) algorithm of spectral clustering to partition the vertices of the joint sparse graph into k classes, obtaining the final segmentation result.

2. The SAR image segmentation method based on a multi-feature joint sparse graph according to claim 1, wherein the gray-level histogram feature set X1 in step (3) is formed as follows:
(3a) computing the histogram of each superpixel and arranging the histogram feature of any superpixel Yj into a column to obtain the histogram feature column vector x1,j of superpixel Yj;
(3b) following the order of the superpixels in the superpixel set Y, arranging the histogram feature column vectors of all superpixels in turn to form the histogram feature set X1 = (x1,1, x1,2, …, x1,j, …, x1,n), where 1 ≤ j ≤ n and n is the number of superpixels.

3. The SAR image segmentation method based on a multi-feature joint sparse graph according to claim 1, wherein the local binary pattern histogram feature set X2 in step (3) is formed as follows:
(3c) for each pixel of the SAR image to be segmented, computing the local binary pattern within its 8-neighborhood and computing the local binary pattern histogram of each superpixel;
(3d) arranging the local binary pattern histogram feature of any superpixel Yj into a column to obtain the local binary pattern histogram feature column vector x2,j of superpixel Yj;
(3e) following the order of the superpixels in the superpixel set Y, arranging the local binary pattern histogram feature column vectors of all superpixels in turn to form the local binary pattern histogram feature set X2 = (x2,1, x2,2, …, x2,j, …, x2,n), where 1 ≤ j ≤ n and n is the number of superpixels.

4. The SAR image segmentation method based on a multi-feature joint sparse graph according to claim 1, wherein the Gabor filter bank feature set X3 in step (3) is formed as follows:
(3f) computing 40 Gabor filters over 5 scales and 8 orientations to form a Gabor filter bank, filtering the SAR image to be segmented with the filter bank, and computing for each superpixel the mean response of each filter;
(3g) arranging the 40 Gabor response means of any superpixel Yj into a column to obtain the Gabor filter bank feature column vector x3,j of superpixel Yj;
(3h) following the order of the superpixels in the superpixel set Y, arranging the Gabor filter bank feature column vectors of all superpixels in turn to form the Gabor filter bank feature set X3 = (x3,1, x3,2, …, x3,j, …, x3,n), where 1 ≤ j ≤ n and n is the number of superpixels.

5. The SAR image segmentation method based on a multi-feature joint sparse graph according to claim 1, wherein the gray-level co-occurrence matrix statistical feature set X4 in step (3) is formed as follows:
(3i) computing the gray-level co-occurrence matrix of each superpixel and computing its 14 statistical features: angular second moment, contrast, correlation, sum of squares, inverse difference moment, sum average, sum variance, entropy, sum entropy, variation, variance of entropy, information correlation in the X direction, information correlation in the Y direction, and maximum correlation coefficient;
(3j) arranging the 14 statistical features of any superpixel Yj into a column to obtain the gray-level co-occurrence matrix statistical feature column vector x4,j of superpixel Yj;
(3k) following the order of the superpixels in the superpixel set Y, arranging the gray-level co-occurrence matrix statistical feature column vectors of all superpixels in turn to form the gray-level co-occurrence matrix statistical feature set X4 = (x4,1, x4,2, …, x4,j, …, x4,n), where 1 ≤ j ≤ n and n is the number of superpixels.
CN201410472497.7A 2014-09-16 2014-09-16 SAR image segmentation method based on multiple features combining sparse graph Active CN104217436B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410472497.7A CN104217436B (en) 2014-09-16 2014-09-16 SAR image segmentation method based on multiple features combining sparse graph

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201410472497.7A CN104217436B (en) 2014-09-16 2014-09-16 SAR image segmentation method based on multiple features combining sparse graph

Publications (2)

Publication Number Publication Date
CN104217436A CN104217436A (en) 2014-12-17
CN104217436B true CN104217436B (en) 2017-06-16

Family

ID=52098878

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410472497.7A Active CN104217436B (en) 2014-09-16 2014-09-16 SAR image segmentation method based on multiple features combining sparse graph

Country Status (1)

Country Link
CN (1) CN104217436B (en)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104732552B (en) * 2015-04-09 2017-07-28 西安电子科技大学 SAR image segmentation method based on nonstationary condition
CN106778814B (en) * 2016-11-24 2020-06-12 郑州航空工业管理学院 Method for removing SAR image spots based on projection spectral clustering algorithm
CN108550131B (en) * 2018-04-12 2020-10-20 浙江理工大学 SAR image vehicle detection method based on feature fusion sparse representation model
CN108664976B (en) * 2018-04-25 2022-06-03 安徽大学 An automatic segmentation method of brain tumor images based on fuzzy spectral clustering based on superpixels
CN108596257A (en) * 2018-04-26 2018-09-28 深圳市唯特视科技有限公司 A kind of preferential scene analytic method in position based on space constraint
CN108921853B (en) * 2018-06-22 2022-03-04 西安电子科技大学 Image segmentation method based on super-pixel and immune sparse spectral clustering
CN113505710B (en) * 2021-07-15 2022-06-03 黑龙江工程学院 A method and system for image selection based on deep learning SAR image classification
CN114067209B (en) * 2021-11-17 2025-04-29 广西师范大学 A method for extracting vegetation information from high-resolution urban remote sensing images

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103020654A (en) * 2012-12-12 2013-04-03 北京航空航天大学 Synthetic aperture radar (SAR) image bionic recognition method based on sample generation and nuclear local feature fusion

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2013143102A (en) * 2012-01-12 2013-07-22 Nikon Corp Mobile object detection device, mobile object detection method, and program

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103020654A (en) * 2012-12-12 2013-04-03 北京航空航天大学 Synthetic aperture radar (SAR) image bionic recognition method based on sample generation and nuclear local feature fusion

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Context-Based Hierarchical Unequal Merging for SAR Image Segmentation; Hang Yu et al.; IEEE Transactions on Geoscience and Remote Sensing; 2013-02-28; Vol. 51, No. 2; pp. 995-1009 *
Translational motion compensation for low-SNR ISAR imaging using joint autofocus; Yang Lei et al.; Journal of Xidian University (Natural Science Edition); 2012-01-05; Vol. 39, No. 3; pp. 63-71 *

Also Published As

Publication number Publication date
CN104217436A (en) 2014-12-17

Similar Documents

Publication Publication Date Title
CN104217436B (en) SAR image segmentation method based on multiple features combining sparse graph
Duan et al. SAR image segmentation based on convolutional-wavelet neural network and Markov random field
CN108573276B (en) A change detection method based on high-resolution remote sensing images
CN108154192B (en) High-resolution SAR terrain classification method based on multi-scale convolution and feature fusion
Ban et al. Object-based fusion of multitemporal multiangle ENVISAT ASAR and HJ-1B multispectral data for urban land-cover mapping
CN104123555B (en) Super-pixel polarimetric SAR land feature classification method based on sparse representation
Zhang et al. Cloud detection in high-resolution remote sensing images using multi-features of ground objects
CN107067405B (en) Remote sensing image segmentation method based on scale optimization
CN102402685B (en) Method for segmenting three Markov field SAR image based on Gabor characteristic
CN103258324B (en) Based on the method for detecting change of remote sensing image that controlled kernel regression and super-pixel are split
CN105608433A (en) Nuclear coordinated expression-based hyperspectral image classification method
CN107229917A (en) A kind of several remote sensing image general character well-marked target detection methods clustered based on iteration
CN107358260A (en) A kind of Classification of Multispectral Images method based on surface wave CNN
CN113569772A (en) Remote sensing image farmland instance mask extraction method, system, equipment and storage medium
CN105335975B (en) Polarization SAR image segmentation method based on low-rank decomposition and statistics with histogram
CN105335965B (en) Multi-scale self-adaptive decision fusion segmentation method for high-resolution remote sensing image
CN103366371A (en) K distribution and texture feature-based SAR (Synthetic Aperture Radar) image segmentation method
Sharma et al. An object-based shadow detection method for building delineation in high-resolution satellite images
Xu et al. Using pan-sharpened high resolution satellite data to improve impervious surfaces estimation
CN107464247B (en) Based on G0Distributed random gradient variational Bayesian SAR image segmentation method
CN109948520A (en) A crop classification method based on multi-temporal dual-polarization SAR characteristic curve
Li et al. Multitemporal SAR images change detection based on joint sparse representation of pair dictionaries
CN106971402A (en) A kind of SAR image change detection aided in based on optics
Jing et al. Time series land cover classification based on semi-supervised convolutional long short-term memory neural networks
CN117809029A (en) Multi-branch high-resolution remote sensing image semantic segmentation method and system based on edge perception

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant