
CN115578599A - A Polarimetric SAR Image Classification Method Based on Superpixel-Hypergraph Feature Enhancement Network - Google Patents


Info

Publication number
CN115578599A
CN115578599A (application CN202211322174.0A)
Authority
CN
China
Prior art keywords
superpixel
feature
polarization
network
matrix
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202211322174.0A
Other languages
Chinese (zh)
Other versions
CN115578599B (en)
Inventor
耿杰
王茹
蒋雯
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Northwestern Polytechnical University
Original Assignee
Northwestern Polytechnical University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Northwestern Polytechnical University filed Critical Northwestern Polytechnical University
Priority to CN202211322174.0A priority Critical patent/CN115578599B/en
Publication of CN115578599A publication Critical patent/CN115578599A/en
Application granted granted Critical
Publication of CN115578599B publication Critical patent/CN115578599B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/764 Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects
    • G06V10/765 Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects using rules for classification or partitioning the feature space
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/08 Learning methods
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/20 Image preprocessing
    • G06V10/26 Segmentation of patterns in the image field; Cutting or merging of image elements to establish the pattern region, e.g. clustering-based techniques; Detection of occlusion
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/40 Extraction of image or video features
    • G06V10/44 Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/74 Image or video pattern matching; Proximity measures in feature spaces
    • G06V10/761 Proximity, similarity or dissimilarity measures
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/77 Processing image or video features in feature spaces; using data integration or data reduction, e.g. principal component analysis [PCA] or independent component analysis [ICA] or self-organising maps [SOM]; Blind source separation
    • G06V10/774 Generating sets of training patterns; Bootstrap methods, e.g. bagging or boosting
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/82 Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/10 Terrestrial scenes
    • G06V20/13 Satellite images

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Evolutionary Computation (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Software Systems (AREA)
  • Computing Systems (AREA)
  • Artificial Intelligence (AREA)
  • Medical Informatics (AREA)
  • Databases & Information Systems (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Remote Sensing (AREA)
  • Astronomy & Astrophysics (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Computational Linguistics (AREA)
  • Data Mining & Analysis (AREA)
  • Molecular Biology (AREA)
  • General Engineering & Computer Science (AREA)
  • Mathematical Physics (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)

Abstract


Figure 202211322174

The invention discloses a polarimetric SAR image classification method based on a superpixel-hypergraph feature enhancement network, comprising the following steps: step 1, preprocessing the input data of the polarimetric SAR image; step 2, using superpixel segmentation to obtain a polarimetric SAR superpixel set; step 3, generating the superpixel polarization feature matrix and spatial feature matrix, and constructing the polarization feature correlation matrix and spatial feature correlation matrix; step 4, constructing a two-layer superpixel-hypergraph convolutional neural network and performing superpixel-level to pixel-level feature conversion; step 5, constructing a feature reconstruction network and a local feature extraction network, and fusing reconstruction features and local features to achieve feature enhancement; step 6, training the network with the training set and outputting classification results. The superpixel-hypergraph feature enhancement network proposed by the invention can exploit the polarization features and spatial features of the polarimetric SAR image, fully integrate global information and local information, and effectively improve the classification accuracy of polarimetric SAR images.


Description

Polarized SAR image classification method based on superpixel-hypergraph feature enhancement network
Technical Field
The invention belongs to the field of polarized SAR image processing, and particularly relates to a polarized SAR image classification method based on a superpixel-hypergraph feature enhancement network.
Background
The polarized synthetic aperture radar can obtain high-resolution radar images under all weather conditions. The polarized SAR image contains abundant polarization information and can reflect the physical properties of an irradiated object. Therefore, the polarized SAR images have been widely used in the fields of ocean monitoring, urban planning, and geoscience.
With the continuous development of polarimetric synthetic aperture radar, the polarimetric SAR image classification task has received increasing attention. Polarimetric SAR image classification is a pixel-level classification task that assigns every pixel in an image to its corresponding class based on the information of each pixel unit. Existing polarimetric SAR image classification methods mainly fall into polarization-decomposition-based methods, statistical-characteristic-based methods, and machine-learning-based methods. At present, when a convolutional neural network is used to classify polarimetric SAR images, the network input is a square sampling block of fixed size, and the class label of the central pixel of the sampling block is used as the label of the whole block. This approach focuses only on the local information of the image and ignores its global information, which limits further improvement of the classification accuracy. Graph neural networks have the capability of capturing global image information, but because similar polarization features can occur in different objects, using only polarization feature correlations can lead to misclassification of parts of the objects.
Disclosure of Invention
The invention aims to solve the problem that, against the background of deep learning, the feature information of polarized SAR images is not fully utilized and the classification accuracy is difficult to improve further. It provides a polarized SAR image classification method based on a superpixel-hypergraph feature enhancement network, with a simple structure and reasonable design, in which a superpixel segmentation technique is used to segment the polarized SAR image into a series of superpixels, the polarization feature correlations and spatial feature correlations among the superpixels are constructed, and a superpixel-hypergraph convolutional neural network is used to extract the global features of the polarized SAR image; a feature reconstruction network and a local feature extraction network are constructed, the global features and local features of the polarized SAR image are fused, and the image classification accuracy is further improved.
In order to solve the technical problems, the technical scheme adopted by the invention is as follows: a polarized SAR image classification method based on a superpixel-hypergraph feature enhancement network, characterized by comprising the following steps:
step one, preprocessing input data of a polarized SAR image:
Step 101, the polarimetric scattering matrix of the polarimetric SAR,
S = [ S_hh  S_hv ; S_vh  S_vv ],
is converted under the Pauli basis to obtain the coherency matrix T, where S_hh and S_vv represent the co-polarization components, S_hv and S_vh represent the cross-polarization components, and h and v denote horizontal and vertical polarization, respectively;
Step 102, converting the coherency matrix T to obtain a 6-dimensional initial polarization feature vector, and further obtaining polarized SAR input data of size I_h × I_w × 6, where I_h and I_w respectively represent the length and the width of the polarized SAR image;
step two, obtaining a polarized SAR super-pixel set by utilizing super-pixel segmentation:
Step 201, segmenting the polarized SAR image into M superpixel blocks using the simple linear iterative clustering (SLIC) superpixel segmentation algorithm;
Step 202, forming the polarized SAR superpixel set S = {S_1, …, S_i, …, S_M}, where S_i represents the ith superpixel block;
generating a super-pixel polarization characteristic matrix and a spatial characteristic matrix, and constructing a polarization characteristic incidence matrix and a spatial characteristic incidence matrix:
Step 301, in the polarized SAR superpixel set S, calculating the mean of the polarization features of all pixel points in each superpixel to generate the polarization feature matrix X_pol = [x_1^pol, …, x_M^pol]^T ∈ R^{M×6}, and calculating the mean of the horizontal and vertical coordinates of all pixel points in each superpixel to generate the spatial feature matrix X_spa = [x_1^spa, …, x_M^spa]^T ∈ R^{M×2}, where x_i^pol and x_i^spa respectively represent the polarization feature vector and the spatial feature vector of superpixel S_i;
Step 302, in the superpixel set S, calculating the similarity between the polarization features of the superpixels using the polarization feature matrix; for each superpixel, selecting the k values with the highest similarity by a k-nearest-neighbor algorithm, setting the remaining similarity values to 0, and generating the polarization feature correlation matrix H_pol ∈ R^{M×M};
Step 303, in the superpixel set S, calculating the similarity between the spatial features of the superpixels using the spatial feature matrix, keeping for each superpixel the similarity values with its spatially adjacent superpixels, setting the remaining spatial similarity values to 0, and generating the spatial feature correlation matrix H_spa ∈ R^{M×M};
Step four, constructing two layers of superpixel-hypergraph convolution neural networks, and performing superpixel-level to pixel-level feature conversion:
Step 401, constructing a two-layer superpixel-hypergraph convolutional neural network, inputting the polarization feature correlation matrix H_pol, the spatial feature correlation matrix H_spa, the polarization feature matrix X_pol, and the spatial feature matrix X_spa into the network, and using the network to learn higher-level superpixel features X_c ∈ R^{M×D_c}, where D_c represents the feature dimension of the superpixels output by the network;
Step 402, using the superpixel-to-pixel conversion matrix Q ∈ R^{I_h I_w × M} to convert the superpixel features X_c output by the superpixel-hypergraph convolutional neural network into pixel-level features X_p ∈ R^{I_h I_w × D_c}, where I_h I_w represents the number of all pixel points of the polarized SAR image;
step five, constructing a feature reconstruction network and a local feature extraction network, and fusing reconstruction features and local features to realize feature enhancement:
Step 501, constructing a feature reconstruction network and inputting the obtained pixel-level features into it, the feature reconstruction expression being
X_rec^(l+1) = f_rec(W_rec^(l) X_rec^(l) + b^(l))
where X_rec^(l) represents the output of the lth hidden layer, W_rec^(l) and b^(l) respectively represent the weights and bias of the feature reconstruction network, and f_rec(·) represents the activation function of the feature reconstruction network;
Step 502, constructing a local feature extraction network to extract the local features X_local of the polarized SAR image, the local feature extraction expression being
X_local^(l+1) = f(W_conv^(l) * X_local^(l) + b_local^(l))
where X_local^(l) represents the local features extracted by the lth layer of the network, W_conv^(l) represents the convolution kernel, * denotes the convolution operation, b_local^(l) represents the local feature extraction network bias, and f(·) represents the activation function of the feature extraction network;
Step 503, splicing the output X_rec of the feature reconstruction network and the local features X_local to realize feature enhancement and obtain the overall features X_total finally used for classification;
Step six, training the network by using a training set, and outputting a classification result:
Step 601, inputting the overall features X_total into a Softmax classifier to obtain the prediction result P_jc for each pixel point in the image;
Step 602, randomly selecting a proportion r of training samples to form a training set for training the network, the loss function in the network training process being L = αL_rec + L_c, where L_rec is the reconstruction loss, L_c is the classification loss, and α is a balance parameter.
The polarized SAR image classification method based on the superpixel-hypergraph feature enhancement network is characterized in that the coherency matrix T obtained by converting the polarized scattering matrix S under the Pauli basis in step 101 is calculated as:
T = k k^H, with the Pauli scattering vector k = (1/√2) [S_hh + S_vv, S_hh - S_vv, 2 S_hv]^T,
where (·)^H denotes the conjugate transpose.
the polarized SAR image classification method based on the superpixel-hypergraph feature enhancement network is characterized by comprising the following steps of: the calculation formula of the 6-dimensional initial polarization eigenvector obtained by the conversion of the coherence matrix T in step 102 is as follows:
f_1 = 10 log_10(T_11 + T_22 + T_33)
f_2 = T_22 / (T_11 + T_22 + T_33)
f_3 = T_33 / (T_11 + T_22 + T_33)
f_4 = |T_12| / sqrt(T_11 T_22)
f_5 = |T_13| / sqrt(T_11 T_33)
f_6 = |T_23| / sqrt(T_22 T_33)
wherein T_ij (i = 1, 2, 3; j = 1, 2, 3) represents the element in the ith row and jth column of the matrix T, and f_i (i = 1, …, 6) represents the polarization feature value of the ith dimension.
The polarized SAR image classification method based on the superpixel-hypergraph feature enhancement network is characterized in that the polarization feature correlation matrix H_pol generated in step 302 is calculated as follows:
h_pol(i, j) = exp(-||x_i^pol - x_j^pol||^2 / β) if x_j^pol ∈ KNN(x_i^pol), and h_pol(i, j) = 0 otherwise,
where h_pol(i, j) represents the element in the ith row and jth column of the polarization feature correlation matrix H_pol, β represents an adjustable parameter, x_i^pol represents the 6-dimensional polarization feature vector of superpixel S_i, and KNN(·) represents the k-nearest neighbors of the polarization features.
The polarized SAR image classification method based on the superpixel-hypergraph feature enhancement network is characterized in that the spatial feature correlation matrix H_spa generated in step 303 is calculated as follows:
h_spa(i, j) = exp(-||x_i^spa - x_j^spa||^2 / γ) if S_j ∈ Neighbor(S_i), and h_spa(i, j) = 0 otherwise,
where h_spa(i, j) represents the element in the ith row and jth column of the spatial feature correlation matrix H_spa, γ represents an adjustable parameter, x_i^spa represents the 2-dimensional spatial feature vector of superpixel S_i, and Neighbor(·) represents the spatial neighbors.
The polarized SAR image classification method based on the superpixel-hypergraph feature enhancement network is characterized by comprising the following steps of: the propagation rule of the superpixel-hypergraph convolutional neural network in step 401 is as follows:
X^(l+1) = σ(D_v^{-1/2} H_fuse W D_e^{-1} H_fuse^T D_v^{-1/2} X^(l) Θ^(l))
where σ(·) is the ReLU activation function, W is the trainable hyperedge weight matrix, H_fuse is formed by splicing the polarization feature correlation matrix H_pol and the spatial feature correlation matrix H_spa, X^(l) and X^(l+1) respectively represent the input and output features of the lth layer of the superpixel-hypergraph convolutional neural network, the initial input feature X^(0) is formed by splicing the polarization feature matrix X_pol and the spatial feature matrix X_spa, D_v = Σ_{e∈E} W(e) H_fuse(v, e) denotes the vertex degree, D_e = Σ_{v∈V} H_fuse(v, e) denotes the hyperedge degree, and Θ^(l) represents a trainable filter matrix.
The polarized SAR image classification method based on the superpixel-hypergraph feature enhancement network is characterized in that the superpixel-to-pixel conversion matrix Q ∈ R^{I_h I_w × M} in step 402 is calculated as follows:
Q(i, j) = 1 if p_i ∈ S_j, and Q(i, j) = 0 otherwise,
where Q(i, j) represents the element in the ith row and jth column of the superpixel-to-pixel conversion matrix Q, i = 1, …, I_h × I_w, j = 1, …, M, p_i represents the ith pixel point of the PolSAR image, and S_j represents the jth superpixel.
The polarized SAR image classification method based on the superpixel-hypergraph feature enhancement network is characterized in that the reconstruction loss L_rec and the classification loss L_c in step 602 are respectively calculated as follows:
L_rec = (1/N) Σ_{j=1}^{N} ||x̂_j - x_j||^2
L_c = -(1/N) Σ_{j=1}^{N} Σ_{c=1}^{C} Y_jc log(P_jc)
where N represents the number of training samples, C represents the number of classes, x_j and x̂_j respectively denote the input features of the jth sample and their reconstruction, Y_jc is the true label of the jth sample for class c, and P_jc represents the predicted probability.
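As an illustrative sketch of the overall training objective L = αL_rec + L_c described in step 602, the following NumPy code computes the combined loss. The mean-squared-error form of L_rec and the cross-entropy form of L_c are standard choices assumed here where the patent's formula images are not legible, and the function name and the α value in the example are illustrative only.

```python
import numpy as np

def combined_loss(x, x_rec, y_true, p_pred, alpha=0.1):
    """Total loss L = alpha * L_rec + L_c (step 602 sketch).

    x, x_rec : (N, D) input features and their reconstructions
    y_true   : (N, C) one-hot ground-truth labels
    p_pred   : (N, C) Softmax class probabilities
    alpha    : balance parameter (value here is an assumption)
    """
    n = x.shape[0]
    # Reconstruction loss: mean squared error over training samples
    l_rec = np.sum((x_rec - x) ** 2) / n
    # Classification loss: cross-entropy between one-hot labels and predictions
    l_c = -np.sum(y_true * np.log(p_pred + 1e-12)) / n
    return alpha * l_rec + l_c
```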
Compared with the prior art, the invention has the following advantages:
the method can fully utilize the polarization characteristics and the spatial characteristics of the polarized SAR image, and extract the global characteristics of the image by using the superpixel-hypergraph convolution neural network; a feature reconstruction network and a local feature extraction network are constructed, the global features and the local features of the polarized SAR image are fused, and the classification accuracy of the polarized SAR image is effectively improved. The method of the invention has simple structure and convenient realization, use and operation.
The technical solution of the present invention is further described in detail by the accompanying drawings and embodiments.
Drawings
FIG. 1 is a flow chart of a method of the present invention;
FIG. 2 is an image of the Oberpfaffenhofen area, Germany, used in the simulation of the present invention;
FIG. 3 is a diagram illustrating the classification effect obtained in the simulation of the present invention.
Detailed Description
The method of the present invention will be described in further detail below with reference to the accompanying drawings and embodiments of the invention.
It should be noted that the embodiments and features of the embodiments in the present application may be combined with each other without conflict. The present invention will be described in detail below with reference to the embodiments with reference to the attached drawings.
It is noted that the terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of example embodiments according to the present application. As used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, and it should be understood that when the terms "comprises" and/or "comprising" are used in this specification, they specify the presence of stated features, steps, operations, devices, components, and/or combinations thereof, unless the context clearly indicates otherwise.
It should be noted that the terms "first," "second," and the like in the description and claims of this application and in the drawings described above are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the application described herein are, for example, capable of operation in sequences other than those illustrated or otherwise described herein. Moreover, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
For ease of description, spatially relative terms such as "over", "above", and "on" may be used herein to describe the spatial positional relationship of one device or feature to another device or feature as shown in the figures. It will be understood that the spatially relative terms are intended to encompass different orientations of the device in use or operation in addition to the orientation depicted in the figures. For example, if a device in the figures is turned over, devices described as "above" or "on" other devices or configurations would then be oriented "below" or "under" the other devices or configurations. Thus, the exemplary term "above" may include both orientations of "above" and "below". The device may be otherwise oriented (rotated 90 degrees or at other orientations) and the spatially relative descriptors used herein interpreted accordingly.
As shown in fig. 1, the polarized SAR image classification method based on the superpixel-hypergraph feature enhancement network of the present invention is characterized in that: the method comprises the following steps:
step one, preprocessing input data of a polarized SAR image:
Step 101, the polarimetric scattering matrix of the polarimetric SAR,
S = [ S_hh  S_hv ; S_vh  S_vv ],
is converted under the Pauli basis to obtain the coherency matrix T, where S_hh and S_vv represent the co-polarization components, S_hv and S_vh represent the cross-polarization components, and h and v respectively denote horizontal and vertical polarization; the coherency matrix T obtained by converting the polarization scattering matrix S under the Pauli basis is calculated as:
T = k k^H, with the Pauli scattering vector k = (1/√2) [S_hh + S_vv, S_hh - S_vv, 2 S_hv]^T,
where (·)^H denotes the conjugate transpose;
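The Pauli-basis conversion of step 101 can be sketched per pixel as follows. The 1/√2 normalization of the Pauli scattering vector is the common convention and is assumed here, since the patent's matrix images are not legible; the function name is illustrative.

```python
import numpy as np

def coherency_matrix(s_hh, s_hv, s_vv):
    """Per-pixel Pauli coherency matrix T = k k^H (a standard construction;
    the patent's exact normalization is not shown, so the 1/sqrt(2) factor
    is an assumption). Inputs are complex scattering components; the
    reciprocity assumption S_hv = S_vh is used."""
    k = np.array([s_hh + s_vv, s_hh - s_vv, 2.0 * s_hv], dtype=complex) / np.sqrt(2.0)
    return np.outer(k, k.conj())  # 3x3 Hermitian matrix
```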
Step 102, the coherency matrix T is converted to obtain the 6-dimensional initial polarization feature vector, and polarized SAR input data of size I_h × I_w × 6 are further obtained, where I_h and I_w respectively represent the length and the width of the polarized SAR image; the 6-dimensional initial polarization feature vector obtained by conversion of the coherency matrix T is calculated as follows:
f_1 = 10 log_10(T_11 + T_22 + T_33)
f_2 = T_22 / (T_11 + T_22 + T_33)
f_3 = T_33 / (T_11 + T_22 + T_33)
f_4 = |T_12| / sqrt(T_11 T_22)
f_5 = |T_13| / sqrt(T_11 T_33)
f_6 = |T_23| / sqrt(T_22 T_33)
wherein T_ij (i = 1, 2, 3; j = 1, 2, 3) represents the element in the ith row and jth column of the matrix T, and f_i (i = 1, …, 6) represents the polarization feature value of the ith dimension;
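A per-pixel sketch of step 102 follows. f_1 to f_3 implement the span-based formulas stated in the text; f_4 to f_6 are written as normalized off-diagonal correlation magnitudes, a common choice consistent with the T_ij elements referenced in the text but an assumption where the patent's images are unreadable.

```python
import numpy as np

def polarization_features(T):
    """6-dimensional initial polarization feature vector from a 3x3
    coherency matrix T. f1-f3 follow the stated formulas; f4-f6 are
    normalized off-diagonal correlation magnitudes (an assumption)."""
    span = (T[0, 0] + T[1, 1] + T[2, 2]).real  # total power T11+T22+T33
    f1 = 10.0 * np.log10(span)
    f2 = T[1, 1].real / span
    f3 = T[2, 2].real / span
    f4 = abs(T[0, 1]) / np.sqrt(T[0, 0].real * T[1, 1].real)
    f5 = abs(T[0, 2]) / np.sqrt(T[0, 0].real * T[2, 2].real)
    f6 = abs(T[1, 2]) / np.sqrt(T[1, 1].real * T[2, 2].real)
    return np.array([f1, f2, f3, f4, f5, f6])
```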
step two, obtaining a polarized SAR super-pixel set by utilizing super-pixel segmentation:
Step 201, segmenting the I_h × I_w × 6 polarized SAR image obtained in step one into M superpixel blocks using the simple linear iterative clustering (SLIC) superpixel segmentation algorithm;
Step 202, forming the polarized SAR superpixel set S = {S_1, …, S_M}, where S_i represents the ith superpixel block;
in specific implementation, the number M of the superpixel blocks is determined by the size of an input image and the segmentation size set in the process of using the simple linear iterative clustering superpixel segmentation method;
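A toy stand-in for the segmentation step can illustrate how per-pixel labels partition the image into blocks; real SLIC refines such a grid seeding with localized k-means in feature and position space (for example, skimage.segmentation.slic provides a full implementation). The grid version below is only a simplified sketch, not the patented procedure.

```python
import numpy as np

def grid_superpixels(h, w, m_per_side):
    """Toy stand-in for SLIC: partition an h x w image into a regular grid
    of roughly m_per_side^2 superpixel blocks and return per-pixel labels
    in [0, m_per_side^2)."""
    rows = np.minimum((np.arange(h) * m_per_side) // h, m_per_side - 1)
    cols = np.minimum((np.arange(w) * m_per_side) // w, m_per_side - 1)
    return rows[:, None] * m_per_side + cols[None, :]
```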
generating a super-pixel polarization characteristic matrix and a spatial characteristic matrix, and constructing a polarization characteristic incidence matrix and a spatial characteristic incidence matrix:
Step 301, in the polarized SAR superpixel set S, calculating the mean of the polarization features of all pixel points in each superpixel to generate the polarization feature matrix X_pol ∈ R^{M×6}, and calculating the mean of the horizontal and vertical coordinates of all pixel points in each superpixel to generate the spatial feature matrix X_spa ∈ R^{M×2}, where x_i^pol and x_i^spa respectively represent the polarization feature vector and the spatial feature vector of superpixel S_i;
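Step 301's mean pooling over superpixels can be sketched in NumPy as follows; the function name and argument layout are illustrative.

```python
import numpy as np

def superpixel_feature_matrices(features, labels, m):
    """Mean pooling per superpixel (step 301 sketch). features: (h, w, d)
    polarization features; labels: (h, w) superpixel indices in [0, m).
    Returns X_pol (m, d) and X_spa (m, 2) of per-superpixel feature and
    coordinate means."""
    h, w, d = features.shape
    flat_lab = labels.ravel()
    counts = np.bincount(flat_lab, minlength=m).astype(float)
    x_pol = np.zeros((m, d))
    for k in range(d):  # accumulate each feature channel per label
        x_pol[:, k] = np.bincount(flat_lab, weights=features[:, :, k].ravel(),
                                  minlength=m) / counts
    rr, cc = np.meshgrid(np.arange(h), np.arange(w), indexing="ij")
    x_spa = np.stack([
        np.bincount(flat_lab, weights=rr.ravel(), minlength=m) / counts,
        np.bincount(flat_lab, weights=cc.ravel(), minlength=m) / counts,
    ], axis=1)
    return x_pol, x_spa
```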
Step 302, in the superpixel set S, calculating the similarity between the polarization features of the superpixels using the polarization feature matrix; for each superpixel, selecting the k values with the highest similarity by a k-nearest-neighbor algorithm, setting the remaining similarity values to 0, and generating the polarization feature correlation matrix H_pol ∈ R^{M×M}, calculated as follows:
h_pol(i, j) = exp(-||x_i^pol - x_j^pol||^2 / β) if x_j^pol ∈ KNN(x_i^pol), and h_pol(i, j) = 0 otherwise,
where h_pol(i, j) represents the element in the ith row and jth column of H_pol, β represents an adjustable parameter, x_i^pol represents the 6-dimensional polarization feature vector of superpixel S_i, and KNN(·) represents the k-nearest neighbors of the polarization features;
In specific implementation, the number of neighbors k in the k-nearest-neighbor method is set to 3, and the adjustable parameter β is set to 100;
Step 303, in the superpixel set S, calculating the similarity between the spatial features of the superpixels using the spatial feature matrix, keeping for each superpixel the similarity values with its spatially adjacent superpixels, setting the remaining spatial similarity values to 0, and generating the spatial feature correlation matrix H_spa ∈ R^{M×M}, calculated as follows:
h_spa(i, j) = exp(-||x_i^spa - x_j^spa||^2 / γ) if S_j ∈ Neighbor(S_i), and h_spa(i, j) = 0 otherwise,
where h_spa(i, j) represents the element in the ith row and jth column of H_spa, γ represents an adjustable parameter, x_i^spa represents the 2-dimensional spatial feature vector of superpixel S_i, and Neighbor(·) represents the spatial neighbors;
in specific implementation, the adjustable parameter gamma is set to 80;
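Steps 302 and 303 can be illustrated with the following sketch of the polarization feature correlation matrix; the Gaussian kernel exp(-d^2/β) is an assumption where the patent's formula image is unreadable, while k = 3 and β = 100 follow the stated implementation values. H_spa would be built analogously, with spatial adjacency in place of k-nearest neighbors.

```python
import numpy as np

def polarization_correlation(x_pol, k=3, beta=100.0):
    """Gaussian-kernel similarity restricted to each superpixel's k nearest
    neighbors (step 302 sketch). x_pol: (M, 6) superpixel polarization
    features; returns H_pol (M, M) with non-top-k entries set to 0."""
    d2 = ((x_pol[:, None, :] - x_pol[None, :, :]) ** 2).sum(-1)  # pairwise squared distances
    sim = np.exp(-d2 / beta)
    h_pol = np.zeros_like(sim)
    for i in range(len(x_pol)):
        nn = np.argsort(d2[i])[:k + 1]  # k nearest neighbors (plus self)
        h_pol[i, nn] = sim[i, nn]       # keep only top-k similarities
    return h_pol
```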
step four, constructing two layers of superpixel-hypergraph convolution neural networks, and performing superpixel-level to pixel-level feature conversion:
Step 401, constructing a two-layer superpixel-hypergraph convolutional neural network, inputting the polarization feature correlation matrix H_pol, the spatial feature correlation matrix H_spa, the polarization feature matrix X_pol, and the spatial feature matrix X_spa into the network, and using the network to learn higher-level superpixel features X_c ∈ R^{M×D_c}, where D_c represents the feature dimension of the superpixels output by the network; the propagation rule of the superpixel-hypergraph convolutional neural network is as follows:
X^(l+1) = σ(D_v^{-1/2} H_fuse W D_e^{-1} H_fuse^T D_v^{-1/2} X^(l) Θ^(l))
where σ(·) is the ReLU activation function, W is the trainable hyperedge weight matrix, H_fuse is formed by splicing the polarization feature correlation matrix H_pol and the spatial feature correlation matrix H_spa, X^(l) and X^(l+1) respectively represent the input and output features of the lth layer, the initial input feature X^(0) is formed by splicing the polarization feature matrix X_pol and the spatial feature matrix X_spa, D_v = Σ_{e∈E} W(e) H_fuse(v, e) denotes the vertex degree, D_e = Σ_{v∈V} H_fuse(v, e) denotes the hyperedge degree, and Θ^(l) represents a trainable filter matrix;
in a specific implementation, the trainable hyperedge weight matrix W has size 2M×2M, the trainable filter matrix Θ (0) in the first superpixel-hypergraph convolution layer has size 8×32, the trainable filter matrix Θ (1) in the second superpixel-hypergraph convolution layer has size 32×64, and the feature dimension D c of the superpixels output by the two-layer superpixel-hypergraph convolutional neural network is 64;
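The propagation rule above follows the standard hypergraph convolution formulation; a minimal numpy sketch is given below. The random toy incidence matrix, weights and dimensions are illustrative assumptions, not the patent's trained values.

```python
import numpy as np

def hypergraph_conv(X, H_fuse, W_edge, Theta):
    """One superpixel-hypergraph convolution layer:
    X_out = ReLU(Dv^-1/2 H W De^-1 H^T Dv^-1/2 X Theta).

    X      : (M, d_in)     input superpixel features.
    H_fuse : (M, E)        incidence matrix (concatenated H_pol and H_spa).
    W_edge : (E,)          trainable hyperedge weights (diagonal of W).
    Theta  : (d_in, d_out) trainable filter matrix.
    """
    Dv = H_fuse @ W_edge                    # vertex degrees: sum_e W(e) H(v,e)
    De = H_fuse.sum(axis=0)                 # hyperedge degrees: sum_v H(v,e)
    Dv_inv_sqrt = np.diag(1.0 / np.sqrt(Dv))
    De_inv = np.diag(1.0 / De)
    W = np.diag(W_edge)
    A = Dv_inv_sqrt @ H_fuse @ W @ De_inv @ H_fuse.T @ Dv_inv_sqrt
    return np.maximum(A @ X @ Theta, 0.0)   # ReLU activation

rng = np.random.default_rng(0)
M = 5                                       # 5 toy superpixels, 2M hyperedges
H_fuse = np.hstack([np.eye(M), np.ones((M, M)) / M])
X0 = rng.standard_normal((M, 8))            # 8-D input (6 polarization + 2 spatial)
out = hypergraph_conv(X0, H_fuse, np.ones(2 * M), rng.standard_normal((8, 32)))
```

Stacking two such layers with filter shapes 8×32 and 32×64 reproduces the dimensions stated in the text (D c = 64).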
step 402, utilizing a superpixel-to-pixel conversion matrix Q ∈ R (Ih·Iw)×M , whose elements are computed as:
Q(i, j) = 1, if p i ∈ S j ; Q(i, j) = 0, otherwise
where Q(i, j) represents the element in row i and column j of the superpixel-pixel conversion matrix Q, i = 1, …, I h ×I w , j = 1, …, M, p i represents the ith pixel point of the polarized SAR image, and S j represents the jth superpixel; the superpixel features X sup output by the superpixel-hypergraph convolutional neural network are then converted into pixel-level features X pixel = Q·X sup ∈ R (Ih·Iw)×Dc , wherein I h I w represents the total number of pixel points of the polarized SAR image;
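The superpixel-to-pixel conversion of step 402 amounts to a one-hot scatter of superpixel features back onto the pixel grid. A minimal numpy sketch follows; the `labels` array standing in for the SLIC segmentation output is a toy assumption.

```python
import numpy as np

def superpixel_to_pixel(labels, X_sup):
    """Scatter superpixel-level features back to pixel level: X_pixel = Q @ X_sup.

    labels : (Ih, Iw) int array, labels[r, c] = index of the superpixel
             containing pixel (r, c) (the output of SLIC segmentation).
    X_sup  : (M, Dc) superpixel features from the hypergraph network.
    """
    Ih, Iw = labels.shape
    M, Dc = X_sup.shape
    # Q(i, j) = 1 iff pixel i belongs to superpixel j (one-hot rows).
    Q = np.zeros((Ih * Iw, M))
    Q[np.arange(Ih * Iw), labels.ravel()] = 1.0
    return Q @ X_sup                        # (Ih*Iw, Dc) pixel-level features

labels = np.array([[0, 0, 1], [2, 2, 1]])   # toy 2x3 segmentation
X_sup = np.array([[1.0, 10.0], [2.0, 20.0], [3.0, 30.0]])
X_pixel = superpixel_to_pixel(labels, X_sup)
```

Every pixel simply inherits the feature vector of the superpixel it belongs to; in practice `labels.ravel()` indexing avoids materializing the sparse Q at all.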
step five, constructing a feature reconstruction network and a local feature extraction network, and fusing reconstruction features and local features to realize feature enhancement:
step 501, constructing a feature reconstruction network and inputting the obtained pixel-level features into it; the feature reconstruction expression is
X rec (l+1) = f rec ( W rec (l) X rec (l) + b (l) )
wherein X rec (l+1) represents the output of the lth hidden layer, W rec (l) represents the feature reconstruction network weights, b (l) represents the feature reconstruction network bias, and f rec (·) represents the activation function of the feature reconstruction network;
in a specific implementation, the feature reconstruction network consists of three fully connected layers whose output dimensions are 64, 32 and 6, respectively;
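The three-layer fully connected reconstruction network of step 501 can be sketched as below. The random weights, zero biases and the uniform ReLU activation are assumptions for illustration only; the first hidden output (dimension 64) is the one reused for classification in step 503.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def feature_reconstruction(X_pixel, weights, biases):
    """Apply X^(l+1) = f_rec(X^(l) W^(l) + b^(l)) for three FC layers
    (widths 64 -> 32 -> 6 as stated in the text) and return every
    hidden output, since the first one is reused for classification."""
    h = X_pixel
    hidden = []
    for W, b in zip(weights, biases):
        h = relu(h @ W + b)
        hidden.append(h)
    return hidden

rng = np.random.default_rng(1)
dims = [64, 64, 32, 6]                      # input Dc=64, then the three layer widths
weights = [rng.standard_normal((dims[i], dims[i + 1])) * 0.1 for i in range(3)]
biases = [np.zeros(d) for d in dims[1:]]
outs = feature_reconstruction(rng.standard_normal((10, 64)), weights, biases)
```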
step 502, constructing a local feature extraction network to extract local features X local of the polarized SAR image; the local feature extraction expression is
X local (l+1) = f( W conv (l) * X local (l) + b conv (l) )
wherein X local (l) represents the local features extracted by the lth layer, W conv (l) represents the convolution kernel, b conv (l) represents the local feature extraction network bias, * represents the convolution operation, and f(·) represents the activation function of the local feature extraction network;
in a specific implementation, the local feature extraction network consists of four convolutional layers with kernel sizes 1×1, 5×5, 1×1 and 5×5, the output dimensions of the layers are 32, 32, 64 and 64, respectively, and the input of the local feature extraction network is the initial whole polarized SAR image;
step 503, concatenating the output of the feature reconstruction network with the local features X local to realize feature enhancement and obtain the overall features X total finally used for classification;
in a specific implementation, the output of the first layer of the feature reconstruction network is concatenated with the local features X local ; the first-layer output has dimension 64 and the local features X local have dimension 64, so the resulting X total has dimension 128;
step six, training the network with the training set and outputting the classification result:
step 601, inputting the overall features X total into a Softmax classifier to obtain the prediction result P jc for each pixel point in the image;
step 602, randomly selecting a proportion r = 5% of training samples from each class to form the training set and training the network; the loss function during network training is L = αL rec + L c , wherein L rec is the reconstruction loss, L c is the classification loss, and α is a balance parameter set to 0.05; the reconstruction loss L rec and the classification loss L c are computed as:
L rec = (1/N) Σ j=1 N || x j − x̂ j || 2
L c = − Σ j=1 N Σ c=1 C Y jc log(P jc )
wherein N represents the number of training samples, C represents the number of classes, x j and x̂ j represent the original and reconstructed features of the jth sample, Y jc represents the true label of the jth sample for class c, and P jc represents the predicted probability.
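The combined training objective of step 602 can be sketched directly. This is an illustrative implementation; the mean-squared-error form of the reconstruction target and the numerical-stability epsilon are assumptions for the example.

```python
import numpy as np

def combined_loss(X_in, X_rec, Y, P, alpha=0.05):
    """Total training loss L = alpha * L_rec + L_c.

    X_in : (N, 6) original 6-D polarization features of the training pixels.
    X_rec: (N, 6) features produced by the reconstruction branch.
    Y    : (N, C) one-hot true labels.
    P    : (N, C) Softmax output probabilities.
    """
    N = X_in.shape[0]
    # Reconstruction loss: mean squared error over the N training samples.
    L_rec = np.sum((X_in - X_rec) ** 2) / N
    # Classification loss: cross-entropy (epsilon guards against log(0)).
    L_c = -np.sum(Y * np.log(P + 1e-12))
    return alpha * L_rec + L_c

# Toy check: perfect reconstruction and confident correct predictions
X = np.ones((4, 6))
Y = np.eye(2)[[0, 1, 0, 1]]
P = np.clip(Y, 1e-6, 1 - 1e-6)
loss = combined_loss(X, X, Y, P)
```

With a perfect reconstruction and near-one probabilities on the true classes, the total loss is close to zero, as expected.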
The effectiveness of the invention can be further confirmed by the following simulation experiments:
1. experimental conditions and methods
The hardware platform is as follows: inter (R) Core (TM) i5-10600K CPU@4.10GHZ, 16.0GB RAM;
the software platform is as follows: pytrch 1.10;
the experimental method comprises the following steps: respectively, a convolutional neural network, a graph convolutional neural network and the method of the present invention.
2. Simulation content and results
The image of the Oberpfaffenhofen area of Germany shown in fig. 2 is taken as the test image, and classification simulations are carried out on it using a convolutional neural network, a graph convolutional neural network and the method of the present invention; the classification results are shown in fig. 3, where fig. 3 (a) is the result of the convolutional neural network, fig. 3 (b) is the result of the graph convolutional neural network, and fig. 3 (c) is the result of the present invention. As can be seen from fig. 3, compared with the convolutional neural network method, the method of the present invention produces far fewer noisy misclassified pixel points in the classification result, and compared with the graph convolutional neural network method, its classification result is more accurate. Table 1 shows the classification accuracy on the Oberpfaffenhofen, Germany image, where OA denotes the overall classification accuracy; it can be seen from table 1 that the method of the present invention achieves higher classification accuracy than both the convolutional neural network and the graph convolutional neural network.
TABLE 1 Oberpfaffenhofen region of Germany image classification results
The above embodiments are only examples of the present invention, and are not intended to limit the present invention, and all simple modifications, changes and equivalent structural changes made to the above embodiments according to the technical spirit of the present invention still fall within the protection scope of the technical solution of the present invention.

Claims (8)

1. A polarimetric SAR image classification method based on a superpixel-hypergraph feature enhancement network, characterized by comprising the following steps:
Step 1, preprocessing the input data of the polarimetric SAR image:
Step 101, converting the polarimetric scattering matrix S = [[S hh , S hv ], [S vh , S vv ]] of the polarimetric SAR under the Pauli basis to obtain a coherency matrix T, where S hh and S vv denote the co-polarization components, S hv and S vh denote the cross-polarization components, and h and v denote horizontal polarization and vertical polarization, respectively;
Step 102, transforming the coherency matrix T to obtain a 6-dimensional initial polarization feature vector, and further obtaining polarimetric SAR input data of size I h ×I w ×6, where I h and I w denote the height and width of the polarimetric SAR image, respectively;
Step 2, obtaining a polarimetric SAR superpixel set by superpixel segmentation:
Step 201, segmenting the polarimetric SAR image into M superpixel blocks using the simple linear iterative clustering (SLIC) superpixel segmentation algorithm;
Step 202, forming the polarimetric SAR superpixel set S = {S 1 , …, S i , …, S M }, where S i denotes the ith superpixel block;
Step 3, generating a superpixel polarization feature matrix and a spatial feature matrix, and constructing a polarization feature correlation matrix and a spatial feature correlation matrix:
Step 301, in the polarimetric SAR superpixel set S, computing the mean of the polarization features of all pixel points within each superpixel to generate the polarization feature matrix X pol ∈ R M×6 , and computing the mean of the horizontal and vertical coordinates of all pixel points within each superpixel to generate the spatial feature matrix X spa ∈ R M×2 , where x i pol and x i spa denote the polarization feature vector and the spatial feature vector of superpixel S i , respectively;
Step 302, in the superpixel set S, computing the similarity between superpixel polarization features using the polarization feature matrix; for each superpixel, selecting the k highest similarity values using the k-nearest-neighbor algorithm and setting the remaining similarity values to 0, to generate the polarization feature correlation matrix H pol ∈ R M×M ;
Step 303, in the superpixel set S, computing the similarity between superpixel spatial features using the spatial feature matrix; for each superpixel, finding the similarity values with its spatially adjacent superpixels and setting the remaining spatial feature similarity values to 0, to generate the spatial feature correlation matrix H spa ∈ R M×M ;
Step 4, constructing a two-layer superpixel-hypergraph convolutional neural network and performing superpixel-level to pixel-level feature conversion:
Step 401, constructing a two-layer superpixel-hypergraph convolutional neural network; inputting the polarization feature correlation matrix H pol , the spatial feature correlation matrix H spa , the polarization feature matrix X pol and the spatial feature matrix X spa into the network, and learning higher-level superpixel features X sup ∈ R M×Dc with the network, where D c denotes the feature dimension of the superpixels output by the network;
Step 402, converting the superpixel features X sup output by the superpixel-hypergraph convolutional neural network into pixel-level features X pixel ∈ R (Ih·Iw)×Dc using a superpixel-to-pixel conversion matrix Q ∈ R (Ih·Iw)×M , where I h I w denotes the total number of pixel points of the polarimetric SAR image;
Step 5, constructing a feature reconstruction network and a local feature extraction network, and fusing the reconstructed features and local features to realize feature enhancement:
Step 501, constructing a feature reconstruction network and inputting the obtained pixel-level features into it, the feature reconstruction expression being X rec (l+1) = f rec (W rec (l) X rec (l) + b (l) ), where X rec (l+1) denotes the output of the lth hidden layer, W rec (l) denotes the feature reconstruction network weights, b (l) denotes the feature reconstruction network bias, and f rec (·) denotes the activation function of the feature reconstruction network;
Step 502, constructing a local feature extraction network to extract local features X local of the polarimetric SAR image, the local feature extraction expression being X local (l+1) = f(W conv (l) * X local (l) + b conv (l) ), where X local (l) denotes the local features extracted by the lth layer, W conv (l) denotes the convolution kernel, b conv (l) denotes the local feature extraction network bias, and f(·) denotes the activation function of the feature extraction network;
Step 503, concatenating the output of the feature reconstruction network with the local features X local to realize feature enhancement, obtaining the overall features X total finally used for classification;
Step 6, training the network with the training set and outputting the classification result:
Step 601, inputting the overall features X total into a Softmax classifier to obtain the prediction result P jc for each pixel point in the image;
Step 602, randomly selecting a proportion r of training samples from each class to form the training set and training the network, the loss function during network training being L = αL rec + L c , where L rec is the reconstruction loss, L c is the classification loss, and α is a balance parameter.
2. The polarimetric SAR image classification method based on a superpixel-hypergraph feature enhancement network according to claim 1, characterized in that: in step 101, the coherency matrix T obtained by converting the polarimetric scattering matrix S under the Pauli basis is given by
T = ⟨ k k H ⟩, with the Pauli scattering vector k = (1/√2) [S hh + S vv , S hh − S vv , 2S hv ] T
where ⟨·⟩ denotes ensemble (multi-look) averaging and (·) H denotes the conjugate transpose.
3. The polarimetric SAR image classification method based on a superpixel-hypergraph feature enhancement network according to claim 1, characterized in that: in step 102, the 6-dimensional initial polarization feature vector obtained from the coherency matrix T is computed as:
f 1 = 10log 10 (T 11 + T 22 + T 33 )
f 2 = T 22 /(T 11 + T 22 + T 33 )
f 3 = T 33 /(T 11 + T 22 + T 33 )
(the formulas for f 4 , f 5 and f 6 are rendered as images in the source document)
where T ij (i = 1, 2, 3; j = 1, 2, 3) denotes the element in row i and column j of the matrix T, and f i (i = 1, …, 6) denotes the ith-dimension polarization feature value.
4. The polarimetric SAR image classification method based on a superpixel-hypergraph feature enhancement network according to claim 1, characterized in that: in step 302, the polarization feature correlation matrix H pol is generated as:
h pol (i, j) = exp(−||x i pol − x j pol || 2 / β), if x j pol ∈ KNN(x i pol ); h pol (i, j) = 0, otherwise
where h pol (i, j) denotes the element in row i and column j of the polarization feature correlation matrix H pol , β denotes an adjustable parameter, x i pol denotes the 6-dimensional polarization feature vector of superpixel S i , and KNN(·) denotes the k nearest neighbors in terms of polarization features.
5. The polarimetric SAR image classification method based on a superpixel-hypergraph feature enhancement network according to claim 1, characterized in that: in step 303, the spatial feature correlation matrix H spa is generated as:
h spa (i, j) = exp(−||x i spa − x j spa || 2 / γ), if S j ∈ neighbor(S i ); h spa (i, j) = 0, otherwise
where h spa (i, j) denotes the element in row i and column j of the spatial feature correlation matrix H spa , γ denotes an adjustable parameter, x i spa denotes the 2-dimensional spatial feature vector of superpixel S i , and neighbor(·) denotes the set of spatially adjacent superpixels.
6. The polarimetric SAR image classification method based on a superpixel-hypergraph feature enhancement network according to claim 1, characterized in that: in step 401, the propagation rule of the superpixel-hypergraph convolutional neural network is:
X (l+1) = σ( D v −1/2 H fuse W D e −1 H fuse T D v −1/2 X (l) Θ (l) )
where σ(·) is the ReLU activation function, W is the trainable hyperedge weight matrix, H fuse is formed by concatenating the polarization feature correlation matrix H pol and the spatial feature correlation matrix H spa , X (l) and X (l+1) respectively denote the input and output features of the lth superpixel-hypergraph convolution layer, the initial input feature X (0) is formed by concatenating the polarization feature matrix X pol and the spatial feature matrix X spa , D v = Σ e∈E W(e)H fuse (v, e) denotes the vertex degree, D e = Σ v∈V H fuse (v, e) denotes the hyperedge degree, and Θ (l) denotes a trainable filter matrix.
7. The polarimetric SAR image classification method based on a superpixel-hypergraph feature enhancement network according to claim 1, characterized in that: in step 402, the elements of the superpixel-to-pixel conversion matrix Q ∈ R (Ih·Iw)×M are computed as:
Q(i, j) = 1, if p i ∈ S j ; Q(i, j) = 0, otherwise
where Q(i, j) denotes the element in row i and column j of the superpixel-pixel conversion matrix Q, i = 1, …, I h ×I w , j = 1, …, M, p i denotes the ith pixel point of the polarimetric SAR image, and S j denotes the jth superpixel.
8. The polarimetric SAR image classification method based on a superpixel-hypergraph feature enhancement network according to claim 1, characterized in that: in step 602, the reconstruction loss L rec and the classification loss L c are computed as:
L rec = (1/N) Σ j=1 N || x j − x̂ j || 2
L c = − Σ j=1 N Σ c=1 C Y jc log(P jc )
where N denotes the number of training samples, C denotes the number of classes, x j and x̂ j denote the original and reconstructed features of the jth sample, Y jc denotes the true label of the jth sample for class c, and P jc denotes the predicted probability.
CN202211322174.0A 2022-10-27 2022-10-27 A polarimetric SAR image classification method based on superpixel-hypergraph feature enhancement network Active CN115578599B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211322174.0A CN115578599B (en) 2022-10-27 2022-10-27 A polarimetric SAR image classification method based on superpixel-hypergraph feature enhancement network

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211322174.0A CN115578599B (en) 2022-10-27 2022-10-27 A polarimetric SAR image classification method based on superpixel-hypergraph feature enhancement network

Publications (2)

Publication Number Publication Date
CN115578599A true CN115578599A (en) 2023-01-06
CN115578599B CN115578599B (en) 2025-07-04

Family

ID=84587433

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211322174.0A Active CN115578599B (en) 2022-10-27 2022-10-27 A polarimetric SAR image classification method based on superpixel-hypergraph feature enhancement network

Country Status (1)

Country Link
CN (1) CN115578599B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117315381A (en) * 2023-11-30 2023-12-29 昆明理工大学 Hyperspectral image classification method based on second-order biased random walk

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20190332942A1 (en) * 2016-12-29 2019-10-31 Zhejiang Gongshang University Method for generating spatial-temporally consistent depth map sequences based on convolution neural networks
CN113486967A (en) * 2021-07-15 2021-10-08 南京中科智慧应急研究院有限公司 SAR image classification algorithm combining graph convolution network and Markov random field
CN114067152A (en) * 2022-01-14 2022-02-18 南湖实验室 Refined flood inundated area extraction method based on satellite-borne SAR image
CN114764884A (en) * 2022-01-04 2022-07-19 西安理工大学 End-to-end polarization SAR image classification method based on superpixel and graph convolution
CN114897878A (en) * 2022-06-08 2022-08-12 合肥工业大学 SAR image change detection method based on graph convolution network

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20190332942A1 (en) * 2016-12-29 2019-10-31 Zhejiang Gongshang University Method for generating spatial-temporally consistent depth map sequences based on convolution neural networks
CN113486967A (en) * 2021-07-15 2021-10-08 南京中科智慧应急研究院有限公司 SAR image classification algorithm combining graph convolution network and Markov random field
CN114764884A (en) * 2022-01-04 2022-07-19 西安理工大学 End-to-end polarization SAR image classification method based on superpixel and graph convolution
CN114067152A (en) * 2022-01-14 2022-02-18 南湖实验室 Refined flood inundated area extraction method based on satellite-borne SAR image
CN114897878A (en) * 2022-06-08 2022-08-12 合肥工业大学 SAR image change detection method based on graph convolution network

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
王剑; 王英华; 刘宏伟; 何敬鲁: "PolSAR image change detection method based on deep convolutional neural networks", Systems Engineering and Electronics, no. 07, 15 May 2018 (2018-05-15) *
王涛; 殷君君; 刘希韫; 黄晨霞; 杨健: "Gradient-based superpixel segmentation of polarimetric SAR images", Chinese Journal of Radio Science, no. 06, 15 December 2019 (2019-12-15) *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117315381A (en) * 2023-11-30 2023-12-29 昆明理工大学 Hyperspectral image classification method based on second-order biased random walk
CN117315381B (en) * 2023-11-30 2024-02-09 昆明理工大学 Hyperspectral image classification method based on second-order biased random walk

Also Published As

Publication number Publication date
CN115578599B (en) 2025-07-04

Similar Documents

Publication Publication Date Title
CN108388927B (en) Small sample polarimetric SAR ground object classification method based on deep convolutional Siamese network
Han et al. A semi-supervised generative framework with deep learning features for high-resolution remote sensing image scene classification
Deng et al. Extreme learning machines: new trends and applications
Chen et al. Subspace clustering using a symmetric low-rank representation
CN114581560B (en) Multi-scale neural network infrared image colorization method based on attention mechanism
CN116258914B (en) Remote Sensing Image Classification Method Based on Machine Learning and Local and Global Feature Fusion
Wu et al. Multiscale CNN with autoencoder regularization joint contextual attention network for SAR image classification
CN111652247A (en) A Dipteran Insect Recognition Method Based on Deep Convolutional Neural Networks
CN112861970B (en) Fine-grained image classification method based on feature fusion
CN116543192B (en) A small sample classification method for remote sensing images based on multi-view feature fusion
Zhai et al. Region-aware quantum network for crowd counting
CN109978071A (en) Hyperspectral image classification method based on data augmentation and Multiple Classifier Fusion
CN113298129A (en) Polarized SAR image classification method based on superpixel and graph convolution network
Verma et al. Wild animal detection from highly cluttered images using deep convolutional neural network
Ren et al. Clustering-oriented multiple convolutional neural networks for single image super-resolution
CN114283326A (en) An underwater target re-identification method combining local perception and high-order feature reconstruction
Dong et al. Scale-recursive network with point supervision for crowd scene analysis
CN106096658A (en) Based on the Aerial Images sorting technique without supervision deep space feature coding
Chen et al. An offset graph U-Net for hyperspectral image classification
CN108985161B (en) A low-rank sparse representation image feature learning method based on Laplace regularization
CN114049500A (en) Image evaluation method and system based on meta-learning reweighted network pseudo-label training
CN110210321A (en) Deficient sample face recognition method based on multi-dimentional scale converting network Yu divided group method
Feng et al. An Insulator defect detection network combining bidirectional feature pyramid network and attention mechanism in unmanned aerial vehicle images
CN117726856A (en) Remote sensing image crop classification method based on convolutional network spatiotemporal fusion
CN111563528A (en) SAR image classification method based on multi-scale feature learning network and bilateral filtering

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant