
CN107871123B - Inverse synthetic aperture radar space target classification method and system

Inverse synthetic aperture radar space target classification method and system

Info

Publication number
CN107871123B
Authority
CN
China
Prior art keywords
target
information
determining
distribution
scattering
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201711129995.1A
Other languages
Chinese (zh)
Other versions
CN107871123A (en)
Inventor
李飞
刘丹
余继周
魏耀
王宁
陈成增
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Pinghu Space Perception Laboratory Technology Co ltd
Beijing Institute of Radio Measurement
Original Assignee
Beijing Institute of Radio Measurement
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Institute of Radio Measurement filed Critical Beijing Institute of Radio Measurement
Priority to CN201711129995.1A priority Critical patent/CN107871123B/en
Publication of CN107871123A publication Critical patent/CN107871123A/en
Application granted granted Critical
Publication of CN107871123B publication Critical patent/CN107871123B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/10Terrestrial scenes
    • G06V20/13Satellite images
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/21Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/213Feature extraction, e.g. by transforming the feature space; Summarisation; Mappings, e.g. subspace methods
    • G06F18/2135Feature extraction, e.g. by transforming the feature space; Summarisation; Mappings, e.g. subspace methods based on approximation criteria, e.g. principal component analysis
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/24Classification techniques
    • G06F18/241Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G06F18/2411Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on the proximity to a decision surface, e.g. support vector machines
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/40Extraction of image or video features
    • G06V10/44Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Data Mining & Analysis (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • General Engineering & Computer Science (AREA)
  • Artificial Intelligence (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Multimedia (AREA)
  • Astronomy & Astrophysics (AREA)
  • Remote Sensing (AREA)
  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)
  • Image Analysis (AREA)

Abstract

The embodiment of the invention relates to a method and a system for classifying space targets with an inverse synthetic aperture radar, and belongs to the technical field of radar. The method comprises the following steps: determining, according to a collected target ISAR image, the intensity information and the position information of target scattering points in the target ISAR image; determining feature information of the target ISAR image according to the position information and a preset distribution rule, the feature information comprising geometric structure feature information and scattering distribution feature information; determining a feature vector according to the intensity information, the geometric structure feature information and the scattering distribution feature information; and determining the class of the space target corresponding to the target ISAR image according to the feature vector and a preset classifier. The technical scheme provided by the embodiment, on the one hand, avoids the technical defects of the prior art that the classification and recognition effects are poor and the limitations are large; on the other hand, it achieves the technical effect of classifying space targets accurately and efficiently.

Description

Inverse synthetic aperture radar space target classification method and system
Technical Field
The embodiment of the invention relates to the technical field of radar, in particular to a method and a system for classifying space targets of inverse synthetic aperture radar.
Background
With the development of aerospace technology, space-based systems such as satellites play an increasingly important role in national defense and security. The development of the United States is highly dependent on space systems; however, the extensive research, development and application of space on-orbit operation technology poses a greater challenge to space safety, and the demand for space target classification (satellite targets and space debris targets) is increasingly urgent for improving space safety and strengthening space target monitoring.
Inverse Synthetic Aperture Radar (ISAR), as a high-resolution two-dimensional imaging device, can obtain high range resolution by transmitting a large-bandwidth signal and improve the cross-range resolution by means of the Doppler information generated by the relative motion between the target and the radar. It can obtain abundant target structure information all day and in all weather, provides powerful support for radar target feature extraction, classification and recognition, and greatly improves radar battlefield perception capability.
In the prior art, recognition is performed on the basis of region features. Region features are another representation of target shape besides contour features; they relate to the whole shape region, that is, the shape contour together with the area it encloses. Recognition based on region features directly extracts features of the ISAR image region for classification, including Fourier transform features, wavelet transform features, invariant moment features and the like. As with contour features, the effectiveness of region features depends on target region segmentation, and the imaging mechanism and scattering properties of ISAR images cause blurring of the target region and missing components, which degrades the classification and recognition performance.
Disclosure of Invention
In order to solve the above technical problems, embodiments of the present invention provide a method and a system for classifying a spatial target of an inverse synthetic aperture radar.
According to a first aspect of the embodiments of the present invention, an embodiment of the present invention provides an inverse synthetic aperture radar spatial target classification method, including:
according to the collected target ISAR image, determining the intensity information and the position information of a target scattering point in the target ISAR image;
determining feature information of the target ISAR image according to the position information and a preset distribution rule, wherein the feature information comprises: geometrical structure characteristic information and scattering distribution characteristic information;
determining a feature vector according to the intensity information, the geometric structure feature information and the scattering distribution feature information;
and determining the category of the space target corresponding to the target ISAR image according to the feature vector and a preset classifier.
In this embodiment, the feature information of the ISAR image is determined according to the position information and the distribution rule, the feature vector is determined according to the intensity information and the feature information, and the class of the space target corresponding to the ISAR image is determined according to the feature vector and the classifier. On the one hand, this avoids the technical defect of the prior art that, when recognition is based on contour features, factors such as occlusion and scattering-center distribution affect ISAR image recognition and limit the robustness of target classification; on the other hand, the technical effect of classifying space targets accurately and efficiently is achieved.
Further, the determining the intensity information and the position information of the target scattering point of the target ISAR image according to the collected target ISAR image specifically includes:
determining the noise intensity of an initial target scattering point according to the target ISAR image;
selecting the initial target scattering points with the noise intensity larger than a preset detection threshold from the initial target scattering points as the target scattering points;
and determining the intensity information and the position information according to the target scattering points.
In this embodiment, scattering points meeting the requirement are selected from the initial target scattering points as the target scattering points according to the noise intensity of the initial target scattering points and the detection threshold, and the intensity information and the position information are then determined from the target scattering points. Noise scattering points are thereby removed quickly, achieving the technical effect of determining the target scattering points quickly and accurately, and in turn the technical effect of classifying space targets accurately and efficiently.
Further, the method further comprises:
determining the detection threshold according to formula 1, formula 1:
thr=mean(I)·C
wherein mean () is an averaging operation, I is the target ISAR image, and C is a preset constant greater than 0.
Further, the intensity information is determined according to equation 1-1, equation 1-1:
[equation 1-1: equation image not reproduced]
wherein σ_i is the intensity corresponding to the i-th target scattering point.
Further, the determining the feature information according to the location information and a preset distribution rule specifically includes:
determining the primary direction information and the secondary direction information of the target scattering point according to the position information and a PCA algorithm;
determining main direction distribution length information of the main direction information and secondary direction distribution length information of the secondary direction information according to a histogram method;
determining distribution entropy information of the target scattering point in the main direction according to the main direction information and a preset calculation formula, and determining distribution entropy information of the target scattering point in the secondary direction according to the secondary direction information and the calculation formula;
determining the geometrical structure characteristic information according to the main direction distribution length information and the secondary direction distribution length information;
and determining the scattering distribution characteristic information according to the distribution entropy information of the main direction and the distribution entropy information of the secondary direction.
Further, the determining the primary direction information and the secondary direction information of the target scattering point according to the position information and the PCA algorithm specifically includes:
determining a position covariance matrix of the scattering points of the target according to equation 2, equation 2:
Σ = (1/N) ∑_{i=1}^{N} (x_i − μ)(x_i − μ)^T
μ = (1/N) ∑_{i=1}^{N} x_i
wherein N is the number of target scattering points, x_i represents the column vector corresponding to the position information of the i-th target scattering point, μ represents the average column vector of the position information of all target scattering points, and (·)^T represents vector transposition;
performing eigenvalue decomposition on the position covariance matrix according to formula 3 to obtain the eigenvalues and eigenvectors of the position covariance matrix, formula 3:
Σ = UΛU^T
wherein Λ is a diagonal matrix whose R-th diagonal value is the R-th eigenvalue λ_R of the position covariance matrix Σ; the R-th column of U is the eigenvector corresponding to the R-th eigenvalue λ_R; R is a positive integer ranging from 1 to the dimension of the position covariance matrix.
Further, the determining, according to a histogram method, the primary direction distribution length information of the primary direction information and the secondary direction distribution length information of the secondary direction information specifically includes:
obtaining the projection of the target scattering points in the main direction u_1 according to formula 4, formula 4:
u'_1 = X^T · u_1
wherein X is the position matrix of the target scattering points, i.e., the position matrix from which the position covariance matrix is computed;
performing uniform discrete gridding on the continuous distribution interval of u'_1 and counting the number of target scattering points contained in each grid to obtain the distribution f_1 of the target scattering points in the main direction, wherein ind_1m denotes the projection values, in the main direction, of the target scattering points whose grid satisfies f_1m > c·N (N being the total number of target scattering points and c a preset constant), f_1m is the m-th element of the distribution f_1, m = 1, 2, …, M_1, and M_1 is the number of discrete grids in the main direction; determining the length L_1 corresponding to the distribution length information of the target scattering points in the main direction according to formula 5, formula 5:
L_1 = max(ind_1m) − min(ind_1m)
wherein max(·) represents taking the maximum value and min(·) represents taking the minimum value;
obtaining the projection of the target scattering points in the secondary direction u_2 according to formula 6, formula 6:
u'_2 = X^T · u_2
wherein X is the position matrix of the target scattering points, i.e., the position matrix from which the position covariance matrix is computed;
performing uniform discrete gridding on the continuous distribution interval of u'_2 and counting the number of target scattering points contained in each grid to obtain the distribution f_2 of the target scattering points in the secondary direction, wherein ind_2n denotes the projection values, in the secondary direction, of the target scattering points whose grid satisfies f_2n > c·N, f_2n is the n-th element of the distribution f_2, n = 1, 2, …, M_2, and M_2 is the number of discrete grids in the secondary direction; determining the length L_2 corresponding to the distribution length information of the target scattering points in the secondary direction according to formula 7, formula 7:
L_2 = max(ind_2n) − min(ind_2n)
wherein max(·) represents taking the maximum value and min(·) represents taking the minimum value.
Further, the determining, according to the primary direction information and a preset calculation formula, distribution entropy information of the target scattering point in the primary direction, and determining, according to the secondary direction information and the calculation formula, distribution entropy information of the target scattering point in the secondary direction specifically includes:
determining the distribution entropy E_1 corresponding to the distribution entropy information of the target scattering points in the main direction according to formula 8, formula 8:
E_1 = −∑_{k=1}^{K_1} p_1k · log(p_1k)
p_1k = f'_1k / ∑_{k=1}^{K_1} f'_1k
wherein K_1 = L_1/Δ is the number of discrete grids in the main direction, Δ is the grid spacing determined from Δ_1 and Δ_2 (the defining equation image is not reproduced here), Δ_1 is the range resolution corresponding to the target scattering points, Δ_2 is the azimuth resolution of the target scattering points, f'_1 is the distribution of the target scattering points in the main direction obtained by performing uniform discrete gridding on the continuous distribution interval of u'_1 and counting the number of target scattering points contained in each grid, and f'_1k is the k-th element of f'_1;
determining the distribution entropy E_2 corresponding to the distribution entropy information of the target scattering points in the secondary direction according to formula 9, formula 9:
E_2 = −∑_{k=1}^{K_2} p_2k · log(p_2k)
p_2k = f'_2k / ∑_{k=1}^{K_2} f'_2k
wherein K_2 = L_2/Δ is the number of discrete grids in the secondary direction, Δ_1 and Δ_2 are as defined above, f'_2 is the distribution of the target scattering points in the secondary direction obtained by performing uniform discrete gridding on the continuous distribution interval of u'_2 and counting the number of target scattering points contained in each grid, and f'_2k is the k-th element of f'_2.
Further, before the determining the intensity information and the position information of the target scattering point of the target ISAR image according to the acquired target ISAR image, the method further includes:
determining a set of ISAR image samples from the plurality of ISAR images;
selecting at least one vector in the feature vectors of each ISAR image in the sample set to obtain a support vector and a weight coefficient corresponding to the support vector;
and obtaining the classifier according to the support vector and the weight coefficient.
Further, the determining, according to the feature vector and a preset classifier, a spatial target class corresponding to the target ISAR image specifically includes:
inputting the feature vector into the classifier;
the classifier outputs a spatial target class label corresponding to the target ISAR image according to equation 10, equation 10:
[equation 10: equation image not reproduced]
wherein ||·|| denotes the norm operation, α is a parameter of the classifier, V' is the feature vector of the target ISAR image, V_j is the j-th support vector selected from the feature vectors of the training samples, and ω_j is the weight coefficient corresponding to the j-th support vector;
determining the space target class corresponding to the target ISAR image according to formula 11, formula 11:
C = sign(y(V'))
wherein sign(·) takes the sign of its argument; when C is 1, the space target class is a satellite target, and when C is −1, the space target class is space debris.
According to another aspect of the embodiments of the present invention, there is provided an inverse synthetic aperture radar spatial object classification system corresponding to the above method, the system including: a memory, a processor, and a computer program stored on the memory and executable on the processor, wherein,
the processor, when executing the program, implements the method of any of the above embodiments.
The method has the advantages that the intensity information and the position information of the target scattering points in the target ISAR image are determined according to the collected target ISAR image; the feature information of the target ISAR image, comprising geometric structure feature information and scattering distribution feature information, is determined according to the position information and a preset distribution rule; a feature vector is determined according to the intensity information, the geometric structure feature information and the scattering distribution feature information; and the class of the space target corresponding to the target ISAR image is determined according to the feature vector and a preset classifier. This solves the technical problems of poor classification and recognition effects and large limitations in the prior art, and achieves the technical effects of accurate and efficient space target classification.
Drawings
Fig. 1 is a schematic flowchart of a method for classifying a spatial target in an inverse synthetic aperture radar according to an embodiment of the present invention;
fig. 2 is a schematic structural diagram of an inverse synthetic aperture radar space target classification system according to an embodiment of the present invention.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth such as the particular system architecture, interfaces, techniques, etc., in order to provide a thorough understanding of the present invention. It will be apparent, however, to one skilled in the art that the present invention may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, circuits, and methods are omitted so as not to obscure the description of the present invention with unnecessary detail.
The embodiment of the invention provides a method and a system for classifying space targets of an inverse synthetic aperture radar.
According to an aspect of the embodiments of the present invention, the embodiments of the present invention provide an inverse synthetic aperture radar space target classification method.
Referring to fig. 1, fig. 1 is a schematic flow chart of a method for classifying a spatial target of an inverse synthetic aperture radar according to an embodiment of the present invention.
As shown in fig. 1, the method includes:
step S100: according to the collected target ISAR image, determining the intensity information and the position information of a target scattering point in the target ISAR image;
step S200: determining the characteristic information of the target ISAR image according to the position information and a preset distribution rule, wherein the characteristic information comprises: geometrical structure characteristic information and scattering distribution characteristic information;
step S300: determining a feature vector according to the intensity information, the geometric structure feature information and the scattering distribution feature information;
step S400: and determining the category of the space target corresponding to the target ISAR image according to the feature vector and a preset classifier.
In this embodiment, the feature information of the ISAR image is determined according to the position information and the distribution rule, the feature vector is determined according to the intensity information and the feature information, and the class of the space target corresponding to the ISAR image is determined according to the feature vector and the classifier. On the one hand, this avoids the technical defect of the prior art that, when recognition is based on contour features, factors such as occlusion and scattering-center distribution affect ISAR image recognition and limit the robustness of target classification; on the other hand, the technical effect of classifying space targets accurately and efficiently is achieved.
In a possible implementation technical solution, step S100 specifically includes:
determining the noise intensity of an initial target scattering point according to the target ISAR image;
selecting initial target scattering points with the noise intensity larger than a preset detection threshold from the initial target scattering points as target scattering points;
and determining intensity information and position information according to the scattering points of the target.
In one possible implementation, the detection threshold is determined according to equation 1, where equation 1:
thr=mean(I)·C
where mean () is the averaging operation, I is the target ISAR image, and C is a preset constant greater than 0.
In one possible implementation, the intensity information is determined according to equation 1-1, equation 1-1:
[equation 1-1: equation image not reproduced]
wherein σ_i is the intensity corresponding to the i-th target scattering point.
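For illustration, a minimal Python sketch of this detection step is given below: it thresholds an ISAR image with thr = mean(I)·C and collects the intensity and position of the surviving scattering points. The function name, the default value of C, and the use of the image magnitude at each detected pixel as σ_i are assumptions for illustration and are not taken from the patent text.

```python
import numpy as np

def detect_scattering_points(I, C=3.0):
    """Detect target scattering points in an ISAR image.

    I : 2-D array of ISAR image magnitudes.
    C : preset constant (> 0) that scales the mean-based detection threshold.

    Returns (intensities, positions): the intensity of each detected scattering
    point and its (row, col) position, i.e. the position matrix X.
    """
    I = np.abs(np.asarray(I, dtype=float))
    thr = I.mean() * C                          # formula 1: thr = mean(I) * C
    rows, cols = np.nonzero(I > thr)            # keep pixels whose intensity exceeds thr
    intensities = I[rows, cols]                 # assumed sigma_i = image value at the point
    positions = np.stack([rows, cols], axis=1)  # N x 2 position matrix
    return intensities, positions
```

A call such as intensities, positions = detect_scattering_points(isar_image) yields the position matrix used in the PCA step described below.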
In a possible implementation technical solution, step S200 specifically includes:
determining main direction information and secondary direction information of the target scattering point according to the position information and a PCA algorithm;
determining main direction distribution length information of the main direction information and secondary direction distribution length information of the secondary direction information according to a histogram method;
determining distribution entropy information of the target scattering point in the main direction according to the main direction information and a preset calculation formula, and determining distribution entropy information of the target scattering point in the secondary direction according to the secondary direction information and a preset calculation formula;
determining geometrical structure characteristic information according to the main direction distribution length information and the secondary direction distribution length information;
and determining scattering distribution characteristic information according to the distribution entropy information of the main direction and the distribution entropy information of the secondary direction.
In a technical solution that may be implemented, determining primary direction information and secondary direction information of a target scattering point according to position information and a PCA algorithm specifically includes:
determining a position covariance matrix of the scattering points of the target according to equation 2, where equation 2:
Σ = (1/N) ∑_{i=1}^{N} (x_i − μ)(x_i − μ)^T
μ = (1/N) ∑_{i=1}^{N} x_i
wherein N is the number of target scattering points, x_i represents the column vector corresponding to the position information of the i-th target scattering point, μ represents the average column vector of the position information of all target scattering points, and (·)^T represents vector transposition;
performing eigenvalue decomposition on the position covariance matrix according to formula 3 to obtain the eigenvalues and eigenvectors of the position covariance matrix, formula 3:
Σ = UΛU^T
wherein Λ is a diagonal matrix whose R-th diagonal value is the R-th eigenvalue λ_R of the position covariance matrix Σ; the R-th column of U is the eigenvector corresponding to the R-th eigenvalue λ_R; R is a positive integer ranging from 1 to the dimension of the position covariance matrix.
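The PCA step of formulas 2 and 3 can be sketched with numpy as follows: form the position covariance matrix, decompose it, and take the eigenvectors of the largest and second-largest eigenvalues as the main direction u_1 and the secondary direction u_2. The function name and the 1/N normalization are illustrative assumptions.

```python
import numpy as np

def principal_directions(positions):
    """positions: (N, 2) array of scattering-point coordinates (position matrix X)."""
    X = np.asarray(positions, dtype=float)
    mu = X.mean(axis=0)                        # average position vector
    D = X - mu
    Sigma = D.T @ D / X.shape[0]               # formula 2: position covariance matrix
    eigvals, eigvecs = np.linalg.eigh(Sigma)   # formula 3: Sigma = U Lambda U^T
    order = np.argsort(eigvals)[::-1]          # eigenvalues in descending order
    lam1, lam2 = eigvals[order[0]], eigvals[order[1]]
    u1 = eigvecs[:, order[0]]                  # main direction (largest variance)
    u2 = eigvecs[:, order[1]]                  # secondary direction
    return lam1, lam2, u1, u2
```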
In a possible implementation technical solution, determining the primary direction distribution length information of the primary direction information and the secondary direction distribution length information of the secondary direction information according to a histogram method specifically includes:
obtaining the projection of the target scattering points in the main direction u_1 according to formula 4, formula 4:
u'_1 = X^T · u_1
wherein X is the position matrix of the target scattering points, i.e., the position matrix from which the position covariance matrix is computed;
performing uniform discrete gridding on the continuous distribution interval of u'_1 and counting the number of target scattering points contained in each grid to obtain the distribution f_1 of the target scattering points in the main direction, wherein ind_1m denotes the projection values, in the main direction, of the target scattering points whose grid satisfies f_1m > c·N (N being the total number of target scattering points and c a preset constant), f_1m is the m-th element of the distribution f_1, m = 1, 2, …, M_1, and M_1 is the number of discrete grids in the main direction; determining the length L_1 corresponding to the distribution length information of the target scattering points in the main direction according to formula 5, formula 5:
L_1 = max(ind_1m) − min(ind_1m)
wherein max(·) represents taking the maximum value and min(·) represents taking the minimum value;
obtaining the projection of the target scattering points in the secondary direction u_2 according to formula 6, formula 6:
u'_2 = X^T · u_2
wherein X is the position matrix of the target scattering points, i.e., the position matrix from which the position covariance matrix is computed;
performing uniform discrete gridding on the continuous distribution interval of u'_2 and counting the number of target scattering points contained in each grid to obtain the distribution f_2 of the target scattering points in the secondary direction, wherein ind_2n denotes the projection values, in the secondary direction, of the target scattering points whose grid satisfies f_2n > c·N, f_2n is the n-th element of the distribution f_2, n = 1, 2, …, M_2, and M_2 is the number of discrete grids in the secondary direction; determining the length L_2 corresponding to the distribution length information of the target scattering points in the secondary direction according to formula 7, formula 7:
L_2 = max(ind_2n) − min(ind_2n)
wherein max(·) represents taking the maximum value and min(·) represents taking the minimum value.
In this embodiment, the value of M_2 is 32.
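A sketch of the histogram-based length estimate of formulas 4 to 7: project the positions onto a direction, grid the projection interval into M uniform bins (M_2 = 32 in this embodiment), keep the projection values whose bin count exceeds c·N, and take their span. The default bin count, the default value of c, and the function name are illustrative assumptions.

```python
import numpy as np

def distribution_length(positions, u, M=32, c=0.0):
    """Distribution length of the target scattering points along direction u."""
    X = np.asarray(positions, dtype=float)
    proj = X @ u                                  # formulas 4/6: projection of each point onto u
    edges = np.linspace(proj.min(), proj.max(), M + 1)
    counts, _ = np.histogram(proj, bins=edges)    # distribution f over M uniform grids
    bin_idx = np.clip(np.digitize(proj, edges) - 1, 0, M - 1)
    keep = counts[bin_idx] > c * len(proj)        # projections whose bin satisfies f_m > c*N
    ind = proj[keep]                              # the ind values of formulas 5/7
    return float(ind.max() - ind.min())           # L = max(ind) - min(ind)
```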
In a possible implementation technical solution, determining distribution entropy information of the target scattering point in the primary direction according to the primary direction information and a preset calculation formula, and determining distribution entropy information of the target scattering point in the secondary direction according to the secondary direction information and the calculation formula specifically includes:
determining the distribution entropy E_1 corresponding to the distribution entropy information of the target scattering points in the main direction according to formula 8, formula 8:
E_1 = −∑_{k=1}^{K_1} p_1k · log(p_1k)
p_1k = f'_1k / ∑_{k=1}^{K_1} f'_1k
wherein K_1 = L_1/Δ is the number of discrete grids in the main direction, Δ is the grid spacing determined from Δ_1 and Δ_2 (the defining equation image is not reproduced here), Δ_1 is the range resolution corresponding to the target scattering points, Δ_2 is the azimuth resolution of the target scattering points, f'_1 is the distribution of the target scattering points in the main direction obtained by performing uniform discrete gridding on the continuous distribution interval of u'_1 and counting the number of target scattering points contained in each grid, and f'_1k is the k-th element of f'_1;
determining the distribution entropy E_2 corresponding to the distribution entropy information of the target scattering points in the secondary direction according to formula 9, formula 9:
E_2 = −∑_{k=1}^{K_2} p_2k · log(p_2k)
p_2k = f'_2k / ∑_{k=1}^{K_2} f'_2k
wherein K_2 = L_2/Δ is the number of discrete grids in the secondary direction, Δ_1 and Δ_2 are as defined above, f'_2 is the distribution of the target scattering points in the secondary direction obtained by performing uniform discrete gridding on the continuous distribution interval of u'_2 and counting the number of target scattering points contained in each grid, and f'_2k is the k-th element of f'_2.
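The distribution entropy of formulas 8 and 9 can be sketched as the entropy of the normalized histogram of the projections. The number of grids K is passed in by the caller because the exact definition of the grid spacing Δ in terms of Δ_1 and Δ_2 is not reproduced in this text, so K = L/Δ must be computed outside this sketch; the function name and the natural logarithm are assumptions.

```python
import numpy as np

def distribution_entropy(positions, u, K):
    """Distribution entropy of the target scattering points along direction u over K grids."""
    X = np.asarray(positions, dtype=float)
    proj = X @ u                                 # projection of the points onto u
    counts, _ = np.histogram(proj, bins=int(K))  # distribution f' over K uniform grids
    p = counts[counts > 0] / counts.sum()        # normalized distribution, empty bins dropped
    return float(-(p * np.log(p)).sum())         # E = -sum_k p_k log p_k (assumed form)
```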
In the present embodiment, for example: the main direction and the secondary direction of the target distribution are estimated by using a principal component analysis (PCA) method, and the eigenvalues, the distribution lengths and the distribution entropies of the target scattering points in the main and secondary directions are estimated, denoted λ_1, λ_2, L_1, L_2, E_1, E_2 respectively. The product and the ratio of the target eigenvalues are computed, denoted S_1 = λ_1·λ_2 and R_1 = λ_1/λ_2; the target distribution area is S_2 = L_1·L_2 and the main-to-secondary direction distribution length ratio is R_2 = L_1/L_2; the distribution entropy product and ratio of the target scattering points in the main and secondary directions are denoted S_3 = E_1·E_2 and R_3 = E_1/E_2, respectively.
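Combining the quantities above gives the geometric-structure and scattering-distribution features of this embodiment: λ_1, λ_2, L_1, L_2, E_1, E_2 and their products and ratios. The sketch below chains the illustrative helpers from the previous examples; treating exactly these twelve values (to be joined with an intensity-based feature) as the feature vector is an assumption about how the 13-dimensional vector is assembled, not a detail given in the patent text.

```python
import numpy as np

def structure_and_scattering_features(positions, K1, K2):
    """lambda1, lambda2, L1, L2, E1, E2 plus their products and ratios."""
    lam1, lam2, u1, u2 = principal_directions(positions)
    L1 = distribution_length(positions, u1)
    L2 = distribution_length(positions, u2)
    E1 = distribution_entropy(positions, u1, K1)
    E2 = distribution_entropy(positions, u2, K2)
    S1, R1 = lam1 * lam2, lam1 / lam2    # eigenvalue product and ratio
    S2, R2 = L1 * L2, L1 / L2            # distribution area and main/secondary length ratio
    S3, R3 = E1 * E2, E1 / E2            # entropy product and ratio
    return np.array([lam1, lam2, L1, L2, E1, E2, S1, R1, S2, R2, S3, R3])
```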
In a possible implementation solution, before step S100, the method further includes:
determining a set of ISAR image samples from the plurality of ISAR images;
selecting at least one vector in the feature vectors of each ISAR image in the sample set to obtain a support vector and a weight coefficient corresponding to the support vector;
and obtaining a classifier according to the support vector and the weight coefficient.
In this embodiment, it can be understood that possible classifiers include: linear discriminant analysis, the SVM, the relevance vector machine and the k-nearest-neighbor classifier. In this embodiment, an SVM classifier is taken as an example to describe the training process of the classifier.
In this embodiment, a feature vector consisting of 13 features is extracted and input into the SVM classifier. Specifically,
[the 13-dimensional feature vector is given by an equation image that is not reproduced here]
the feature vectors of the training data are input into the SVM classifier to obtain a group of feature vectors {V_1, V_2, …, V_Q} serving as support vectors and the corresponding weight coefficients {ω_1, ω_2, …, ω_Q}, wherein V_j is the j-th feature vector of the training samples selected as a support vector, ω_j is the corresponding j-th weight coefficient, and j = 1, 2, …, Q.
In a possible implementation technical solution, step S400 specifically includes:
inputting a feature vector into the classifier;
the classifier outputs a spatial target class label corresponding to the target ISAR image according to equation 10, equation 10:
[equation 10: equation image not reproduced]
wherein ||·|| denotes the norm operation, α is a parameter of the classifier, V' is the feature vector of the target ISAR image, V_j is the j-th support vector selected from the feature vectors of the training samples, and ω_j is the weight coefficient corresponding to the j-th support vector;
determining the space target class according to equation 11, equation 11:
C = sign(y(V'))
wherein sign(·) takes the sign of its argument; when C is 1, the space target class is a satellite target, and when C is −1, the space target class is space debris.
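As a hedged illustration of the training and classification steps, the sketch below uses scikit-learn's SVC in place of the unspecified SVM implementation of this embodiment; the RBF kernel and its parameters are assumptions, support_vectors_ and dual_coef_ play the role of the support vectors V_j and weight coefficients ω_j, and predict returns the class label C (+1 for a satellite target, −1 for space debris).

```python
import numpy as np
from sklearn.svm import SVC

def train_svm(train_features, train_labels):
    """train_features: (n_samples, 13) feature vectors; train_labels: +1 satellite, -1 debris."""
    clf = SVC(kernel="rbf", C=1.0, gamma="scale")   # assumed kernel and parameters
    clf.fit(np.asarray(train_features), np.asarray(train_labels))
    support_vectors = clf.support_vectors_          # the support vectors V_j
    weights = clf.dual_coef_.ravel()                # the signed weight coefficients omega_j
    return clf, support_vectors, weights

def classify(clf, V):
    """Return C = sign(y(V')) for a single 13-dimensional feature vector V."""
    return int(clf.predict(np.asarray(V, dtype=float).reshape(1, -1))[0])
```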
According to another aspect of the embodiments of the present invention, there is provided an inverse synthetic aperture radar spatial object classification system corresponding to the above method.
Referring to fig. 2, fig. 2 is a schematic structural diagram of an inverse synthetic aperture radar space target classification system according to an embodiment of the present invention.
As shown in fig. 2, the system includes: a memory, a processor, and a computer program stored on the memory and executable on the processor, wherein,
the processor executes the program to implement the method of any of the above embodiments.
According to the technical scheme provided by the embodiment, the PCA algorithm is used to estimate the main and secondary directions with the largest distribution variance of the target scattering points, and the geometric structure features and the scattering features of the target are estimated along these main and secondary directions. This better reflects the intrinsic structural and scattering characteristics of the target, improves the robustness of the classification algorithm with respect to the target contour and attitude, and improves the recognition performance of the algorithm.
The effect of the present invention is further illustrated by the following experiment on measured data:
1. Experimental scene:
The measured data comprise five debris targets and eight satellite targets; the ISAR images of the satellite targets include images in which the satellite solar panels are observed and images in which they are not, and the ISAR images of each target cover at least two passes (target ISAR images obtained at different times).
2. Experimental contents:
Scattering point detection is performed on the target ISAR image to obtain the intensity information and the position information of the target scattering points; the main and secondary directions of the target are then calculated by PCA, and the geometric structure features and the scattering distribution features of the target are extracted according to these directions. The feature vectors of the training data are selected to train an SVM classifier, obtaining the SVM classifier parameters and support vectors; the test data are then input into the trained SVM classifier, and the target class information is obtained from the classifier output.
The reader should understand that in the description of this specification, reference to the description of the terms "one embodiment," "some embodiments," "an example," "a specific example," or "some examples," etc., means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the invention. In this specification, the schematic representations of the terms used above are not necessarily intended to refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. Furthermore, various embodiments or examples and features of different embodiments or examples described in this specification can be combined and combined by one skilled in the art without contradiction.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, a division of a unit is merely a logical division, and an actual implementation may have another division, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed.
Units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment of the present invention.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present invention essentially or partially contributes to the prior art, or all or part of the technical solution can be embodied in the form of a software product stored in a storage medium and including instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
It should also be understood that, in the embodiments of the present invention, the sequence numbers of the above-mentioned processes do not mean the execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation on the implementation process of the embodiments of the present invention.
While the invention has been described with reference to specific embodiments, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the invention as defined by the appended claims. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.

Claims (9)

1. An inverse synthetic aperture radar spatial target classification method, the method comprising:
according to the collected target ISAR image, determining the intensity information and the position information of a target scattering point in the target ISAR image;
determining feature information of the target ISAR image according to the position information and a preset distribution rule, wherein the feature information comprises: geometrical structure characteristic information and scattering distribution characteristic information;
determining a feature vector according to the intensity information, the geometric structure feature information and the scattering distribution feature information;
determining the category of a space target corresponding to the target ISAR image according to the feature vector and a preset classifier;
the determining, according to the location information and a preset distribution rule, feature information of the target ISAR image specifically includes:
determining the primary direction information and the secondary direction information of the target scattering point according to the position information and a PCA algorithm;
determining main direction distribution length information of the main direction information and secondary direction distribution length information of the secondary direction information according to a histogram method;
determining distribution entropy information of the target scattering point in the main direction according to the main direction information and a preset calculation formula, and determining distribution entropy information of the target scattering point in the secondary direction according to the secondary direction information and the calculation formula;
determining the geometrical structure characteristic information according to the main direction distribution length information and the secondary direction distribution length information;
and determining the scattering distribution characteristic information according to the distribution entropy information of the main direction and the distribution entropy information of the secondary direction.
2. The method according to claim 1, wherein the determining the intensity information and the position information of the target scattering point of the target ISAR image according to the acquired target ISAR image specifically comprises:
determining the noise intensity of an initial target scattering point according to the target ISAR image;
selecting the initial target scattering points with the noise intensity larger than a preset detection threshold from the initial target scattering points as the target scattering points;
and determining the intensity information and the position information according to the target scattering points.
3. The inverse synthetic aperture radar spatial target classification method of claim 2, the method further comprising:
determining the detection threshold according to formula 1, formula 1:
thr=mean(I)·C
wherein mean () is an averaging operation, I is the target ISAR image, and C is a preset constant greater than 0.
4. The method according to claim 1, wherein the determining the primary direction information and the secondary direction information of the scattering point of the target according to the position information and a PCA algorithm specifically comprises:
determining a position covariance matrix of the scattering points of the target according to equation 2, equation 2:
Σ = (1/N) ∑_{i=1}^{N} (x_i − μ)(x_i − μ)^T
μ = (1/N) ∑_{i=1}^{N} x_i
wherein N is the number of target scattering points, x_i represents the column vector corresponding to the position information of the i-th target scattering point, μ represents the average column vector of the position information of all target scattering points, and (·)^T represents vector transposition;
performing eigenvalue decomposition on the position covariance matrix according to formula 3 to obtain the eigenvalues and eigenvectors of the position covariance matrix, formula 3:
Σ = UΛU^T
wherein Λ is a diagonal matrix whose R-th diagonal value is the R-th eigenvalue λ_R of the position covariance matrix Σ; the R-th column of U is the eigenvector corresponding to the R-th eigenvalue λ_R; R is a positive integer ranging from 1 to the dimension of the position covariance matrix.
5. The method according to claim 4, wherein the determining the primary direction distribution length information of the primary direction information and the secondary direction distribution length information of the secondary direction information according to a histogram method specifically includes:
obtaining the projection of the target scattering points in the main direction u_1 according to formula 4, formula 4:
u'_1 = X^T · u_1
wherein X is the position matrix of the target scattering points, i.e., the position matrix from which the position covariance matrix is computed;
performing uniform discrete gridding on the continuous distribution interval of u'_1 and counting the number of target scattering points contained in each grid to obtain the distribution f_1 of the target scattering points in the main direction, wherein ind_1m denotes the projection values, in the main direction, of the target scattering points whose grid satisfies f_1m > c·N (N being the total number of target scattering points and c a preset constant), f_1m is the m-th element of the distribution f_1, m = 1, 2, …, M_1, and M_1 is the number of discrete grids in the main direction; determining the length L_1 corresponding to the distribution length information of the target scattering points in the main direction according to formula 5, formula 5:
L_1 = max(ind_1m) − min(ind_1m)
wherein max(·) represents taking the maximum value and min(·) represents taking the minimum value;
obtaining the projection of the target scattering points in the secondary direction u_2 according to formula 6, formula 6:
u'_2 = X^T · u_2
wherein X is the position matrix of the target scattering points, i.e., the position matrix from which the position covariance matrix is computed;
performing uniform discrete gridding on the continuous distribution interval of u'_2 and counting the number of target scattering points contained in each grid to obtain the distribution f_2 of the target scattering points in the secondary direction, wherein ind_2n denotes the projection values, in the secondary direction, of the target scattering points whose grid satisfies f_2n > c·N, f_2n is the n-th element of the distribution f_2, n = 1, 2, …, M_2, and M_2 is the number of discrete grids in the secondary direction; determining the length L_2 corresponding to the distribution length information of the target scattering points in the secondary direction according to formula 7, formula 7:
L_2 = max(ind_2n) − min(ind_2n)
wherein max(·) represents taking the maximum value and min(·) represents taking the minimum value.
6. The method according to claim 5, wherein the determining distribution entropy information of the scattering points of the target in the primary direction according to the primary direction information and a preset calculation formula, and determining distribution entropy information of the scattering points of the target in the secondary direction according to the secondary direction information and the calculation formula specifically includes:
determining the distribution entropy E_1 corresponding to the distribution entropy information of the target scattering points in the main direction according to formula 8, formula 8:
E_1 = −∑_{k=1}^{K_1} p_1k · log(p_1k)
p_1k = f'_1k / ∑_{k=1}^{K_1} f'_1k
wherein K_1 = L_1/Δ is the number of discrete grids in the main direction, Δ is the grid spacing determined from Δ_1 and Δ_2 (the defining equation image is not reproduced here), Δ_1 is the range resolution corresponding to the target scattering points, Δ_2 is the azimuth resolution of the target scattering points, f'_1 is the distribution of the target scattering points in the main direction obtained by performing uniform discrete gridding on the continuous distribution interval of u'_1 and counting the number of target scattering points contained in each grid, and f'_1k is the k-th element of f'_1;
determining the distribution entropy E_2 corresponding to the distribution entropy information of the target scattering points in the secondary direction according to formula 9, formula 9:
E_2 = −∑_{k=1}^{K_2} p_2k · log(p_2k)
p_2k = f'_2k / ∑_{k=1}^{K_2} f'_2k
wherein K_2 = L_2/Δ is the number of discrete grids in the secondary direction, Δ_1 and Δ_2 are as defined above, f'_2 is the distribution of the target scattering points in the secondary direction obtained by performing uniform discrete gridding on the continuous distribution interval of u'_2 and counting the number of target scattering points contained in each grid, and f'_2k is the k-th element of f'_2.
7. The inverse synthetic aperture radar spatial target classification method of any one of claims 1-6, wherein before the determining the intensity information and the position information of the target scattering points of the target ISAR image from the acquired target ISAR image, the method further comprises:
determining a set of ISAR image samples from the plurality of ISAR images;
selecting at least one vector in the feature vectors of each ISAR image in the sample set to obtain a support vector and a weight coefficient corresponding to the support vector;
and obtaining the classifier according to the support vector and the weight coefficient.
8. The method according to any one of claims 1 to 6, wherein the determining a spatial target class corresponding to the target ISAR image according to the feature vector and a preset classifier specifically includes:
inputting the feature vector into the classifier;
the classifier outputs a spatial target class label corresponding to the target ISAR image according to equation 10, equation 10:
[equation 10: equation image not reproduced]
wherein ||·|| denotes the norm operation, α is a parameter of the classifier, V' is the feature vector of the target ISAR image, V_j is the j-th support vector selected from the feature vectors of the training samples, and ω_j is the weight coefficient corresponding to the j-th support vector;
determining the space target class according to equation 11, equation 11:
C = sign(y(V'))
wherein sign(·) takes the sign of its argument; when C is 1, the space target class is a satellite target, and when C is −1, the space target class is space debris.
9. An inverse synthetic aperture radar spatial target classification system, the system comprising: a memory, a processor, and a computer program stored on the memory and executable on the processor, wherein,
the processor, when executing the program, implements the method of any of claims 1-8.
CN201711129995.1A 2017-11-15 2017-11-15 Inverse synthetic aperture radar space target classification method and system Active CN107871123B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201711129995.1A CN107871123B (en) 2017-11-15 2017-11-15 Inverse synthetic aperture radar space target classification method and system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201711129995.1A CN107871123B (en) 2017-11-15 2017-11-15 Inverse synthetic aperture radar space target classification method and system

Publications (2)

Publication Number Publication Date
CN107871123A CN107871123A (en) 2018-04-03
CN107871123B true CN107871123B (en) 2020-06-05

Family

ID=61754038

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201711129995.1A Active CN107871123B (en) 2017-11-15 2017-11-15 Inverse synthetic aperture radar space target classification method and system

Country Status (1)

Country Link
CN (1) CN107871123B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112069651B (en) * 2020-07-23 2024-04-09 西安空间无线电技术研究所 A method for estimating the rotation axis of a spin-stabilized target based on ISAR imaging
CN112949555B (en) * 2021-03-17 2023-03-24 西安电子科技大学 Spatial target ISAR image classification method based on target prior information

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5057843A (en) * 1990-06-25 1991-10-15 The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration Method for providing a polarization filter for processing synthetic aperture radar image data
US7439906B1 (en) * 2007-01-25 2008-10-21 The United States Of America As Represented By The Secretary Of The Navy Doppler-compensated radar pulse compression processing system and method
EP2138956A1 (en) * 2008-06-23 2009-12-30 Raytheon Company Adaptive match metric selection for automatic target recognition
CN104331707A (en) * 2014-11-02 2015-02-04 西安电子科技大学 Polarized SAR (synthetic aperture radar) image classification method based on depth PCA (principal component analysis) network and SVM (support vector machine)
CN105930846A (en) * 2016-04-05 2016-09-07 西安电子科技大学 Neighborhood information and SVGDL (support vector guide dictionary learning)-based polarimetric SAR image classification method
CN106597400A (en) * 2016-11-15 2017-04-26 北京无线电测量研究所 Ground moving vehicle target classification and recognition method and system based on high-resolution distance image
CN106874889A (en) * 2017-03-14 2017-06-20 西安电子科技大学 Multiple features fusion SAR target discrimination methods based on convolutional neural networks
CN106919919A (en) * 2017-02-28 2017-07-04 西安电子科技大学 A kind of SAR target discrimination methods based on multiple features fusion word bag model
CN107330457A (en) * 2017-06-23 2017-11-07 电子科技大学 A kind of Classification of Polarimetric SAR Image method based on multi-feature fusion

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6646593B1 (en) * 2002-01-08 2003-11-11 Science Applications International Corporation Process for mapping multiple-bounce ghosting artifacts from radar imaging data
US20150316646A1 (en) * 2014-05-01 2015-11-05 Utah State University Synthetic aperture radar target modeling

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5057843A (en) * 1990-06-25 1991-10-15 The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration Method for providing a polarization filter for processing synthetic aperture radar image data
US7439906B1 (en) * 2007-01-25 2008-10-21 The United States Of America As Represented By The Secretary Of The Navy Doppler-compensated radar pulse compression processing system and method
EP2138956A1 (en) * 2008-06-23 2009-12-30 Raytheon Company Adaptive match metric selection for automatic target recognition
CN104331707A (en) * 2014-11-02 2015-02-04 西安电子科技大学 Polarized SAR (synthetic aperture radar) image classification method based on depth PCA (principal component analysis) network and SVM (support vector machine)
CN105930846A (en) * 2016-04-05 2016-09-07 西安电子科技大学 Neighborhood information and SVGDL (support vector guide dictionary learning)-based polarimetric SAR image classification method
CN106597400A (en) * 2016-11-15 2017-04-26 北京无线电测量研究所 Ground moving vehicle target classification and recognition method and system based on high-resolution distance image
CN106919919A (en) * 2017-02-28 2017-07-04 西安电子科技大学 A kind of SAR target discrimination methods based on multiple features fusion word bag model
CN106874889A (en) * 2017-03-14 2017-06-20 西安电子科技大学 Multiple features fusion SAR target discrimination methods based on convolutional neural networks
CN107330457A (en) * 2017-06-23 2017-11-07 电子科技大学 A kind of Classification of Polarimetric SAR Image method based on multi-feature fusion

Non-Patent Citations (11)

* Cited by examiner, † Cited by third party
Title
Classification of objects in ISAR imagery using artificial neural networks; Thomas Fechner et al.; Applications and Science of Artificial Neural Networks; 1996-03-22; pp. 339-345 *
Imaging of multitargets with ISAR based on the time-frequency distribution; A. Wang et al.; Proceedings of ICASSP '94; 2002-08-06; pp. 173-176 *
Research on Feature Extraction of ISAR Images and Its Applications; Xu Zhiwei; China Master's Theses Full-text Database; 2016-03-15; Vol. 2016, No. 03; p. I136-2542 *
Radar HRRP Target Recognition based on K-SVD Algorithm; Bo Feng et al.; Proceedings of 2011 IEEE CIE International Conference on Radar; 2012-03-01; pp. 642-645 *
Research Progress on Ship Classification and Recognition in SAR Images; Wu Fan et al.; Remote Sensing Technology and Application; 2014-02-28; Vol. 29, No. 1; see pp. 1-4 *
A SAR Target Attribute Feature Extraction Algorithm; Li Fei et al.; Journal of Xidian University; 2015-06-30; Vol. 42, No. 3; pp. 15-21, 89 *
Research on Feature Extraction Methods for Ship Targets Based on ISAR Images; Hai Hongzhang; China Master's Theses Full-text Database, Engineering Science and Technology II; 2016-03-15; Vol. 2016, No. 3; p. C032-17 *
SVM Classification Algorithm for Aircraft Targets Based on Super-Resolution ISAR Imaging; Wang Fengchao et al.; Journal of Air Force Engineering University; 2009-06-30; Vol. 10, No. 3; see pp. 21-25 *
Automatic Recognition of Aerial Targets Based on Super-Resolution ISAR Imaging; Xu Rencan et al.; Systems Engineering and Electronics; 2006-01-31; Vol. 28, No. 1; pp. 46-48 *
Space Target Feature Extraction and Recognition Technology; Ma Jing; China Master's Theses Full-text Database, Information Science and Technology; 2010-01-15; Vol. 2011, No. 1; p. I136-506 *
Research on Target Feature Extraction Methods for Radar Images; Li Fei; China Doctoral Dissertations Full-text Database, Information Science and Technology; 2014-10-15; Vol. 2014, No. 10; p. I136-143 *

Also Published As

Publication number Publication date
CN107871123A (en) 2018-04-03

Similar Documents

Publication Publication Date Title
Cui et al. Image data augmentation for SAR sensor via generative adversarial nets
Clemente et al. Automatic target recognition of military vehicles with Krawtchouk moments
Wang et al. Application of deep-learning algorithms to MSTAR data
CN107895139B (en) SAR image target identification method based on multi-feature fusion
CN105913074B (en) Based on amplitude and the united SAR image moving-target clustering method of radial velocity
KR101891410B1 (en) Apparatus for classifying target of inverse synthetic aperture radar image using trace transform and method thereof
Khan et al. A customized Gabor filter for unsupervised color image segmentation
Song et al. An effective image reconstruction enhancement method with convolutional reweighting for near-field SAR
Fein-Ashley et al. Benchmarking deep learning classifiers for SAR automatic target recognition
CN107871123B (en) Inverse synthetic aperture radar space target classification method and system
Zhang et al. Improved SLIC superpixel generation algorithm and its application in polarimetric SAR images classification
CN109658340B (en) Fast denoising method of SAR image based on RSVD and histogram preservation
Wang et al. Attributed scattering center guided network based on omnidirectional sub-aperture division for SAR target detection
Song et al. Shape-robust SAR ship detection via context-preserving augmentation and deep contrastive RoI learning
Liu et al. Target detection in remote sensing image based on saliency computation of spiking neural network
CN119206530B (en) Dynamic target identification method, device, equipment and medium for remote sensing image
Macumber et al. Hierarchical closely spaced object (CSO) resolution for IR sensor surveillance
Yin et al. Coarse-to-fine ship detection using visual saliency fusion and feature encoding for optical satellite images
Mishra et al. Utilizing Super-Resolution for Enhanced Automotive Radar Object Detection
Mutreja et al. Comparative Assessment of Different Deep Learning Models for Aircraft Detection
El Hasnaouy et al. Comparison of Feature Extraction Methods for Automated Target Recognition by Reducing Speckle Noise in SAR Data
Gao et al. Hierarchical Feature‐Based Detection Method for SAR Targets Under Complex Environment
Hu et al. Multi-view SAR Target Recognition Using Bidirectional Conv-LSTM Network
Namuduri et al. Image metrics for clutter characterization
Manoharan et al. A Novel Framework for Classifying Remote Sensing Images using Convolutional Neural Networks

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20230616

Address after: 59F, 50 Yongding Road, Haidian District, Beijing 100854

Patentee after: BEIJING INSTITUTE OF RADIO MEASUREMENT

Patentee after: Pinghu Space Perception Laboratory Technology Co.,Ltd.

Address before: 100854 32nd floor, 50 Yongding Road, Haidian District, Beijing

Patentee before: BEIJING INSTITUTE OF RADIO MEASUREMENT

TR01 Transfer of patent right