
CN113657329B - Classification recognition method and device for targets and terminal equipment - Google Patents

Classification recognition method and device for targets and terminal equipment

Info

Publication number
CN113657329B
CN113657329B
Authority
CN
China
Prior art keywords
classification
characteristic value
target
vehicle
classifying
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110976784.1A
Other languages
Chinese (zh)
Other versions
CN113657329A
Inventor
刘博
高肖肖
李雪
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Xi'an Tianhe Defense Technology Co ltd
Original Assignee
Xi'an Tianhe Defense Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Xi'an Tianhe Defense Technology Co ltd filed Critical Xi'an Tianhe Defense Technology Co ltd
Priority to CN202110976784.1A priority Critical patent/CN113657329B/en
Publication of CN113657329A publication Critical patent/CN113657329A/en
Application granted granted Critical
Publication of CN113657329B publication Critical patent/CN113657329B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/24Classification techniques
    • G06F18/241Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G06F18/2411Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on the proximity to a decision surface, e.g. support vector machines

Landscapes

  • Engineering & Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Radar Systems Or Details Thereof (AREA)

Abstract


The present application provides a method, apparatus, and terminal device for classifying and identifying targets, applicable to the field of data processing technology. The method comprises: extracting characteristic values from target data collected by radar, the characteristic values including at least a first characteristic value and a second characteristic value; performing a first classification of the target according to the first characteristic value; performing a second classification of the target according to the second characteristic value; and combining the results of at least two classifications to identify the category of the target. The present invention can effectively improve the accuracy of target classification and identification by extracting two or more characteristic values and combining the two or more characteristic values for target classification and identification.

Description

Classification recognition method and device for targets and terminal equipment
Technical Field
The application belongs to the technical field of data processing, and particularly relates to a target classification and identification method, a target classification and identification device and terminal equipment.
Background
Radar target recognition, as an important measure of modern radar capability, has long attracted attention and has made steady progress; automatic radar target recognition has become the development direction of future radars. Narrow-band radar has a long detection range and is therefore widely used for ground target identification, but the accuracy of existing ground target classification and identification is relatively low.
Disclosure of Invention
In view of this, embodiments of the present application provide a method, an apparatus, and a terminal device for classifying and identifying targets, so as to improve the accuracy of target classification and identification.
A first aspect of an embodiment of the present application provides a method for classifying and identifying objects, including:
Extracting a characteristic value from target data acquired by a radar, wherein the characteristic value at least comprises a first characteristic value and a second characteristic value;
classifying the targets for the first time according to the first characteristic value;
classifying the targets for the second time according to the second characteristic value;
and combining the results of at least two classifications, and identifying the classification of the target.
In a possible implementation manner of the first aspect, the first characteristic value is a speed characteristic value;
The first classifying the target according to the first characteristic value includes:
When the speed characteristic value is not greater than a first preset value, the target is classified as uncertain for the first time;
combining the results of at least two classifications, identifying the class of the target, including:
when the first classification is the vehicle and the second classification is the vehicle, the classification of the target is identified as the vehicle;
when the first classification is uncertain and the second classification is human, then the category of the target is identified as human.
It should be understood that a first classification of "uncertain" means the target may be either a vehicle or a person.
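As an illustration only, the first classification by speed and the fusion of the first two classification results described above can be sketched as follows; the speed threshold and the class labels are hypothetical placeholders, not values taken from the claims.

```python
# Hypothetical sketch of the two-stage decision fusion. SPEED_MIN stands
# in for the "first preset value"; its numeric value is an assumption.

SPEED_MIN = 2.0  # m/s, hypothetical first preset value


def first_classification(speed):
    """Classify by speed: slow targets are 'uncertain' (vehicle or person)."""
    return "vehicle" if speed > SPEED_MIN else "uncertain"


def fuse(first, second):
    """Combine the first and second classification results."""
    if first == "vehicle" and second == "vehicle":
        return "vehicle"
    if first == "uncertain" and second == "person":
        return "person"
    return "undecided"  # resolved by the further schemes in the description
```

The "undecided" outcome corresponds to the cases the description resolves with a further limiting condition or a third classification.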
In a possible implementation manner of the first aspect, when the first classification is uncertain and the second classification is not human, the method further includes:
And when the speed characteristic value and the second characteristic value are both within a further limiting condition, identifying the category of the target as a vehicle, otherwise, identifying the category of the target as unknown.
In a possible implementation manner of the first aspect, the feature value further includes a third feature value, and the method further includes:
When the first classification is a vehicle and the second classification is not a vehicle, or when the first classification is uncertain, the second classification is not a person, and the speed characteristic value and the second characteristic value are not both within the further limiting condition, performing a third classification on the target according to the third characteristic value;
combining the results of at least two classifications, identifying the class of the target, including:
and when the first classification is uncertain, the second classification is not a person, and the speed characteristic value and the second characteristic value are not both within the further limiting condition, then if the third classification is a person, identifying the target as a person; otherwise, identifying the category of the target as unknown.
In a possible implementation manner of the first aspect, one of the RCS characteristic value and the Doppler spectrum entropy characteristic value is taken as the second characteristic value, and the other is taken as the third characteristic value.
In one possible implementation manner of the first aspect,
Said classifying said target a second time according to said second eigenvalue comprises:
classifying the second characteristic value of the target for the second time through an SVM classifier;
and classifying the target for the third time according to the third characteristic value, including:
and classifying the third characteristic value of the target for the third time through the SVM classifier.
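The second and third classifications above rely on an SVM classifier. As a minimal sketch, a trained linear SVM reduces to the decision rule sign(w·x + b); the weights and bias below are made-up placeholders for illustration, not a model trained on radar data.

```python
# Minimal sketch of the linear-SVM decision rule used for the second and
# third classifications. w and b are hypothetical placeholder parameters.

def svm_decide(x, w, b):
    """Return 'vehicle' if the decision value w·x + b is positive, else 'person'."""
    score = sum(wi * xi for wi, xi in zip(w, x)) + b
    return "vehicle" if score > 0 else "person"


# Hypothetical 1-D feature (e.g. a relative RCS value):
w, b = [0.5], -10.0
print(svm_decide([30.0], w, b))  # → vehicle
```

In practice the parameters would come from training an SVM (e.g. with a standard machine-learning library) on labeled radar returns.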
In a possible implementation manner of the first aspect, the feature value further includes a third feature value, and the method further includes:
when the first classification is a car and the second classification is not a car, or when the first classification is uncertain and the second classification is not a person, performing third classification on the target according to the third characteristic value;
combining the results of at least two classifications, identifying the class of the target, including:
When the first classification is a car and the second classification is not a car, if the third classification is a car, identifying the class of the target as a car, otherwise, identifying the class of the target as unknown; and when the first classification is uncertain and the second classification is not human, if the third classification is human, identifying the target as human, otherwise, identifying the category of the target as unknown.
It should be understood that the embodiment of the application is not only suitable for classifying and identifying people and vehicles, but also can be extended to classifying and identifying animals and vehicles.
A second aspect of an embodiment of the present application provides a classification and identification device for an object, including:
The extraction module is used for extracting characteristic values from target data acquired by the radar, and the characteristic values at least comprise a first characteristic value and a second characteristic value;
The classification recognition module is used for classifying the target for the first time according to the first characteristic value, classifying the target for the second time according to the second characteristic value, and recognizing the class of the target by combining the results of at least two classifications.
A third aspect of an embodiment of the present application provides a terminal device comprising a memory and a processor, the memory storing a computer program executable on the processor, wherein the processor, when executing the computer program, implements the steps of the classification and identification method of the object according to any one of the first aspect.
A fourth aspect of an embodiment of the present application provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements the steps of the classification method of the object according to any of the first aspect.
A fifth aspect of an embodiment of the present application provides a computer program product for, when run on a terminal device, causing the terminal device to perform the steps of the classification and identification method of any of the objects of the first aspect above.
Compared with the prior art, the beneficial effect of the embodiments is that target classification and recognition are performed by extracting two or more characteristic values and combining them, which can effectively improve the accuracy of target classification and recognition.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are needed in the embodiments or the description of the prior art will be briefly described below, it being obvious that the drawings in the following description are only some embodiments of the present application, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a schematic flow chart of an implementation of a classification and identification method of an object according to an embodiment of the present application;
FIG. 2 is a schematic diagram of relative RCS features for a vehicle and a person provided by an embodiment of the present application;
FIG. 3.1 is a Doppler spectrum plot of a vehicle provided by an embodiment of the present application;
FIG. 3.2 is a Doppler spectrum plot of a person provided by an embodiment of the present application;
FIG. 3.3 is a graph showing Doppler spectrum entropy values of vehicles and people according to an embodiment of the present application;
FIG. 3.4 is a graph of instantaneous frequency of a vehicle provided by an embodiment of the present application;
FIG. 3.5 is a graph of the instantaneous frequency of a person provided by an embodiment of the present application;
fig. 3.6 is a schematic diagram of an SVM classifier according to an embodiment of the present application;
Fig. 4 is a schematic implementation flow diagram of the classification and identification method of the target in an application scenario according to the embodiment of the present application;
FIG. 5.1 is a schematic diagram of a classification and identification method of targets according to an embodiment of the present application in a distribution of RCS features of vehicles and people as a result of an experiment;
FIG. 5.2 is a graph of relative RCS distribution of vehicles and people at an experimental result for a method for classifying and identifying objects provided by an embodiment of the present application;
Fig. 5.3 is a schematic implementation flow diagram of a classification recognition method of a target in an experiment according to an embodiment of the present application;
FIG. 6.1 is a schematic diagram of a distribution of Doppler spectrum entropy features of vehicles and people in an experimental result according to a classification and identification method of targets provided by an embodiment of the present application;
FIG. 6.2 is a schematic diagram of an implementation flow of the classification and identification method of the object in an experiment according to the embodiment of the present application;
Fig. 7 is a schematic diagram of a terminal device according to an embodiment of the present application;
fig. 8 is a schematic diagram of a classification recognition device for objects according to an embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth such as the particular system architecture, techniques, etc., in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
In order to illustrate the technical scheme of the application, the following description is made by specific examples.
Fig. 1 shows a flowchart of an implementation of a method for classifying and identifying objects according to an embodiment of the present application, where the method for classifying and identifying objects in fig. 1 includes steps 11 to 14.
Step 11, extracting a characteristic value from target data acquired by a radar, wherein the characteristic value at least comprises a first characteristic value and a second characteristic value;
step 12, classifying the targets for the first time according to the first characteristic value;
Step 13, classifying the targets for the second time according to the second characteristic value;
And 14, combining the results of at least two classifications, and identifying the classification of the target.
In the embodiment of the present application, the extracted feature values may include, but are not limited to, a velocity feature value, an RCS feature value, and a doppler spectrum entropy feature value.
Preferably, the first characteristic value is a speed characteristic value, because when the ground conditions are relatively complex, using the speed characteristic value as the first determination element yields higher accuracy in identifying the target class.
Complex ground conditions include both constant-speed and variable-speed moving objects in the ground environment.
Taking the classification of people and vehicles as an example: careful observation shows that the movement speed of a person's limbs when walking is not uniform; rather, the limb speed changes periodically within each step. By contrast, the speed of a vehicle-type target in motion remains basically constant, which is equivalent to uniform motion.
When an SVM on the relative RCS feature alone, or an SVM on the spectrum entropy feature alone, is used to distinguish uniform-speed from variable-speed moving targets, the misjudgment rate is relatively large.
For these reasons, using the speed characteristic value as the first determination element yields higher accuracy in identifying the target category.
Preferably, the second characteristic value is the RCS characteristic value, because when the first characteristic value is the speed characteristic value, taking the RCS characteristic value as the second characteristic value yields higher target class identification accuracy than taking the Doppler spectrum entropy characteristic value as the second characteristic value.
A third characteristic value and a third classification will be introduced later; for the same reasons as above, when there are three classifications, it is preferable that the first characteristic value is the speed characteristic value, the second the RCS characteristic value, and the third the Doppler spectrum entropy characteristic value.
In the embodiment of the application, the RCS characteristic value is a characteristic value based on the RCS (the RCS itself or a value derived from it); the speed characteristic value is based on the speed (the speed itself or a value derived from it); and the Doppler spectrum entropy characteristic value is based on the Doppler spectrum entropy (the entropy itself or a value derived from it).
In the embodiment of the application, the first characteristic value is a speed characteristic value;
The first classifying the target according to the first characteristic value includes:
When the speed characteristic value is not greater than a first preset value, the target is classified as uncertain for the first time;
combining the results of at least two classifications, identifying the class of the target, including:
when the first classification is the vehicle and the second classification is the vehicle, the classification of the target is identified as the vehicle;
when the first classification is uncertain and the second classification is human, then the category of the target is identified as human.
Here, an uncertain first classification means the target may be either a vehicle or a person.
Using the speed characteristic value as the first determination element improves the accuracy of identifying the target class. The final category of the target is identified by comparing the speed characteristic value with the first preset value and combining that result with the result of the second classification.
When the category of the target still cannot be identified by combining the results of the first and second classifications, the category can be identified by the following scheme:
In the embodiment of the application, when the first classification is uncertain and the second classification is not human, the method further comprises the following steps:
And when the speed characteristic value and the second characteristic value are both within a further limiting condition, identifying the category of the target as a vehicle, otherwise, identifying the category of the target as unknown.
For example, further limiting conditions are given in both fig. 5.3 and fig. 6.2. In fig. 5.3, the further limiting condition is that the speed characteristic value is not less than 2 m/s and the RCS is not less than 224; when both the speed characteristic value and the second characteristic value are within this condition, the category of the target is identified as a vehicle, otherwise as unknown. Similarly, in fig. 6.2 the further limiting condition is that the entropy is not greater than 0.9 and the speed is not less than 2 m/s; when both values are within this condition, the category of the target is identified as a vehicle, otherwise as unknown.
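The threshold checks read off fig. 5.3 and fig. 6.2 can be sketched as follows; in practice these thresholds would be tuned to the specific radar and targets being identified.

```python
# Sketch of the "further limiting condition" illustrated in fig. 5.3
# (speed >= 2 m/s and relative RCS >= 224) and fig. 6.2
# (speed >= 2 m/s and spectrum entropy <= 0.9). Thresholds are the
# example values from the figures, not universal constants.

def further_condition_rcs(speed, rcs):
    return speed >= 2.0 and rcs >= 224.0


def further_condition_entropy(speed, entropy):
    return speed >= 2.0 and entropy <= 0.9


def resolve_uncertain(speed, rcs):
    """For an 'uncertain, not-person' target: vehicle if both values
    satisfy the further condition, otherwise unknown."""
    return "vehicle" if further_condition_rcs(speed, rcs) else "unknown"
```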
In practical application, the further limiting condition can be set according to the specific identified target.
When the category of the target cannot be identified by combining the results of the first and second classifications, the following scheme may also be used:
in an embodiment of the present application, the feature value further includes a third feature value, and the method further includes:
When the first classification is a car, the second classification is not a car, or when the first classification is uncertain, the second classification is not a person, the speed characteristic value and the second characteristic value are not both within further limiting conditions, performing third classification on the target according to the third characteristic value;
combining the results of at least two classifications, identifying the class of the target, including:
And when the first classification is uncertain, the second classification is not a person, the speed characteristic value and the second characteristic value are not in further limiting conditions, if the third classification is a person, the target is identified as a person, otherwise the class of the target is identified as unknown.
In the embodiment of the application, one of the RCS characteristic value and the Doppler spectrum entropy characteristic value is taken as the second characteristic value, and the other is taken as the third characteristic value. Preferably, the second eigenvalue is an RCS eigenvalue, the third eigenvalue is a doppler spectrum entropy eigenvalue, and the recognition accuracy is higher.
In an embodiment of the present application, the classifying the target for the second time according to the second feature value includes:
classifying the second characteristic value of the target for the second time through an SVM classifier;
and classifying the target for the third time according to the third characteristic value, including:
and classifying the third characteristic value of the target for the third time through the SVM classifier.
When the category of the target cannot be identified by combining the results of the first and second classifications, the following scheme may also be used:
in an embodiment of the present application, the feature value further includes a third feature value, and the method further includes:
when the first classification is a car and the second classification is not a car, or when the first classification is uncertain and the second classification is not a person, performing third classification on the target according to the third characteristic value;
combining the results of at least two classifications, identifying the class of the target, including:
When the first classification is a car and the second classification is not a car, if the third classification is a car, identifying the class of the target as a car, otherwise, identifying the class of the target as unknown; and when the first classification is uncertain and the second classification is not human, if the third classification is human, identifying the target as human, otherwise, identifying the category of the target as unknown.
This scheme omits the step of judging whether the speed characteristic value and the second characteristic value are both within the further limiting condition, and instead identifies the target directly through the third classification, simplifying the procedure by one step.
In the embodiment of the application, each classification can be performed according to some criterion, and the criteria can vary: for example, judging the magnitude or range of certain parameters, or classifying with an SVM classifier. Specifically, in the embodiment of the application, the first classification can be performed by judging whether the speed characteristic value is greater than the first preset value, the second and third classifications can be performed by an SVM classifier, and the further classification can be performed by judging whether the speed characteristic value and the second characteristic value are both within the further limiting condition.
Therefore, in the embodiment of the application, at least two characteristic values are extracted, at least two classifications are performed, and their results are combined to identify the category of the target, which can effectively improve the accuracy of classification and identification. A third characteristic value can also be added for a third classification of the target, whose result serves as an additional confirmation, improving the accuracy further.
In the embodiment of the present application, for the case of classification and identification of people and vehicles, preferably, the first feature value is a speed feature value, the second feature value is an RCS feature value, and the third feature value is a doppler spectrum entropy feature value.
By taking the speed characteristic value as a classification element of the first classification, the RCS characteristic value or the Doppler spectrum entropy characteristic value can effectively improve the accuracy of classification and identification of the target.
The speed characteristic value is used as a classification element of the first classification, and the RCS characteristic value and the Doppler spectrum entropy characteristic value are combined, so that the accuracy of classification and identification of the target can be improved more effectively.
For better understanding of the solution of the embodiment of the present application, the feature values extracted in step 11 are illustrated as follows:
1. RCS eigenvalues:
the radar cross-sectional area (Radar Cross Section, RCS) is a physical quantity that characterizes the backscattering ability of an object to an incident electromagnetic wave, and different types of objects can be identified using radar cross-sectional area features.
Let the radar transmitter power be P_t, the antenna gain G, the target-to-radar distance R, the effective receiving area of the antenna A_e, and the radar cross-sectional area of the target σ. The echo power P_r received at the radar is then:

P_r = P_t·G·A_e·σ / ((4π)^2·R^4)   (2.1)
Since the operating environment of the radar is unchanged while data are acquired, i.e., P_t, G, and A_e can be regarded as constants, let k = (4π)^2 / (P_t·G·A_e); then:

P_r = σ / (k·R^4)   (2.2)
the radar cross-sectional area σ of the target may be expressed as:
σ=kPrR4 (2.3)
From equation (2.3), the radar cross-sectional area is numerically related only to k, P_r, and R; since k is a constant, its effect is generally not considered in the calculation. Meanwhile, because σ is proportional to the echo power and to the fourth power of the distance R, the relative RCS value is very large and fluctuates over a wide range. To facilitate comparison of the relative RCS of different targets, the logarithm of the RCS is used as the characteristic value:

g_r = 10·log10(σ) = 10·log10(P_r·R^4)   (2.4)
The echo power P_r is calculated as follows:
Since the echo energy of the target may spread into adjacent range units, there are two general methods for calculating the echo power: first, taking the data of the range unit containing the maximum of the target spectrum; second, averaging the data of several adjacent range units according to how the target energy has spread. The first method is simple to compute but has larger calculation errors; the second can calculate the target power more accurately but is more complex. Preferably, the echo power is obtained using the first method.
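A minimal sketch of the two echo-power estimates described above, assuming the per-range-bin powers are already available as a list (the numbers are illustrative, not measured data):

```python
# Method 1: power of the single strongest range bin.
# Method 2: mean power over bins adjacent to the strongest one.

def echo_power_peak(power_per_bin):
    """Method 1: power of the range bin containing the spectral maximum."""
    return max(power_per_bin)


def echo_power_mean(power_per_bin, half_width=1):
    """Method 2: mean power over a window around the strongest bin."""
    i = power_per_bin.index(max(power_per_bin))
    lo = max(0, i - half_width)
    hi = min(len(power_per_bin), i + half_width + 1)
    window = power_per_bin[lo:hi]
    return sum(window) / len(window)


bins = [0.1, 0.3, 2.0, 0.8, 0.2]   # illustrative per-bin powers
print(echo_power_peak(bins))        # 2.0
print(echo_power_mean(bins))        # mean of [0.3, 2.0, 0.8] ≈ 1.033
```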
The calculation method of the target distance comprises the following steps:
After the target is detected by constant false alarm rate (CFAR) detection, the range gate of the target is obtained, and the actual distance is then calculated from the size of the range resolution unit of the narrow-band radar. Assuming the range resolution unit of the narrow-band radar is ΔR and the range gate of the target is R_d, the actual distance R of the target is:
R = R_d·ΔR   (2.5)
After the echo power and distance of the target are obtained, the relative RCS of the target can be calculated according to the formula (2.4).
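Equations (2.4) and (2.5) combine into a short computation; the range-gate index, range resolution, and echo power used below are illustrative values, not measured data.

```python
import math

# Relative RCS per equations (2.4)-(2.5): R = R_d * dR, then
# g_r = 10*log10(P_r * R^4). All numeric inputs here are hypothetical.

def relative_rcs(p_r, range_gate, delta_r):
    r = range_gate * delta_r                  # equation (2.5)
    return 10.0 * math.log10(p_r * r ** 4)    # equation (2.4)


# Hypothetical example: gate 40, 15 m resolution, echo power 1e-6 W
print(round(relative_rcs(1e-6, 40, 15.0), 2))  # 51.13
```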
Extraction results of the relative RCS based on measured data:
Taking the classification of people and vehicles as an example, based on two sets of outfield measured data from a radar of a certain model, the relative RCS features of 126 vehicle and person samples were extracted; the result is shown in fig. 2.
2. Doppler spectrum entropy eigenvalue
1. Analysis of Doppler spectrum of target
The spectrum analysis of the signal is to transform the signal from the time domain to the frequency domain, and some characteristic information in the signal closely related to the frequency can be found.
Signal spectrum analysis methods fall into two types: parametric model-based methods and non-parametric Fourier-transform-based methods. Parametric methods mainly include the maximum likelihood method, linear prediction frequency estimation, the AR model, and the Prony method. Their advantages are higher frequency resolution, the ability to distinguish frequency components very close to each other, no assumption of signal periodicity, and suitability for short data records; their disadvantages include the difficulty of determining the model order and a large computation load, since a process model must be established. Non-parametric Fourier-transform-based methods have the advantages of high computational efficiency and ease of implementation, but suffer from limited frequency resolution and spectral leakage.
Because model-based parametric methods are computationally expensive, the embodiment of the application preferably adopts a non-parametric Fourier-transform-based method to analyze the Doppler spectrum of the signal. Although this method has limited frequency resolution, the Doppler structure of the target is preferably analyzed mainly from the overall shape of the target's Doppler spectrum, so this limitation matters little.
The Doppler spectra of the vehicle and human targets are analyzed; the Doppler spectrum of the vehicle is shown in fig. 3.1 and that of the human target in fig. 3.2.
The Doppler spectrum of the vehicle is basically free of sidelobes and its waveform is very sharp, whereas the main lobe of the human Doppler spectrum is wider than that of the former and its sidelobes fluctuate considerably. Therefore, to extract feature information from the target Doppler spectrum, a mathematical method is needed that reflects the overall shape change of the spectrum waveform, whose computational cost is not too large, and that reduces the dimensionality of the spectrum waveform. In view of these factors and the physical meaning of entropy, the embodiment of the application preferably adopts the entropy value to describe the variation of the spectral waveform.
2. Extraction of target spectrum entropy value characteristics
Entropy is a very important concept in information theory; in statistics it is used to describe the degree of irregularity or uncertainty of a system. If a random signal has N possible values occurring with probabilities p1, p2, ..., pN, the entropy of the random signal is:

H = -∑(i=1..N) pi·log pi (2.6)

As is apparent from equation (2.6), entropy is non-negative and reaches its minimum value of zero only when the probability of occurrence of some value is 1.
Let xi (i = 1, 2, ..., M) denote the modulus of each point on the Doppler spectrum of the range gate where the target is located. First normalize xi to obtain:

pi = xi / ∑(j=1..M) xj (2.7)

Substituting formula (2.7) into formula (2.6) gives:

H = -∑(i=1..M) pi·log pi (2.8)
the spectral entropy value of the Doppler spectrum of the target can be obtained by the formula (2.8).
3. Extraction result of spectral entropy value
Taking people and vehicles as the classification targets, based on two outfield measurement campaigns of a radar of a certain model, the Doppler spectrum entropy values of 109 vehicles and 109 people were extracted; the results are shown in fig. 3.3. As the figure shows, the Doppler spectrum entropy features of vehicles and people exhibit little aliasing, so classification of vehicles and people is theoretically achievable.
3. Speed characteristic value
The extraction of features based on instantaneous frequency is described here as an example.
For radar echo signals, the instantaneous frequency is:
fi(k)=2vf0/c (2.10)
Wherein f 0 is the carrier frequency of the antenna transmitting signal, v is the radial speed of the target, and c is the speed of light.
As can be seen from equation (2.10), the instantaneous frequency is proportional to the target speed. When the target speed magnitude changes, the magnitude of the instantaneous frequency also changes.
Taking people and vehicles as the classification targets: as ordinary observation shows, the limbs of a walking person do not move at a uniform speed; rather, the limb speed changes periodically within each stride. A vehicle-type target, by contrast, keeps an essentially constant speed while moving, equivalent to uniform motion. The instantaneous frequency curve of the vehicle is shown in fig. 3.4, and that of the person in fig. 3.5.
Preferably, the standard deviation of the extracted waveform is used as the feature quantity; the error is small and the classification recognition rate is high. Let X = (x1, x2, ..., xN) be the values of the points on the target's instantaneous frequency waveform; the standard deviation is calculated as:

σ = sqrt( (1/N)·∑(i=1..N) (xi - x̄)² )

where x̄ is the mean value of X.
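The speed feature can be sketched as follows, using equation (2.10) and the standard deviation of the instantaneous-frequency waveform described above. The carrier frequency and the speed profiles are assumed values for illustration:

```python
import numpy as np

C = 3.0e8          # speed of light, m/s
F0 = 10.0e9        # carrier frequency, Hz (assumed X-band value)

def inst_freq(v_radial):
    """Instantaneous frequency from radial speed, equation (2.10): f = 2*v*f0/c."""
    return 2.0 * np.asarray(v_radial) * F0 / C

def speed_feature(v_track):
    """Standard deviation of the instantaneous-frequency waveform."""
    x = inst_freq(v_track)
    return np.sqrt(np.mean((x - x.mean()) ** 2))

# A vehicle moves at near-constant speed; a walking person's limb
# velocity oscillates within each stride (modelled here as a sinusoid).
t = np.linspace(0.0, 2.0, 200)
vehicle_v = np.full_like(t, 15.0)                  # ~uniform motion
person_v = 1.5 + 1.0 * np.sin(2 * np.pi * 2 * t)   # periodic limb speed
print(speed_feature(vehicle_v))                    # ~0: constant frequency
print(speed_feature(person_v) > speed_feature(vehicle_v))
```

A near-zero standard deviation thus indicates vehicle-like uniform motion, while the periodic limb motion of a person yields a clearly larger value — the separation the classifier exploits.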
For better understanding of the scheme of the embodiment of the present application, the classifying operation in the embodiment of the present application is further described as follows:
In the embodiment of the application, the first classification, the second classification and the third classification can all be performed by adopting a method based on a support vector machine (Support Vector Machines, SVM).
The support vector machine (Support Vector Machines, SVM) is a pattern recognition technique based on statistical learning theory and the structural risk minimization principle. Its main idea is to map the samples of the input space into a high-dimensional feature space through a nonlinear transformation and to find, in that feature space, the optimal linearly separating classification plane for the samples.
The SVM classifier is derived from the optimal classification plane in the linearly separable case; its principle is shown in fig. 3.6. In the figure, the circles and squares represent the two classes of target samples, H is the classification line, and H1 and H2 are straight lines parallel to H passing through the samples of each class closest to H; d, the Euclidean distance between H1 and H2, is also called the classification interval. In a classification space of three or more dimensions, H is called the classification plane.
The optimal classification line is the one that maximizes the classification interval while separating the targets correctly. The dashed line in the figure can classify the targets correctly, but its classification interval is clearly not the maximum, so it is not the optimal classification line. H1 and H2 separate the targets correctly with the greatest classification interval, so H is the optimal classification line.
Let the equation of the classification surface be w·x + w0 = 0, and let the linearly separable sample set (xi, yi), i = 1, 2, ..., n, x ∈ R^d, y ∈ {-1, +1}, satisfy

yi[w·xi + w0] - 1 ≥ 0, i = 1, 2, ..., n (3.4)
From the point-to-plane distance formula, the classification interval is d = 2/‖w‖. Maximizing the interval is therefore equivalent to minimizing ‖w‖²/2. The classification plane that satisfies condition (3.4) and minimizes ‖w‖²/2 is the optimal classification plane, and the sample points lying on the two straight lines H1 and H2 are called support vectors.
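The optimal classification plane can be reproduced with an off-the-shelf SVM. The sketch below uses scikit-learn's SVC — an assumed library choice, since the embodiment names no implementation — with a large C to approximate the hard-margin case of condition (3.4) and synthetic 2-D samples:

```python
import numpy as np
from sklearn.svm import SVC

# Two linearly separable classes labelled -1 and +1, as in condition (3.4)
rng = np.random.default_rng(0)
x_neg = rng.normal(loc=[-2.0, -2.0], scale=0.3, size=(40, 2))
x_pos = rng.normal(loc=[2.0, 2.0], scale=0.3, size=(40, 2))
X = np.vstack([x_neg, x_pos])
y = np.array([-1] * 40 + [+1] * 40)

clf = SVC(kernel="linear", C=1e3)    # large C approximates the hard margin
clf.fit(X, y)

w, w0 = clf.coef_[0], clf.intercept_[0]
margin = 2.0 / np.linalg.norm(w)     # classification interval d = 2/||w||
print(margin)                        # width of the band between H1 and H2
print(len(clf.support_vectors_))     # only the points on H1/H2 define the plane
```

Only the support vectors — the samples lying on H1 and H2 — determine w and w0; removing any other training sample leaves the plane unchanged.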
Different inner product functions in the SVM classifier give rise to different algorithms; the three main kernel functions are the polynomial kernel, the Radial Basis Function (RBF) kernel, and the Sigmoid kernel.
For multi-class problems, two methods are available: "one-to-one" and "one-to-many".
The specific flow of the "one-to-one" approach is to construct all possible 2-class SVM classifiers, the training sample set of each classifier being taken only from the two corresponding classes, so that N = k(k-1)/2 2-class SVM classifiers must be constructed in total. When constructing the 2-class SVM classifier between class i and class j, the training data come only from those two classes, with the points of class i labelled +1 and those of class j labelled -1. During testing, the test data are fed into each of the N = k(k-1)/2 classifiers constructed above, the votes for each class are accumulated, and the class with the highest score is selected as the class of the test data.
The "one-to-many" method constructs k 2-class SVM classifiers according to the required number of classes k, one per class; the i-th classifier labels the samples of the i-th class +1 and all other samples -1. During testing, the decision function value of each of the k classifiers is computed for the test data, and the class with the largest function value is selected as the class of the test data.
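Both multi-class strategies are exposed by scikit-learn's SVC through the decision_function_shape parameter (an assumed library choice; note that SVC always trains the k(k-1)/2 pairwise classifiers internally, and "ovr" only reshapes the decision values into one score per class):

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(1)
centers = [(-3, 0), (3, 0), (0, 3), (0, -3)]      # k = 4 classes
X = np.vstack([rng.normal(c, 0.4, size=(30, 2)) for c in centers])
y = np.repeat([0, 1, 2, 3], 30)

# "one-to-one": k(k-1)/2 = 6 pairwise scores per sample
ovo = SVC(kernel="rbf", decision_function_shape="ovo").fit(X, y)
# "one-to-many": k = 4 scores, one per class; argmax gives the label
ovr = SVC(kernel="rbf", decision_function_shape="ovr").fit(X, y)

print(ovo.decision_function(X[:1]).shape)   # (1, 6)
print(ovr.decision_function(X[:1]).shape)   # (1, 4)
```

For the three-class person/vehicle/unknown setting suggested by this embodiment, k = 3 gives three pairwise classifiers in the one-to-one scheme and three one-versus-rest classifiers in the one-to-many scheme.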
In an embodiment of the application, target data may be acquired by a dual radar microwave traffic detector (Microwave Traffic Detector, MTD), noted as MTD data.
Preferably, before the feature values are extracted, the method further comprises the following steps:
The MTD data is first pre-processed ahead of false-alarm detection: a two-stage FFT is applied to the echo data, and the data of certain range cells and frequency channels are blanked (zeroed), which reduces the influence of clutter to the greatest extent. The echo data pre-processed by the two-stage FFT then undergoes constant-false-alarm-rate (CFAR) processing to decide whether a target is present and to obtain related information such as the target's range and spectrum. Radar echo data contains not only useful target information but also various kinds of interference, and radar target detection is carried out in these interference environments. Obtaining a high target detection probability from the echo information through CFAR processing is therefore of great significance for target feature extraction and recognition.
After the MTD data is subjected to pre-false alarm processing and constant false alarm processing, the influence of clutter can be effectively reduced, and the detection probability of a target is improved, so that the extraction and recognition of characteristic values are facilitated.
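As one common form of constant-false-alarm processing, a cell-averaging CFAR detector can be sketched as below. The embodiment does not specify which CFAR variant it uses, and the window sizes, threshold scale and simulated data are all assumptions for illustration:

```python
import numpy as np

def ca_cfar(power, n_train=8, n_guard=2, scale=4.0):
    """Cell-averaging CFAR: for each cell, estimate the noise level from
    the training cells on both sides (skipping guard cells) and declare a
    target when the cell power exceeds scale * estimate."""
    n = len(power)
    detections = np.zeros(n, dtype=bool)
    half = n_train // 2
    for i in range(half + n_guard, n - half - n_guard):
        lead = power[i - n_guard - half:i - n_guard]
        lag = power[i + n_guard + 1:i + n_guard + 1 + half]
        noise = np.mean(np.concatenate([lead, lag]))
        detections[i] = power[i] > scale * noise
    return detections

rng = np.random.default_rng(2)
power = rng.exponential(1.0, size=64)   # clutter/noise floor
power[30] += 40.0                       # strong target echo
hits = ca_cfar(power)
print(np.flatnonzero(hits))             # includes cell 30
```

Because the threshold adapts to the local noise estimate, the false-alarm rate stays roughly constant as the clutter level varies, which is what makes the subsequent feature extraction reliable.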
In the following, with reference to the application scenario shown in fig. 4, and taking people and vehicles as the classification targets, an application of the target classification recognition method provided by the embodiment of the application is described as follows:
1) And collecting target data, and recording the target data as MTD data.
2) And performing pre-false alarm processing and constant false alarm processing on the MTD data.
3) Feature extraction is performed using the relative RCS, speed and Doppler spectrum entropy algorithms, where the relative RCS feature is the relative RCS value, the speed feature is the instantaneous-frequency feature quantity, and the Doppler spectrum entropy feature is the Doppler spectrum entropy value.
4) Different conditions are set through the SVM classifier to classify different features;
The first discrimination element, speed, derives from a preset demarcation point for human motion: V = 10 m/s is used as the boundary; targets faster than 10 m/s are preliminarily judged to be vehicles, while targets slower than 10 m/s may be either people or vehicles.
5) Targets with V > 10 m/s are further verified with the SVM classifier on the RCS feature: if it agrees, the target is judged to be a vehicle; if not, the SVM classifier on the spectral entropy feature is applied, with agreement yielding a vehicle and disagreement an unknown item.
6) Targets with V < 10 m/s are further verified with the SVM classifier on the RCS feature: if it indicates a person, the target is judged to be a person; if not, and if V < 2 m/s and RCS < 224 are both satisfied, the target is judged to be a vehicle. If V < 2 m/s and RCS < 224 are not satisfied, the SVM classifier on the spectral entropy feature is applied: agreement yields a person, disagreement an unknown item.
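The decision cascade of steps 4) to 6) can be sketched as plain control flow. The callables standing in for the trained SVM classifiers, and their string-returning interface, are assumptions made for illustration:

```python
def classify(v, rcs, svm_rcs, svm_entropy):
    """Cascaded discrimination of steps 4)-6); a sketch, not the embodiment.

    v: target speed in m/s; rcs: relative RCS value.
    svm_rcs / svm_entropy: callables standing in for the SVM classifiers
    trained on the relative-RCS and spectral-entropy features.
    """
    if v > 10.0:                                   # step 4): fast -> likely vehicle
        if svm_rcs() == "vehicle":                 # step 5): RCS verification
            return "vehicle"
        return "vehicle" if svm_entropy() == "vehicle" else "unknown"
    if svm_rcs() == "person":                      # step 6): slow-target branch
        return "person"
    if v < 2.0 and rcs < 224:                      # very slow, small RCS
        return "vehicle"
    return "person" if svm_entropy() == "person" else "unknown"

print(classify(15.0, 300, lambda: "vehicle", lambda: "vehicle"))  # vehicle
print(classify(5.0, 100, lambda: "person", lambda: "person"))     # person
```

The structure makes the design choice visible: speed acts as a cheap first gate, and the two SVM verifications only refine the branch the speed test selects.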
The comprehensive target recognition rate based on the relative RCS and the speed features is 98.96%, the comprehensive target recognition rate based on the Doppler spectrum entropy and the speed features is 97.82%, and the comprehensive target recognition rate based on the relative RCS, the Doppler spectrum entropy and the speed features is 99.62%, so that the accuracy of target classification recognition is obviously improved.
It can be seen that the relative RCS feature or the Doppler spectrum entropy feature alone can identify vehicles and people reasonably well, but the recognition rate is not very high. The embodiment of the application extracts features using the relative RCS, speed and Doppler spectrum entropy algorithms, and selects speed as the first discrimination element mainly because, in a complex ground environment, using an SVM on the relative RCS alone or on the spectral entropy feature alone to discriminate between uniform-speed and variable-speed moving objects gives a larger misjudgment rate. The SVM on the relative RCS is chosen as the second discrimination element mainly because, in the SVM experiments, the accuracy of relative RCS plus speed exceeded that of spectral entropy plus speed.
A special case arises when a person carrying a metal article resembling a corner reflector is detected by a radar of a certain model. When the radar's electromagnetic wave scans such an article, the wave is reflected and amplified at the metal corner, producing a strong echo signal that the radar receives. As a result, the RCS value returned by a human-sized target becomes large and is easily confused with that of a large target such as a vehicle, increasing the misjudgment rate.
In this situation, choosing speed as the first discrimination element is more accurate and improves the accuracy of target identification.
In particular, the method accurately detects ground moving objects using multiple combined features based on relative RCS, Doppler spectrum entropy and speed, even with a low-resolution radar, thereby reducing equipment replacement and project costs; compared with any single method based on relative RCS, Doppler spectrum entropy or speed alone, the combined features improve the recognition accuracy. Combining the three algorithms with multi-level hierarchical SVM verification reduces the misjudgment rate for uniform-speed and variable-speed moving objects in a complex ground environment and improves the anti-interference capability.
In the embodiment of the application, the first, second and third characteristic values are all drawn from the speed characteristic value, the Doppler spectrum entropy characteristic value and the RCS characteristic value, and are characteristic values of different types, so they can be assigned as appropriate in practice. For the case where the targets are vehicles and people, it is preferable that the first characteristic value be the speed characteristic value, the second the RCS characteristic value, and the third the Doppler spectrum entropy characteristic value.
In order to demonstrate the effect of the embodiments of the present application, a related experiment was performed, as follows:
1. target classification recognition result based on RCS characteristics and speed
Based on the data acquired by two outfield experiments of a radar of a certain model, the relative RCS characteristics of 126 vehicles and the relative RCS characteristics of 126 persons are extracted in total through the experiments, and the distribution situation of the RCS characteristics of the vehicles and the persons is shown in fig. 5.1. Fig. 5.2 is a relative RCS distribution histogram of a car and a person.
The RCS values of the first and second halves of fig. 5.1 differ significantly; a preliminary judgment is that this reflects the influence of the pulse repetition period, which is 109 µs in the first half and 108 µs in the second half.
The classification and identification flow is shown in fig. 5.3.
From the 126 vehicle and 126 person RCS feature samples, 100 vehicle samples and 100 person samples were randomly taken as training samples and the remaining 52 as test samples; 100 random experiments were performed in total, with the results shown in Table 4.1. As the table shows, vehicles and people can be classified well based on speed and the relative RCS feature, with a recognition rate above 90%.
TABLE 4.1 vehicle and person classification results based on relative RCS characteristics
It should be noted that 100 random experiments were performed in total: 57 is the number of occurrences of a 100% recognition rate, 32 of a 98.08% rate, and 11 of a 96.15% rate, so the comprehensive recognition rate is (100%×57 + 98.08%×32 + 96.15%×11)/100 = 98.96%.
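The comprehensive rate is simply the count-weighted average of the per-trial recognition rates, which a two-line check confirms:

```python
# Occurrence counts of each per-trial recognition rate over 100 random trials
counts = {100.00: 57, 98.08: 32, 96.15: 11}
overall = sum(rate * n for rate, n in counts.items()) / sum(counts.values())
print(round(overall, 2))   # 98.96
```

The same weighted average applied to the 200-trial results of the three-feature experiment (171 trials at 100%, 29 at 97.37%) yields the 99.62% figure reported later.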
2. Target classification recognition result based on Doppler spectrum entropy characteristics and speed
Based on the data acquired in two outfield experiments with a radar of a certain model, the Doppler spectrum entropy of each target was calculated using the 20 Doppler cells on either side of the cell where the target is located; the Doppler spectrum entropies of 109 vehicles and 109 people were extracted, with their feature distributions shown in fig. 6.1. As the figure shows, the spectral entropy features of people and vehicles overlap little, so people and vehicles can be distinguished well with this feature. The classification and identification flow is shown in fig. 6.2.
From the 109 Doppler spectrum entropy features of people and of vehicles, 90 samples each were randomly taken as training samples and the remaining 38 as test samples; 100 random tests were performed, with the results shown in table 4.2:
TABLE 4.2 vehicle and person classification results based on speed and Doppler Spectrum entropy characterization
For the meaning of the parameters in the table and the calculation of the comprehensive recognition rate, refer to the remarks under table 4.1.
From the test results, the vehicle can be better distinguished from the person by extracting the speed and Doppler spectrum entropy characteristics of the target.
3. Target classification recognition result based on speed, relative RCS and Doppler spectrum entropy combined characteristics
Based on the data acquired in two outfield experiments with a radar of a certain model, 109 samples of the three features — relative RCS, speed and Doppler spectrum entropy — were extracted for people, and likewise 109 samples of the three features for vehicles. When the SVM classifier is adopted for classification, 90 of the 109 feature samples of people and of vehicles are randomly taken as training samples and the remaining 38 as test samples. A total of 200 random experiments were performed; the results are shown in table 4.3.
TABLE 4.3 human-vehicle recognition results based on three features of speed, spectral entropy and RCS
Here, 200 random experiments were performed in total: 171 is the number of occurrences of a 100% recognition rate and 29 of a 97.37% rate, so the comprehensive recognition rate is (100%×171 + 97.37%×29)/200 = 99.62%.
Comparing table 4.3 with table 4.1 and table 4.2, it is known that the overall recognition rate of target recognition by combining the three features is higher than that of the speed plus single feature, and the probability of occurrence of 100% recognition rate is also greatly improved.
The comprehensive target recognition rate based on the two features of relative RCS and speed is 98.96%, that based on the two features of Doppler spectrum entropy and speed is 97.82%, and that based on the three combined features of relative RCS, Doppler spectrum entropy and speed is 99.62%. That is, classification of vehicles and people can be realized well with a single feature plus speed or two features plus speed; recognition with the combined features requires more computation than with a single feature, but achieves higher recognition accuracy.
Corresponding to the method of the above embodiment, fig. 8 shows a block diagram of the classification recognition device of the object provided by the embodiment of the present application, and for convenience of explanation, only the portion relevant to the embodiment of the present application is shown. The classification recognition device of the object illustrated in fig. 8 may be an execution subject of the classification recognition method of the object provided in the first embodiment.
Referring to fig. 8, the classification recognition apparatus of the object includes:
An extracting module 81, configured to extract a feature value from target data acquired by the radar, where the feature value includes at least a first feature value and a second feature value;
The classification recognition module 82 is configured to classify the target for the first time according to the first feature value, classify the target for the second time according to the second feature value, and recognize the class of the target by combining the results of at least two classification.
The process of implementing respective functions by each module in the classification and identification device for ground targets provided in the embodiment of the present application may refer to the description in the related embodiments of the foregoing method, and will not be repeated here.
It should be understood that the sequence numbers of the steps in the foregoing embodiments do not imply an order of execution; the execution order of the processes should be determined by their functions and internal logic, and should not constitute any limitation on the implementation of the embodiments of the present application.
It should be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It should also be understood that the term "and/or" as used in the present specification and the appended claims refers to any and all possible combinations of one or more of the associated listed items, and includes such combinations.
As used in the present description and the appended claims, the term "if" may be interpreted as "when..once" or "in response to a determination" or "in response to detection" depending on the context. Similarly, the phrase "if a determination" or "if a [ described condition or event ] is detected" may be interpreted in the context of meaning "upon determination" or "in response to determination" or "upon detection of a [ described condition or event ]" or "in response to detection of a [ described condition or event ]".
Furthermore, the terms "first," "second," "third," and the like in the description of the present specification and in the appended claims, are used for distinguishing between descriptions and not necessarily for indicating or implying a relative importance. It will also be understood that, although the terms "first," "second," etc. may be used herein in some embodiments of the application to describe various elements, these elements should not be limited by these terms. These terms are only used to distinguish one element from another element. For example, a first table may be named a second table, and similarly, a second table may be named a first table without departing from the scope of the various described embodiments. The first table and the second table are both tables, but they are not the same table.
Reference in the specification to "one embodiment" or "some embodiments" or the like means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the application. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," and the like in the specification are not necessarily all referring to the same embodiment, but mean "one or more but not all embodiments" unless expressly specified otherwise. The terms "comprising," "including," "having," and variations thereof mean "including but not limited to," unless expressly specified otherwise.
The classification and identification method of the ground target provided by the embodiment of the application can be applied to terminal equipment such as mobile phones, tablet computers, wearable equipment, vehicle-mounted equipment, Augmented Reality (AR)/Virtual Reality (VR) equipment, notebook computers, Ultra-Mobile Personal Computers (UMPC), netbooks, Personal Digital Assistants (PDA) and the like; the embodiment of the application does not limit the specific type of the terminal equipment.
For example, the terminal device may be a Station (STA) in a WLAN, which may be a cellular telephone, a cordless telephone, a Session Initiation Protocol (SIP) telephone, a Wireless Local Loop (WLL) station, a Personal Digital Assistant (PDA) device, a handheld device with wireless communication capabilities, a computing device or other processing device connected to a wireless modem, an in-vehicle device, a car networking terminal, a computer, a laptop computer, a handheld communication device, a handheld computing device, a satellite radio, a wireless modem card, a television Set Top Box (STB), Customer Premise Equipment (CPE) and/or another device for communication over a wireless system as well as next-generation communication systems, such as a mobile terminal in a 5G network or in a future evolved Public Land Mobile Network (PLMN), etc.
By way of example and not limitation, when the terminal device is a wearable device, the wearable device may also be a general term for devices developed by applying wearable technology to the intelligent design of everyday wear, such as glasses, gloves, watches, apparel and shoes. A wearable device is a portable device worn directly on the body or integrated into the user's clothing or accessories. Wearable devices are not merely hardware; they realize powerful functions through software support, data interaction and cloud interaction. In a broad sense, wearable intelligent devices include full-featured, large-sized devices that can realize complete or partial functions without relying on a smartphone, such as smart watches or smart glasses, as well as devices that focus only on a certain type of application function and must be used together with other devices such as a smartphone, for example various smart bracelets and smart jewelry for sign monitoring.
Fig. 7 is a schematic structural diagram of a terminal device according to an embodiment of the present application. As shown in fig. 7, the terminal device 7 of this embodiment comprises at least one processor 70 (only one shown in fig. 7), a memory 71, said memory 71 having stored therein a computer program 72 executable on said processor 70. The processor 70, when executing the computer program 72, performs the steps of the classification recognition embodiment of the respective objectives described above, such as steps 11 to 14 shown in fig. 1. Or the processor 70, when executing the computer program 72, performs the functions of the modules/units of the apparatus embodiments described above, such as the functions of the modules 81-82 shown in fig. 8.
The terminal device 7 may be a computing device such as a desktop computer, a notebook computer, a palm computer, a cloud server, etc. The terminal device may include, but is not limited to, a processor 70, a memory 71. It will be appreciated by those skilled in the art that fig. 7 is merely an example of the terminal device 7 and does not constitute a limitation of the terminal device 7, and may include more or less components than illustrated, or may combine certain components, or different components, e.g., the terminal device may further include an input transmitting device, a network access device, a bus, etc.
The processor 70 may be a Central Processing Unit (CPU), or another general-purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, discrete hardware components, or the like. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor or the like.
The memory 71 may in some embodiments be an internal storage unit of the terminal device 7, such as a hard disk or a memory of the terminal device 7. The memory 71 may also be an external storage device of the terminal device 7, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card or a flash card provided on the terminal device 7. Further, the memory 71 may include both an internal storage unit and an external storage device of the terminal device 7. The memory 71 is used for storing the operating system, application programs, a boot loader (BootLoader), data and other programs, such as the program code of the computer program; the memory 71 may also be used for temporarily storing data that has been or is to be transmitted.
In addition, each functional unit in the embodiments of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The embodiment of the application also provides a terminal device, which comprises at least one memory, at least one processor and a computer program stored in the at least one memory and capable of running on the at least one processor, wherein the processor executes the computer program to enable the terminal device to realize the steps in any of the method embodiments.
Embodiments of the present application also provide a computer readable storage medium storing a computer program which, when executed by a processor, implements steps for implementing the various method embodiments described above.
Embodiments of the present application provide a computer program product enabling a terminal device to carry out the steps of the method embodiments described above when the computer program product is run on the terminal device.
The integrated modules/units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer readable storage medium. Based on such understanding, the present application may implement all or part of the flow of the method of the above embodiment, or may be implemented by a computer program to instruct related hardware, where the computer program may be stored in a computer readable storage medium, and when the computer program is executed by a processor, the computer program may implement the steps of each of the method embodiments described above. Wherein the computer program comprises computer program code which may be in source code form, object code form, executable file or some intermediate form etc. The computer readable medium may include any entity or device capable of carrying the computer program code, a recording medium, a U disk, a removable hard disk, a magnetic disk, an optical disk, a computer Memory, a Read-Only Memory (ROM), a random access Memory (Random Access Memory, RAM), an electrical carrier signal, a telecommunications signal, a software distribution medium, and so forth.
In the foregoing embodiments, each embodiment is described with its own emphasis; for parts that are not described or illustrated in detail in a particular embodiment, reference may be made to the related descriptions of the other embodiments.
Those of ordinary skill in the art will appreciate that the various illustrative units and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or as a combination of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and the design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
The units described as separate components may or may not be physically separate, and components shown as units may or may not be physical units; they may be located in one place or distributed across a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
The foregoing embodiments are merely illustrative of the technical solutions of the present application and are not restrictive. Although the present application has been described in detail with reference to the foregoing embodiments, those skilled in the art should understand that the technical solutions described in the foregoing embodiments may still be modified, or some of their technical features may be replaced by equivalents, and that such modifications or substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present application.

Claims (7)

1. A method for classifying and identifying objects, comprising:
extracting characteristic values from target data acquired by a radar, wherein the characteristic values comprise at least a first characteristic value and a second characteristic value, and the first characteristic value is a speed characteristic value;
classifying the target for a first time according to the first characteristic value, including classifying the target as a vehicle for the first time when the speed characteristic value is greater than a first preset value, and classifying it as uncertain when the speed characteristic value is not greater than the first preset value, where uncertain means the target may be either a vehicle or a person;
classifying the target for a second time according to the second characteristic value;
combining the results of at least two classifications and identifying the category of the target, including: classifying the target as a vehicle for the first time when the speed characteristic value is greater than the first preset value;
when the first classification is a vehicle and the second classification is not a vehicle, or when the first classification is uncertain, the second classification is not a person, and the speed characteristic value and the second characteristic value are not both within further limiting conditions, classifying the target for a third time according to a third characteristic value;
the combining the results of at least two classifications and identifying the category of the target includes:
when the first classification is uncertain, the second classification is not a person, and the speed characteristic value and the second characteristic value are not both within the further limiting conditions, if the third classification is a person, identifying the category of the target as a person.
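The cascade of decisions in claims 1, 2, and 4 can be sketched in code. The function below is an illustrative reading of the claims, not an implementation from the patent: the threshold, the stage classifiers, and the "further limiting conditions" predicate are all hypothetical placeholders supplied by the caller.

```python
# Hypothetical sketch of the multi-stage classification cascade described
# in claims 1, 2 and 4. All thresholds and stage classifiers are
# illustrative assumptions, not values taken from the patent.

VEHICLE, PERSON, UNCERTAIN = "vehicle", "person", "uncertain"

def classify_target(speed, second_value, third_value,
                    second_stage, third_stage, in_further_limits,
                    speed_threshold=5.0):
    """Combine up to three classification stages into one category."""
    # First classification: a simple speed threshold (claim 1).
    first = VEHICLE if speed > speed_threshold else UNCERTAIN

    # Second classification on the second characteristic value.
    second = second_stage(second_value)

    if first == VEHICLE and second == VEHICLE:
        return VEHICLE
    if first == UNCERTAIN and second == PERSON:
        return PERSON
    if first == UNCERTAIN:
        # Both values within the further limiting conditions:
        # identify the target as a vehicle (claim 2).
        if in_further_limits(speed, second_value):
            return VEHICLE
        # Otherwise fall through to a third classification (claims 1, 4).
        return PERSON if third_stage(third_value) == PERSON else VEHICLE
    # First classification vehicle, second not vehicle: third stage
    # decides whether the target is confirmed as a vehicle (claim 4).
    return VEHICLE if third_stage(third_value) == VEHICLE else PERSON
```

A fast target whose second-stage result agrees is identified as a vehicle directly; only the ambiguous branches pay for the third classification.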
2. The method of claim 1, wherein when the first classification is uncertain and the second classification is not a person, the method further comprises:
when the speed characteristic value and the second characteristic value are both within the further limiting conditions, identifying the category of the target as a vehicle.
3. The method of claim 1, wherein,
the classifying the target for a second time according to the second characteristic value comprises:
classifying the second characteristic value of the target for the second time through an SVM classifier;
and the classifying the target for a third time according to the third characteristic value comprises:
classifying the third characteristic value of the target for the third time through the SVM classifier.
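Claim 3 names an SVM classifier for the second and third stages but does not specify it further. As an illustration only, once such a model has been trained offline (for instance with scikit-learn's `sklearn.svm.SVC` on a linear kernel), inference reduces to the sign of a weighted sum; every weight below is a hypothetical stand-in:

```python
# Minimal linear-SVM decision rule for one characteristic-value vector.
# The weights and bias are hypothetical stand-ins for a model trained
# offline; the patent does not disclose the actual kernel or parameters.

def svm_classify(features, weights, bias):
    """Return 'person' or 'vehicle' from the sign of the decision function."""
    score = sum(w * x for w, x in zip(weights, features)) + bias
    return "person" if score > 0 else "vehicle"

# Example: second classification on a hypothetical 2-element feature
# vector derived from the second characteristic value.
weights = [0.8, -0.3]   # learned separating hyperplane (illustrative)
bias = -0.1
label = svm_classify([1.0, 0.5], weights, bias)  # score 0.55 -> "person"
```

The same decision rule serves the third classification by swapping in the weights trained on the third characteristic value, which keeps both SVM stages behind one interface.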
4. The method of claim 1, wherein the characteristic values further comprise a third characteristic value, the method further comprising:
when the first classification is a vehicle and the second classification is not a vehicle, or when the first classification is uncertain and the second classification is not a person, classifying the target for a third time according to the third characteristic value;
the combining the results of at least two classifications and identifying the category of the target includes:
when the first classification is a vehicle and the second classification is not a vehicle, if the third classification is a vehicle, identifying the category of the target as a vehicle; and when the first classification is uncertain and the second classification is not a person, if the third classification is a person, identifying the category of the target as a person.
5. A classification recognition device for a target, comprising:
an extraction module, configured to extract characteristic values from target data acquired by a radar, wherein the characteristic values comprise at least a first characteristic value and a second characteristic value, and the first characteristic value is a speed characteristic value;
a classification recognition module, configured to: classify the target for a first time according to the first characteristic value, including classifying the target as a vehicle for the first time when the speed characteristic value is greater than a first preset value, and classifying it as uncertain when the speed characteristic value is not greater than the first preset value, where uncertain means the target may be either a vehicle or a person; classify the target for a second time according to the second characteristic value; and identify the category of the target by combining the results of at least two classifications, including: identifying the category of the target as a vehicle when the first classification is a vehicle and the second classification is a vehicle; identifying the category of the target as a person when the first classification is uncertain and the second classification is a person; classifying the target for a third time according to a third characteristic value when the first classification is a vehicle and the second classification is not a vehicle, or when the first classification is uncertain, the second classification is not a person, and the speed characteristic value and the second characteristic value are not both within further limiting conditions, wherein the third characteristic value is an RCS characteristic value; and, when the first classification is uncertain and the second classification is not a person, if the third classification is a person, identifying the category of the target as a person.
6. A terminal device, characterized in that it comprises a memory and a processor, the memory storing a computer program executable on the processor, wherein the processor executes the computer program to carry out the steps of the method according to any one of claims 1 to 4.
7. A computer-readable storage medium storing a computer program, characterized in that the computer program, when executed by a processor, implements the steps of the method according to any one of claims 1 to 4.
CN202110976784.1A 2021-08-24 2021-08-24 Classification recognition method and device for targets and terminal equipment Active CN113657329B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110976784.1A CN113657329B (en) 2021-08-24 2021-08-24 Classification recognition method and device for targets and terminal equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110976784.1A CN113657329B (en) 2021-08-24 2021-08-24 Classification recognition method and device for targets and terminal equipment

Publications (2)

Publication Number Publication Date
CN113657329A CN113657329A (en) 2021-11-16
CN113657329B true CN113657329B (en) 2025-08-01

Family

ID=78492744

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110976784.1A Active CN113657329B (en) 2021-08-24 2021-08-24 Classification recognition method and device for targets and terminal equipment

Country Status (1)

Country Link
CN (1) CN113657329B (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110031816A (en) * 2019-03-22 2019-07-19 中国民航科学技术研究院 Based on the Flying Area in Airport noncooperative target classifying identification method for visiting bird radar
CN111523515A (en) * 2020-05-13 2020-08-11 北京百度网讯科技有限公司 Evaluation method, equipment and storage medium for environmental cognitive ability of autonomous vehicle

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102009047390A1 (en) * 2009-12-02 2011-06-09 Robert Bosch Gmbh Method and control device for determining a direction of movement an object to be moved onto a vehicle
US9664779B2 (en) * 2014-07-03 2017-05-30 GM Global Technology Operations LLC Object classification for vehicle radar systems
DE102017209496B4 (en) * 2017-06-06 2020-10-01 Robert Bosch Gmbh Method and device for classifying an object for a vehicle
CN107886121A (en) * 2017-11-03 2018-04-06 北京清瑞维航技术发展有限公司 Target identification method, apparatus and system based on multiband radar
CN111476099B (en) * 2020-03-09 2024-04-16 深圳市人工智能与机器人研究院 Target detection method, target detection device and terminal equipment
CN111693953B (en) * 2020-05-11 2023-12-05 中山大学 Target classification and recognition model, method, system and device based on micro-Doppler
CN112085063B (en) * 2020-08-10 2023-10-13 深圳市优必选科技股份有限公司 Target identification method, device, terminal equipment and storage medium


Also Published As

Publication number Publication date
CN113657329A (en) 2021-11-16

Similar Documents

Publication Publication Date Title
Amin et al. Hand gesture recognition based on radar micro-Doppler signature envelopes
Lin et al. WiAU: An accurate device-free authentication system with ResNet
Björklund et al. Features for micro‐Doppler based activity classification
Fan et al. TagFree activity identification with RFIDs
CN115205891B (en) Personnel behavior recognition model training method, behavior recognition method and device
CN111914919A (en) Individual identification method of open set radiation source based on deep learning
Huynh-The et al. RF-UAVNet: High-performance convolutional network for RF-based drone surveillance systems
CN106559749B (en) Multi-target passive positioning method based on radio frequency tomography
CN115508821A (en) Multisource fuses unmanned aerial vehicle intelligent detection system
CN115937977B (en) Multi-dimensional feature fusion-based few-sample human body action recognition method
Huang et al. YOLO-ORE: A deep learning-aided object recognition approach for radar systems
Wang et al. A survey of hand gesture recognition based on FMCW radar
CN112198507A (en) Method and device for detecting human body falling features
CN112213697B (en) Feature fusion method for radar deception jamming recognition based on Bayesian decision theory
Xie et al. Lightweight midrange arm-gesture recognition system from mmwave radar point clouds
CN119830149A (en) Radiation source open set identification method and device based on dynamic group constant-change network and generation countermeasure network
Xue et al. Radio frequency identification for drones using spectrogram and CNN
Qiao et al. Person identification with low training sample based on micro-doppler signatures separation
Hayajneh et al. Channel state information based device free wireless sensing for IoT devices employing TinyML
CN112327286B (en) Method, device, equipment and storage medium for classifying daily activities under low complexity
Sim et al. Road environment recognition for automotive FMCW radar systems through convolutional neural network
Alizadeh et al. Characterization and selection of wifi channel state information features for human activity detection in a smart public transportation system
CN113657329B (en) Classification recognition method and device for targets and terminal equipment
Fan et al. Multiple object activity identification using RFIDs: A multipath-aware deep learning solution
Nurhidayat et al. Comparison of Fundamental Radar Features for Differentiating Between Walking and Standing in Horizontal and Vertical Movement Directions.

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant