
CN115239770A: A Kernel Correlation Filtering Target Tracking Method and System for Severely Occluded Scenes

Info

Publication number: CN115239770A
Authority: CN (China)
Prior art keywords: target, frame, hsv, tracking, lbp
Legal status: Granted
Application number: CN202210893435.8A
Other languages: Chinese (zh)
Other versions: CN115239770B (en)
Inventors: 贾刚勇, 文子强, 饶欢乐, 陈宇星, 徐宏
Current Assignee: Hangzhou Dianzi University
Original Assignee: Hangzhou Dianzi University
Application filed by Hangzhou Dianzi University
Priority to CN202210893435.8A (granted as CN115239770B)
Publication of CN115239770A
Application granted; publication of CN115239770B
Legal status: Active


Classifications

    • G: PHYSICS
    • G06: COMPUTING OR CALCULATING; COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00: Image analysis
    • G06T7/20: Analysis of motion
    • G06T7/277: Analysis of motion involving stochastic approaches, e.g. using Kalman filters
    • G06T7/246: Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • G06T7/248: Analysis of motion using feature-based methods involving reference images or patches


Abstract

The invention relates to a kernel correlation filtering target tracking method and system for severely occluded scenes. The method comprises the following steps. Step 1: initialize the KCF algorithm of the KCF filter and the Kalman filtering algorithm of the Kalman filter, taking the target position in the first frame as the initial position for both algorithms. Step 2: use an occlusion detection mechanism to judge whether the target is severely occluded in the current image; if not, adopt the tracking result of the KCF algorithm as the position of the target in the current image; if so, adopt the tracking result of the Kalman filtering algorithm instead. Step 3: adaptively update the Kalman filter and the KCF filter according to their respective tracking results. Step 4: read the next video frame and return to step 2, until the tracking process ends. The invention can still track the target even when it is severely occluded.

Description

A Kernel Correlation Filtering Target Tracking Method and System for Severely Occluded Scenes

Technical Field

The invention belongs to the technical field of target tracking, and in particular relates to a kernel correlation filtering target tracking method and system for severely occluded scenes.

Background

Target tracking algorithms can be broadly divided into generative and discriminative approaches; the main difference between the two lies in how the target model is built.

Generative algorithms: these methods first build the target tracking model by extracting target features from the first frame, then extract candidate samples of the target in subsequent video frames and compare them with the target model via similarity detection. The sample most similar to the target model is taken as the tracked target position, and finally the target information is updated into the target model. Such methods have obvious drawbacks: they make poor use of image background information, and the target's own appearance varies randomly and in diverse ways. Concretely, under illumination changes, motion blur, low resolution, or target rotation and deformation, the construction of the model is severely affected, which degrades tracking accuracy; and because model construction includes no effective prediction mechanism, these methods cope poorly when the target becomes occluded.

Discriminative algorithms: these methods first train the target tracking model using positive samples of the target and negative samples of the background, then extract candidate regions in subsequent video frames, select the optimal candidate region as the target position through the discriminative model, and finally update the discriminative model accordingly. Compared with generative methods, discriminative methods use background information to train a classifier, giving the classifier stronger discrimination ability and better separation of foreground from background. Discriminative methods therefore generally outperform generative ones, track more robustly, and have gradually become mainstream in the target tracking field.

The kernel correlation filter (KCF) tracking algorithm is a classic discriminative target tracking algorithm that copes well with small-angle rotation of the target, slight occlusion, and some other disturbances. Its main idea is to train a discriminative classifier on given samples and to obtain a similarity heat map by computing the cyclic correlation between the features of the search region and those of the target region; the location of the maximum on the heat map corresponds to the target's displacement between two adjacent frames, which determines the target's trajectory. The algorithm samples via circulant matrices, which allows the fast Fourier transform to accelerate the computation. The workflow of the traditional kernel correlation filtering algorithm can be summarized as follows:

In the initial image, initialize the correlation target tracking filter according to the given target feature information;

Read in each subsequent video frame, extract candidate sample features according to the position and size of the target in the previous frame, apply a cosine window to the sample features to reduce boundary effects, and then apply the fast Fourier transform to convert the candidate sample information from the time domain to the frequency domain;

Compute the correlation between the candidate sample features and the target filter in the frequency domain, obtain the Gaussian response map via the inverse fast Fourier transform, and take the position with the largest response in the map as the tracked target position;

Through an online model update strategy, update the new target appearance information into the target tracking filter and the target feature model, so that the kernel correlation filter model can adapt to changes of the target.
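As a rough sketch of the detection step just described, the code below correlates a cosine-windowed template with a search patch in the frequency domain and takes the response peak. The plain linear correlation and the function names are simplifications of ours; KCF proper uses a kernelized correlation with a trained filter.

```python
import numpy as np

def response_map(template, patch):
    """Frequency-domain cross-correlation of a cosine-windowed, zero-mean
    template with a search patch (a simplified stand-in for the
    kernelized correlation used by KCF)."""
    win = np.hanning(template.shape[0])[:, None] * np.hanning(template.shape[1])[None, :]
    t = (template - template.mean()) * win   # cosine window reduces boundary effects
    p = (patch - patch.mean()) * win
    # correlation theorem: corr(t, p) = IFFT(conj(FFT(t)) * FFT(p))
    resp = np.real(np.fft.ifft2(np.conj(np.fft.fft2(t)) * np.fft.fft2(p)))
    return np.fft.fftshift(resp)             # put zero displacement at the centre

rng = np.random.default_rng(0)
frame = rng.random((64, 64))
template = frame[16:48, 16:48]
resp = response_map(template, frame[16:48, 16:48])   # patch identical to template
peak = tuple(int(i) for i in np.unravel_index(np.argmax(resp), resp.shape))
print(peak)  # (16, 16): zero displacement, the centre of the 32x32 map
```

When the patch matches the template, the response peaks at zero displacement; as the target moves, the peak shifts accordingly, giving the inter-frame displacement.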

Although the kernel correlation filter tracking algorithm can track effectively under slight occlusion, with a certain degree of interference resistance and robustness, it fails when the target is severely occluded. Therefore, for severely occluded scenes, the existing kernel correlation filtering algorithm urgently needs to be optimized.

Summary of the Invention

In scenes where the target is severely occluded, the kernel correlation filter tracking algorithm has no occlusion detection mechanism and no update-stop mechanism, so it cannot distinguish the occluder from the target; a large amount of occluder feature information is updated into the target model, polluting it and ultimately causing the tracking process to fail. The purpose of the present invention is to provide a kernel correlation filtering target tracking method and system for severely occluded scenes, so as to solve the problem of tracking drift when the target is severely occluded.

In order to achieve the above purpose, the present invention adopts the following technical solutions:

A kernel correlation filter tracking method for severely occluded scenes, comprising the following steps:

Step 1: initialize the KCF algorithm of the KCF filter and the Kalman filtering algorithm of the Kalman filter, taking the target position in the first frame as the initial position for both algorithms;

Step 2: use the occlusion detection mechanism to judge whether the target is severely occluded in the current image; if not, adopt the tracking result of the KCF algorithm as the position of the target object in the current image; if so, adopt the tracking result of the Kalman filtering algorithm instead;

Step 3: adaptively update the Kalman filter and the KCF filter according to their respective tracking results;

Step 4: read the next video frame and return to step 2, until the tracking process ends.
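The four steps above can be sketched as a single dispatch loop. The callables are hypothetical stand-ins for the real KCF tracker, Kalman filter, and occlusion detector:

```python
def track(frames, kcf_track, kalman_predict, kalman_correct, is_severely_occluded):
    """Skeleton of steps 2-4: choose KCF or Kalman per frame, then update."""
    positions = []
    for frame in frames:
        prediction = kalman_predict()            # Kalman always predicts forward
        if is_severely_occluded(frame):
            position = prediction                # severe occlusion: trust Kalman
        else:
            position = kcf_track(frame)          # normal case: trust KCF
            kalman_correct(position)             # feed KCF result back as measurement
        positions.append(position)
    return positions

# toy run: frame 2 is "occluded", so the Kalman prediction (99, 99) is used there
out = track(
    frames=[1, 2, 3],
    kcf_track=lambda f: (f, f),
    kalman_predict=lambda: (99, 99),
    kalman_correct=lambda z: None,
    is_severely_occluded=lambda f: f == 2,
)
print(out)  # [(1, 1), (99, 99), (3, 3)]
```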

As a preferred solution, in step 2, a joint index occlusion detection mechanism is used to process the image, so as to judge whether the target in the current image is severely occluded.

As a preferred solution, the joint index occlusion detection mechanism is:

(1) Similarity judgment based on fused features

The feature fusion index f is constructed from HSV and LBP features, as follows:

Based on the detection region of the KCF algorithm, extract the HSV and LBP features of the target region and the background region respectively, and normalize the extracted pairs (HSV_target, HSV_background) and (LBP_target, LBP_background) using the following formulas. Denote the feature vector to be processed as $x=(x_1,x_2,\ldots,x_{n-1},x_n)^T$ and the normalized feature vector as $x^*$:

$$\|x\|_2=\sqrt{x_1^2+x_2^2+\cdots+x_n^2}$$

$$x^*=\frac{x}{\|x\|_2}$$

After obtaining the normalized feature vectors, compute the Euclidean distance $D_{HSV}$ between $HSV_{target}$ and $HSV_{background}$, and the Euclidean distance $D_{LBP}$ between $LBP_{target}$ and $LBP_{background}$:

$$D_{HSV}=\sqrt{\sum_{i=1}^{n}\left(HSV_{target}^{\,i}-HSV_{background}^{\,i}\right)^2}$$

$$D_{LBP}=\sqrt{\sum_{i=1}^{n}\left(LBP_{target}^{\,i}-LBP_{background}^{\,i}\right)^2}$$

where $HSV_{target}^{\,i}$ denotes the i-th component of the target's HSV feature vector, $LBP_{target}^{\,i}$ the i-th component of the target's LBP feature vector, $HSV_{background}^{\,i}$ the i-th component of the background region's HSV feature vector, and $LBP_{background}^{\,i}$ the i-th component of the background region's LBP feature vector.

The index D reflects how strongly the given feature distinguishes the target from the background.

The weights of HSV and LBP in the fused feature are calculated as follows:

$$\gamma_{LBP}=\frac{D_{LBP}}{D_{HSV}+D_{LBP}}$$

$$\gamma_{HSV}=1-\gamma_{LBP}$$

By extracting the HSV and LBP features of the target in two adjacent frames, the feature similarity distances $f_{HSV}^{\,k}$ and $f_{LBP}^{\,k}$ of the target at frame k are computed:

$$f_{HSV}^{\,k}=\left\|HSV_k-HSV_{k-1}\right\|_2$$

$$f_{LBP}^{\,k}=\left\|LBP_k-LBP_{k-1}\right\|_2$$

where $HSV_k$ and $HSV_{k-1}$ denote the HSV features of the target in frames k and k-1, and $LBP_k$ and $LBP_{k-1}$ denote the LBP features of the target in frames k and k-1.

A weighted sum with weights $\gamma_{LBP}$ and $\gamma_{HSV}$ yields the fused feature distance f:

$$f=\gamma_{HSV}\,f_{HSV}^{\,k}+\gamma_{LBP}\,f_{LBP}^{\,k}$$

The fused feature f is used for the similarity judgment. When frame N is reached, first compute the average similarity distance s from the historical similarity distances, and then the corresponding threshold $th_1$:

$$s=\frac{1}{N-2}\sum_{k=2}^{N-1}f_k$$

$$th_1=\delta_1\cdot s,\qquad 1<\delta_1<2$$

where $\delta_1$ is the threshold coefficient of the average similarity distance s.

Then compute the fused-feature distance $f_N$ between frames N-1 and N, and judge whether $f_N$ exceeds $th_1$; if not, no occlusion has occurred; if so, occlusion may have occurred.

Denote the occlusion indicator by $\varepsilon_1$:

$$\varepsilon_1=\begin{cases}0, & f_N\le th_1\\[2pt] 1, & f_N>th_1\end{cases}$$
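A minimal sketch of criterion (1), assuming the HSV and LBP features arrive as plain histogram vectors and reading the normalization as l2 normalization; the helper names and the illustrative threshold values are ours:

```python
import numpy as np

def normalize(x):
    """l2-normalize a feature vector (one plausible reading of x*)."""
    n = np.linalg.norm(x)
    return x / n if n > 0 else x

def fused_distance(hsv_t, hsv_b, lbp_t, lbp_b, hsv_prev, hsv_cur, lbp_prev, lbp_cur):
    """Fused-feature distance f between two adjacent frames."""
    d_hsv = np.linalg.norm(normalize(hsv_t) - normalize(hsv_b))   # D_HSV
    d_lbp = np.linalg.norm(normalize(lbp_t) - normalize(lbp_b))   # D_LBP
    gamma_lbp = d_lbp / (d_hsv + d_lbp)   # more discriminative feature, larger weight
    gamma_hsv = 1.0 - gamma_lbp
    f_hsv = np.linalg.norm(normalize(hsv_cur) - normalize(hsv_prev))
    f_lbp = np.linalg.norm(normalize(lbp_cur) - normalize(lbp_prev))
    return gamma_hsv * f_hsv + gamma_lbp * f_lbp

rng = np.random.default_rng(1)
h = rng.random(16)
# identical target features in adjacent frames: f == 0, so epsilon_1 = 0
f_n = fused_distance(h, rng.random(16), h, rng.random(16), h, h, h, h)
th1 = 1.5 * 0.2          # th1 = delta_1 * s with illustrative s = 0.2, delta_1 = 1.5
eps1 = int(f_n > th1)
print(eps1)  # 0
```

A sudden occluder changes the target's colour and texture histograms, pushing f above the threshold and flipping the indicator to 1.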

(2) Tracking-quality judgment based on the maximum response value

The maximum response value $F_{max}$ is used to judge tracking quality. When frame N is reached, first collect the historical maximum response values of the previous N-1 frames, then compute the average maximum response value m and the threshold $th_2$:

$$m=\frac{1}{N-1}\sum_{i=1}^{N-1}F_{max}^{\,i}$$

$$th_2=\delta_2\cdot m,\qquad 0<\delta_2<1$$

where $F_{max}^{\,i}$ denotes the maximum response value of the i-th frame, and $\delta_2$ is the threshold coefficient of the average maximum response value m.

Then judge whether the maximum response value $F_{max}^{\,N}$ of frame N exceeds $th_2$; if so, no occlusion has occurred; if not, occlusion may have occurred.

Denote the occlusion indicator by $\varepsilon_2$:

$$\varepsilon_2=\begin{cases}0, & F_{max}^{\,N}>th_2\\[2pt] 1, & F_{max}^{\,N}\le th_2\end{cases}$$

(3) Evaluation mechanism based on the average correlation peak ratio

The average correlation peak ratio (APCE) is used to evaluate tracking confidence: on the basis of the response matrix, APCE evaluates the tracking quality of the kernel correlation filter tracking method. APCE is computed as:

$$APCE=\frac{\left|F_{max}-F_{min}\right|^2}{\operatorname{mean}\left(\sum_{w,h}\left(F_{w,h}-F_{min}\right)^2\right)}$$

where $F_{max}$ and $F_{min}$ denote the maximum and minimum response values of the current frame, and $F_{w,h}$ denotes the response at position (w, h).
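The APCE formula can be computed directly on a response map; a sharp single peak scores much higher than a weak, noisy one, which is why a drop below the threshold signals possible occlusion. A sketch:

```python
import numpy as np

def apce(response):
    """Average correlation peak ratio of a 2-D response map."""
    f_max, f_min = response.max(), response.min()
    return abs(f_max - f_min) ** 2 / np.mean((response - f_min) ** 2)

sharp = np.zeros((5, 5))
sharp[2, 2] = 1.0          # confident single peak
murky = np.full((5, 5), 0.5)
murky[2, 2] = 0.6          # weak peak
murky[1, 1] = 0.58         # secondary bump
print(apce(sharp) > apce(murky))  # True
```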

Compute the average $\overline{APCE}$ of the previous N-1 frames and the threshold $th_3$, used to evaluate the tracking quality of the current frame (frame N):

$$\overline{APCE}=\frac{1}{N-1}\sum_{i=1}^{N-1}APCE_i$$

$$th_3=\delta_3\cdot\overline{APCE},\qquad 0<\delta_3<1$$

where $\delta_3$ is the threshold coefficient of the average correlation peak ratio $\overline{APCE}$.

Judge whether $APCE_N$ exceeds $th_3$; if so, no occlusion has occurred; if not, occlusion may have occurred.

Denote the occlusion indicator by $\varepsilon_3$:

$$\varepsilon_3=\begin{cases}0, & APCE_N>th_3\\[2pt] 1, & APCE_N\le th_3\end{cases}$$

The joint index ∈ is constructed as:

$$\in\,=\varepsilon_1+\varepsilon_2+\varepsilon_3$$

When ∈ = 0 or 1, the target is unoccluded or only slightly occluded, and the result of the KCF algorithm is used as the tracking result;

When ∈ = 2 or 3, the target is severely occluded, and the result of the Kalman filtering algorithm is used as the tracking result.
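The joint decision then reduces to summing the three indicators; a sketch:

```python
def choose_source(eps1, eps2, eps3):
    """Return which tracker's result to trust, per the joint index rule."""
    joint = eps1 + eps2 + eps3
    return "KCF" if joint <= 1 else "Kalman"   # 0/1: at most slight occlusion

print(choose_source(0, 1, 0))  # KCF   (a single indicator firing is tolerated)
print(choose_source(1, 1, 0))  # Kalman (two indicators firing: severe occlusion)
```

Requiring at least two of the three criteria to fire makes the switch robust to a single noisy indicator.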

As a preferred solution, step 3 comprises:

If the result of the KCF algorithm is used as the tracking result, the KCF filter is updated, and the Kalman filter is also updated using the KCF result as the Kalman measurement;

If the result of the Kalman filtering algorithm is used as the tracking result, the Kalman filter is updated and the KCF filter stops updating.
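The patent does not fix the Kalman motion model; a constant-velocity model over (x, y, vx, vy) is a common choice for this role and is sketched below with illustrative noise settings. During normal tracking each KCF position feeds the correction step; under severe occlusion only the prediction step runs.

```python
import numpy as np

dt = 1.0
F = np.array([[1, 0, dt, 0],
              [0, 1, 0, dt],
              [0, 0, 1, 0],
              [0, 0, 0, 1]], float)   # constant-velocity transition
H = np.array([[1, 0, 0, 0],
              [0, 1, 0, 0]], float)   # only position is measured
Q = np.eye(4) * 1e-2                  # process noise (illustrative)
R = np.eye(2) * 1e-1                  # measurement noise (illustrative)

def predict(x, P):
    return F @ x, F @ P @ F.T + Q

def correct(x, P, z):
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)    # Kalman gain
    return x + K @ (z - H @ x), (np.eye(4) - K @ H) @ P

x, P = np.zeros(4), np.eye(4)
for z in ([1.0, 1.0], [2.0, 2.0]):    # KCF results used as measurements
    x, P = predict(x, P)
    x, P = correct(x, P, np.array(z))
x, P = predict(x, P)                  # severe occlusion: prediction only
print(np.round(x[:2], 2))             # coasts forward along the learned velocity
```

Because the filter has learned a velocity from the pre-occlusion measurements, its prediction keeps moving the target estimate forward while the KCF result is untrustworthy.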

As a preferred solution, the update of the KCF filter comprises:

$$\alpha_t=\theta\left((1-\rho)\,\alpha_{t-1}+\rho\,\alpha_t\right)+(1-\theta)\,\alpha_{t-1}$$

$$x_t=\theta\left((1-\rho)\,x_{t-1}+\rho\,x_t\right)+(1-\theta)\,x_{t-1}$$

$$\theta=\begin{cases}1, & \in\,\le 1\\[2pt] 0, & \in\,\ge 2\end{cases}$$

where ρ is the update coefficient of the KCF filter parameters; $\alpha_{t-1}$ and $\alpha_t$ are the KCF filter coefficients at frames t-1 and t; $x_{t-1}$ and $x_t$ are the parameters of the target model used by the KCF filter at frames t-1 and t; and θ switches the update off when severe occlusion is detected.
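A sketch of the adaptive update above, reading θ as a 0/1 gate driven by the occlusion judgment (our assumption): ρ blends old and new parameters during normal tracking, while the gate freezes the model under severe occlusion so occluder features cannot pollute it.

```python
import numpy as np

def adaptive_update(old, new, rho=0.02, occluded=False):
    """theta-gated linear interpolation of filter coefficients alpha
    (or, identically, of the target-model parameters x)."""
    theta = 0.0 if occluded else 1.0   # assumption: theta = 1 normal, 0 occluded
    return theta * ((1 - rho) * old + rho * new) + (1 - theta) * old

alpha_prev, alpha_new = np.zeros(3), np.ones(3)
print(adaptive_update(alpha_prev, alpha_new))                 # [0.02 0.02 0.02]
print(adaptive_update(alpha_prev, alpha_new, occluded=True))  # [0. 0. 0.] frozen
```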

The present invention also provides a kernel correlation filter tracking system for severely occluded scenes, comprising:

an initialization module, configured to initialize the KCF algorithm of the KCF filter and the Kalman filtering algorithm of the Kalman filter, taking the target position in the first frame as the initial position for both algorithms;

a judgment module, configured to use the occlusion detection mechanism to judge whether the target is severely occluded in the current image; if not, the tracking result of the KCF algorithm is adopted as the position of the target object in the current image; if so, the tracking result of the Kalman filtering algorithm is adopted instead;

an update module, configured to adaptively update the Kalman filter and the KCF filter according to their respective tracking results;

an execution module, configured to read the next video frame and hand it back to the judgment module, until the tracking process ends.

As a preferred solution, the judgment module uses the joint index occlusion detection mechanism to process the image, so as to judge whether the target in the current image is severely occluded.

As a preferred solution, the joint index occlusion detection mechanism used by the judgment module is the same as that described above for the method: the occlusion indicators ε1 (fused-feature similarity), ε2 (maximum response value), and ε3 (average correlation peak ratio) are combined into the joint index ∈, with the KCF result used when ∈ = 0 or 1 and the Kalman result used when ∈ = 2 or 3.

As a preferred solution, the update process of the update module is likewise the same as that described above for step 3 of the method: if the KCF result is used as the tracking result, the KCF filter is updated and the Kalman filter is updated with the KCF result as its measurement; if the Kalman result is used, the Kalman filter is updated and the KCF filter stops updating, following the adaptive KCF update formulas given above.

Compared with the prior art, the present invention has the following beneficial effects:

(1) To address the lack of an occlusion detection mechanism in the kernel correlation filtering algorithm, a joint index occlusion detection mechanism is proposed, making it possible to distinguish normal scenes from occluded scenes;

(2) An adaptive model update strategy under occlusion is designed, so that the feature model can be updated online without being severely polluted by the occluder;

(3) A Kalman target tracking mechanism under occlusion is added, so that the target can still be tracked even when severely occluded.

附图说明Description of drawings

图1为本发明实施例的面向严重遮挡场景的核相关滤波目标跟踪方法的流程图;FIG. 1 is a flowchart of a kernel correlation filtering target tracking method for severely occluded scenes according to an embodiment of the present invention;

图2为本发明实施例的连续帧中相似性距离fN和阈值th1的关系;Fig. 2 is the relationship between similarity distance fN and threshold th 1 in consecutive frames according to an embodiment of the present invention;

图3为本发明实施例的连续帧中最大响应值Fmax和阈值th2的关系;Fig. 3 is the relationship between the maximum response value Fmax and the threshold th 2 in consecutive frames according to an embodiment of the present invention;

图4为本发明实施例的连续帧中平均相关峰值比APCE和阈值th3的关系;Fig. 4 is the relationship between the average correlation peak ratio APCE and the threshold th3 in consecutive frames according to an embodiment of the present invention;

图5为本发明实施例的连续帧中遮挡前后联合遮挡检测机制各参数变化;Fig. 5 is the change of each parameter of the joint occlusion detection mechanism before and after occlusion in consecutive frames according to an embodiment of the present invention;

图6为本发明实施例的面向严重遮挡场景的核相关滤波目标跟踪系统的构架图。FIG. 6 is a structural diagram of a kernel correlation filtering target tracking system for severely occluded scenes according to an embodiment of the present invention.

具体实施方式Detailed ways

为了更清楚地说明本发明实施例,下面将对照附图说明本发明的具体实施方式。显而易见地,下面描述中的附图仅仅是本发明的一些实施例,对于本领域普通技术人员来讲,在不付出创造性劳动的前提下,还可以根据这些附图获得其他的附图,并获得其他的实施方式。In order to describe the embodiments of the present invention more clearly, the following will describe specific embodiments of the present invention with reference to the accompanying drawings. Obviously, the accompanying drawings in the following description are only some embodiments of the present invention. For those of ordinary skill in the art, other drawings can also be obtained from these drawings without creative efforts, and obtain other implementations.

本发明实施例的面向严重遮挡场景的核相关滤波目标跟踪方法主要对现有框架进行了以下三个方面的优化:The kernel correlation filtering target tracking method for severely occluded scenes in the embodiment of the present invention mainly optimizes the existing framework in the following three aspects:

一、针对遮挡情况的检测机制缺失问题,提出了联合指标遮挡检测机制;1. In view of the lack of detection mechanism for occlusion, a joint index occlusion detection mechanism is proposed;

二、设计了遮挡情况下模型的自适应更新策略;2. Design the adaptive update strategy of the model in the case of occlusion;

三、增加了遮挡情况下Kalman目标跟踪处理机制。3. Added the Kalman target tracking processing mechanism in the case of occlusion.

具体地,如图1所示,本发明实施例的面向严重遮挡场景的核相关滤波目标跟踪方法,包括以下步骤:Specifically, as shown in FIG. 1 , the kernel correlation filtering target tracking method for severely occluded scenes according to the embodiment of the present invention includes the following steps:

步骤1:初始化KCF目标跟踪算法和Kalman算法,以第一帧的目标位置作为KCF和Kalman两个算法的初始位置。Step 1: Initialize the KCF target tracking algorithm and the Kalman algorithm, and use the target position of the first frame as the initial position of the KCF and Kalman algorithms.

Kalman算法使用上一帧目标框的位置和目标移动速度,对下一帧目标的位置进行预测跟踪处理。在实时视频的目标跟踪任务中,每两帧的时间间隔比较小,可以认为目标在相邻图像间运动变化缓慢,目标运动可近似为匀速直线运动,下一帧中目标的位置和移动速度可由动力学公式求得:The Kalman algorithm uses the position of the target box and the target's velocity in the previous frame to predict and track the position of the target in the next frame. In real-time video target tracking, the time interval between two frames is small, so the target can be assumed to move slowly between adjacent images and its motion can be approximated as uniform linear motion; the position and velocity of the target in the next frame are then given by the kinematic equations:

xk=xk-1+vk-1Δtx k =x k-1 +v k-1 Δt

vk=vk-1 v k = v k-1

其中,Δt为两帧时间间隔,xk-1表示目标在第k-1帧中的位置,vk-1表示目标在第k-1帧中的速度。因系统为线性动态模型,以二维情况为例,系统转移状态方程如下:Among them, Δt is the time interval of two frames, x k-1 represents the position of the target in the k-1th frame, and vk-1 represents the speed of the target in the k-1th frame. Because the system is a linear dynamic model, taking the two-dimensional case as an example, the system transition state equation is as follows:

X(k)=AX(k-1)+G(k)X(k)=AX(k-1)+G(k)

$$A=\begin{bmatrix}1&0&\Delta t&0\\0&1&0&\Delta t\\0&0&1&0\\0&0&0&1\end{bmatrix}$$

其中,A表示物体的运动转移矩阵,X(k)=(xk,yk,vx,vy)T,xk表示目标在第k帧中x轴的位置,yk表示目标在第k帧中y轴的位置,vx表示目标在x轴的运动速度,vy表示目标在y轴的运动速度,Δt表示视频两次检测的周期;G(k)表示速率变动,在匀速直线运动中可视为运动过程中的高斯白噪声,其各分量均服从N(0,1)分布。Here, A is the motion transition matrix of the object; X(k)=(x_k, y_k, v_x, v_y)^T, where x_k and y_k are the x- and y-axis positions of the target in frame k, v_x and v_y are its velocities along the x- and y-axes, and Δt is the period between two consecutive detections; G(k) denotes the velocity perturbation, which under the uniform linear-motion assumption can be treated as Gaussian white noise, with each of its components following the N(0,1) distribution.

系统测量值计算公式为:The system measurement value calculation formula is:

Y(k)=HX(k)+C(k)Y(k)=HX(k)+C(k)

式中,H表示测量系统的观测矩阵,C(k)表示测量过程的噪声。In the formula, H represents the observation matrix of the measurement system, and C(k) represents the noise of the measurement process.

目标位置和与之相对应的误差协方差矩阵P可使用卡尔曼滤波器预测得出:The target position and its corresponding error covariance matrix P can be predicted using the Kalman filter:

$$\hat{X}(k|k-1)=A\hat{X}(k-1|k-1),\qquad P(k|k-1)=AP(k-1|k-1)A^{T}+Q$$

式中,Q表示运动噪声的协方差。结合预测值和测量值,可以得到现在状态k的最优化估算值为:where Q is the covariance of motion noise. Combining the predicted value and the measured value, the optimal estimate of the current state k can be obtained as:

$$\hat{X}(k|k)=\hat{X}(k|k-1)+K_{k}\left(Y(k)-H\hat{X}(k|k-1)\right)$$

并更新k状态下X(k|k)的误差协方差:And update the error covariance of X(k|k) in k states:

P(k|k)=P(k|k-1)-Kk HP(k|k-1)P(k|k)=P(k|k-1)-K k HP(k|k-1)

其中,Kk为卡尔曼增益,其计算公式如下:Among them, K k is the Kalman gain, and its calculation formula is as follows:

Kk=P(k|k-1)HT/(HP(k|k-1)HT+Rk)K k =P(k|k-1)H T /(HP(k|k-1)H T +R k )

其中,Rk是指测量噪声的协方差,测量协方差的值较低时意味着在当前的测量值上具有更大的加权,即认为测量值的可信度更高。Among them, R k refers to the covariance of measurement noise, and when the value of measurement covariance is lower, it means that there is a greater weight on the current measurement value, that is, the reliability of the measurement value is considered to be higher.

Kalman滤波算法的整体流程简述如下:The overall process of the Kalman filtering algorithm is briefly described as follows:

(1)确定系统的状态转移矩阵A与测量矩阵H。(1) Determine the state transition matrix A and measurement matrix H of the system.

(2)初始化协方差矩阵初值P(0|0)与状态量初值X(0)。(2) Initialize the initial value of the covariance matrix P(0|0) and the initial value of the state quantity X(0).

(3)根据状态转移矩阵A、状态递推方程和协方差矩阵,确定下一时刻的状态量$\hat{X}(k|k-1)$和协方差矩阵P(k|k-1),并更新卡尔曼增益Kk。(3) According to the state transition matrix A, the state recursion equation, and the covariance matrix, determine the next-step state $\hat{X}(k|k-1)$ and covariance matrix P(k|k-1), and update the Kalman gain K_k.

(4)根据测量向量Y(k)、卡尔曼增益Kk以及状态量$\hat{X}(k|k-1)$修正状态量,得到该更新周期的状态估计修正值$\hat{X}(k|k)$,即X(k)。(4) According to the measurement vector Y(k), the Kalman gain K_k, and the state $\hat{X}(k|k-1)$, correct the state to obtain the corrected state estimate $\hat{X}(k|k)$ of this update period, i.e. X(k).

(5)根据状态估计修正值X(k),测量值计算公式,更新协方差矩阵,得到协方差阵修正值P(k|k)。(5) According to the state estimation correction value X(k) and the measurement value calculation formula, update the covariance matrix to obtain the covariance matrix correction value P(k|k).

(6)回到上述步骤(3),重复上述步骤,直到追踪过程结束。(6) Go back to the above step (3) and repeat the above steps until the tracking process ends.
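上述流程(1)–(6)可以用如下Python草图示意(仅为示例草图,非本专利的实现;假设Δt=1,Q、R取示例值,类名与噪声参数均为本文假设)。The six steps above can be sketched as a minimal constant-velocity Kalman tracker; this is an illustrative sketch only, with Δt = 1 and assumed example values for the noise covariances Q and R — the class name and parameter values are not from the patent.

```python
import numpy as np

# Minimal constant-velocity (CV) Kalman tracker sketch for 2-D positions.
# Matrix names follow the text: A (transition), H (observation), P, Q, R.
class KalmanCV2D:
    def __init__(self, x0, y0, dt=1.0):
        self.X = np.array([x0, y0, 0.0, 0.0])          # state (x, y, vx, vy)
        self.A = np.array([[1, 0, dt, 0],
                           [0, 1, 0, dt],
                           [0, 0, 1, 0],
                           [0, 0, 0, 1]], dtype=float)  # CV transition matrix
        self.H = np.array([[1, 0, 0, 0],
                           [0, 1, 0, 0]], dtype=float)  # observe position only
        self.P = np.eye(4)                              # error covariance
        self.Q = np.eye(4) * 1e-2                       # process noise (assumed)
        self.R = np.eye(2) * 1e-1                       # measurement noise (assumed)

    def predict(self):
        # X(k|k-1) = A X(k-1|k-1);  P(k|k-1) = A P A^T + Q
        self.X = self.A @ self.X
        self.P = self.A @ self.P @ self.A.T + self.Q
        return self.X[:2]

    def correct(self, zx, zy):
        # K = P H^T (H P H^T + R)^-1; then correct state and covariance
        Y = np.array([zx, zy])
        K = self.P @ self.H.T @ np.linalg.inv(self.H @ self.P @ self.H.T + self.R)
        self.X = self.X + K @ (Y - self.H @ self.X)
        self.P = self.P - K @ self.H @ self.P
        return self.X[:2]
```

每帧先调用predict()得到先验,再用测量值调用correct();遮挡时可只采用predict()的输出作为跟踪结果。Each frame calls predict() for the prior and correct() with a measurement; under occlusion the predict() output alone can serve as the track.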

步骤2:使用联合指标遮挡检测机制处理图像,判断当前图像中目标是否发生严重遮挡情况。如果不存在目标遮挡,则采用KCF算法的跟踪结果作为目标物体在当前帧中的位置;如果目标被严重遮挡,采用Kalman算法的预测结果作为目标在当前帧中的位置。Step 2: Use the joint index occlusion detection mechanism to process the image to determine whether the target in the current image is severely occluded. If there is no target occlusion, the tracking result of the KCF algorithm is used as the position of the target object in the current frame; if the target is severely occluded, the prediction result of the Kalman algorithm is used as the position of the target in the current frame.

本发明实施例的联合指标遮挡检测机制详细描述如下:The joint index occlusion detection mechanism according to the embodiment of the present invention is described in detail as follows:

1)基于融合特征的相似性判断1) Similarity judgment based on fusion features

特征融合指标f使用HSV和LBP特征进行构建,具体方式如下:在KCF算法的检测区域基础上,分别提取目标和背景区域的HSV和LBP特征,并对提取的(HSVtarget,HSVbackground)和(LBPtarget,LBPbackground)使用下述公式进行归一化处理,记待处理的特征向量为x=(x1,x2,...,xn-1,xn)T,归一化后的向量为x*The feature fusion index f is constructed using HSV and LBP features. The specific method is as follows: Based on the detection region of the KCF algorithm, the HSV and LBP features of the target and background regions are extracted respectively, and the extracted (HSV target , HSV background ) and ( LBP target , LBP background ) are normalized using the following formula, and the feature vector to be processed is x=(x 1 , x 2 ,..., x n-1 , x n ) T , after normalization A vector of x * :

[公式图像:特征向量归一化公式 / equation images: feature-vector normalization formulas]

在得到归一化的特征向量后,分别使用下述公式分别计算HSVtarget和HSVbackground的欧氏距离dHSV,LBPtarget和LBPbackground的欧氏距离dLBPAfter obtaining the normalized eigenvectors, use the following formulas to calculate the Euclidean distance d HSV of HSV target and HSV background respectively, and the Euclidean distance d LBP of LBP target and LBP background :

$$d_{HSV}=\sqrt{\sum_{i=1}^{n}\left(HSV_{target}^{(i)}-HSV_{background}^{(i)}\right)^{2}}$$

$$d_{LBP}=\sqrt{\sum_{i=1}^{n}\left(LBP_{target}^{(i)}-LBP_{background}^{(i)}\right)^{2}}$$

式中,$LBP_{target}^{(i)}$表示目标的LBP特征向量的第i个分量值,$LBP_{background}^{(i)}$表示背景区域的LBP特征向量的第i个分量值;同样地,$HSV_{target}^{(i)}$和$HSV_{background}^{(i)}$分别表示目标和背景区域的HSV特征向量的第i个分量值。Here, $LBP_{target}^{(i)}$ denotes the i-th component of the target's LBP feature vector and $LBP_{background}^{(i)}$ that of the background region; likewise, $HSV_{target}^{(i)}$ and $HSV_{background}^{(i)}$ denote the i-th components of the HSV feature vectors of the target and the background region.

指标D用于反应在此特征下,目标和背景之间的区分程度。目标和背景之间的区分度越高,说明使用此特征构建的目标模型越准确,所以构建的融合特征应包含此特征的更多信息。HSV和LBP在融合特征中的权值计算方式如下:The index D is used to reflect the degree of distinction between the target and the background under this feature. The higher the discrimination between the target and the background, the more accurate the target model constructed using this feature, so the constructed fusion feature should contain more information about this feature. The weights of HSV and LBP in the fusion feature are calculated as follows:

$$\gamma_{LBP}=\frac{d_{LBP}}{d_{HSV}+d_{LBP}}$$

γHSV=1-γLBP γ HSV = 1 - γ LBP

通过提取相邻两帧目标的HSV和LBP特征,可以通过如下公式计算得到第k帧目标的特征相似性$f_{k}^{HSV}$和$f_{k}^{LBP}$:By extracting the HSV and LBP features of the target in two adjacent frames, the feature similarities $f_{k}^{HSV}$ and $f_{k}^{LBP}$ of the target in frame k can be calculated by the following formulas:

$$f_{k}^{HSV}=\left\|HSV_{k}-HSV_{k-1}\right\|_{2}$$

$$f_{k}^{LBP}=\left\|LBP_{k}-LBP_{k-1}\right\|_{2}$$

式中,HSVk和HSVk-1表示第k帧和第k-1帧中目标的HSV特征,LBPk和LBPk-1表示第k帧和第k-1帧中目标的LBP特征。再通过上文求得的权重γLBP和γHSV进行加权求和,就能对两个特征进行融合,得到特征融合f,f的公式如下所示:where HSV k and HSV k-1 represent the HSV features of the target in the kth frame and the k-1th frame, and LBP k and LBP k-1 represent the LBP features of the target in the kth frame and the k-1th frame. Then through the weighted summation of the weights γ LBP and γ HSV obtained above, the two features can be fused to obtain the feature fusion f, and the formula of f is as follows:

$$f=\gamma_{HSV}f_{k}^{HSV}+\gamma_{LBP}f_{k}^{LBP}$$

接下来使用得到的融合特征f进行相似性判断:当算法执行到第N帧时,首先利用得到的历史相似性距离数据计算平均相似性距离s,并计算相应的阈值th1,计算公式如下:Next, the obtained fusion feature f is used for similarity judgment: when the algorithm executes to the Nth frame, the average similarity distance s is first calculated using the obtained historical similarity distance data, and the corresponding threshold th 1 is calculated. The calculation formula is as follows:

$$s=\frac{1}{N-1}\sum_{i=1}^{N-1}f_{i}$$

th1=δ1*s(1<δ1<2)th 11 *s (1<δ 1 <2)

其中,δ1表示平均相似性距离s的阈值系数,一般选取δ1=1.5。然后计算第N-1与N帧的目标融合特征的欧式距离fN,当fN小于th1时,则说明近时间段内并未发生遮挡,相反,则说明可能发生了遮挡。记在此特征下,是否发生遮挡的指标为ε1,当目标可能被遮挡时,ε1取值为1;若未发生遮挡,ε1取值为0,ε1可表示为:Among them, δ 1 represents the threshold coefficient of the average similarity distance s, and generally, δ 1 =1.5 is selected. Then calculate the Euclidean distance f N of the target fusion features of frames N-1 and N. When f N is less than th 1 , it means that no occlusion has occurred in the near time period. On the contrary, it means that occlusion may have occurred. Under this feature, the indicator of whether occlusion occurs is ε 1 . When the target may be occluded, ε 1 takes the value of 1; if there is no occlusion, ε 1 takes the value of 0, and ε 1 can be expressed as:

$$\varepsilon_{1}=\begin{cases}1, & f_{N}>th_{1}\\0, & f_{N}\le th_{1}\end{cases}$$

在连续帧中,相似性距离fN和阈值th1的关系如图2所示。In consecutive frames, the relationship between the similarity distance fN and the threshold th 1 is shown in Figure 2.
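上述融合特征相似性判据可以用如下Python草图示意(假设特征为一维直方图,归一化采用min–max方式作为对原文图片公式的一种合理假设;函数名与δ1默认值均为示例)。The fused-feature similarity check above can be sketched as follows; it assumes the features arrive as 1-D histograms and uses min–max normalization as one plausible reading of the image-only normalization formula — the function names and defaults are illustrative assumptions.

```python
import numpy as np

def minmax_norm(x):
    """Min-max normalize a feature vector to [0, 1] (assumed scheme)."""
    x = np.asarray(x, dtype=float)
    rng = x.max() - x.min()
    return (x - x.min()) / rng if rng > 0 else np.zeros_like(x)

def fused_distance(hsv_k, hsv_prev, lbp_k, lbp_prev, g_hsv, g_lbp):
    """f = gamma_HSV * f_HSV + gamma_LBP * f_LBP between adjacent frames."""
    f_hsv = np.linalg.norm(minmax_norm(hsv_k) - minmax_norm(hsv_prev))
    f_lbp = np.linalg.norm(minmax_norm(lbp_k) - minmax_norm(lbp_prev))
    return g_hsv * f_hsv + g_lbp * f_lbp

def occlusion_flag(f_n, history, delta1=1.5):
    """epsilon_1: 1 if f_N exceeds delta1 times the average historical distance."""
    s = sum(history) / len(history)      # average similarity distance
    return 1 if f_n > delta1 * s else 0
```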

2)基于最大响应值的跟踪效果判断2) Judgment of tracking effect based on maximum response value

使用最大响应值Fmax进行跟踪效果判断:当运行到第N帧时,首先统计前N-1帧的历史最大响应值信息,计算平均最大响应响应值m和阈值th2,m和th2计算公式如下:Use the maximum response value F max to judge the tracking effect: when running to the Nth frame, first count the historical maximum response value information of the previous N-1 frames, calculate the average maximum response value m and the threshold th 2 , and calculate m and th 2 The formula is as follows:

$$m=\frac{1}{N-1}\sum_{i=1}^{N-1}F_{max}^{i}$$

th2=δ2*m(0<δ2<1)th 22 *m (0<δ 2 <1)

其中,$F_{max}^{i}$表示第i帧的最大响应值,δ2表示平均最大响应值m的阈值系数,一般选取δ2=0.6。然后比较第N帧的最大响应值$F_{max}^{N}$和th2的大小关系:如果$F_{max}^{N}$大于th2,则说明当前并未发生遮挡;如果$F_{max}^{N}$小于th2,则说明近段时间内可能发生了遮挡。记在此特征下是否发生遮挡的指标为ε2:当目标可能被遮挡时,ε2取值为1;若未发生遮挡,ε2取值为0。ε2可表示为:Here, $F_{max}^{i}$ is the maximum response value of frame i, and δ2 is the threshold coefficient of the average maximum response m, typically δ2 = 0.6. The maximum response of frame N, $F_{max}^{N}$, is then compared against th2: if it is greater than th2, no occlusion is currently occurring; if it is smaller than th2, occlusion may have occurred recently. The occlusion indicator under this feature is denoted ε2, taking the value 1 when the target may be occluded and 0 otherwise; ε2 can be expressed as:

$$\varepsilon_{2}=\begin{cases}1, & F_{max}^{N}<th_{2}\\0, & F_{max}^{N}\ge th_{2}\end{cases}$$

连续帧中,最大响应值Fmax和阈值th2的关系如图3所示。In consecutive frames, the relationship between the maximum response value F max and the threshold value th 2 is shown in FIG. 3 .
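作为示意,最大响应值判据ε2可写成如下草图(函数名与默认δ2=0.6取自上文,其余均为示例)。As a sketch, the maximum-response criterion ε2 can be written as below; the default δ2 = 0.6 mirrors the text, the function name is illustrative.

```python
def response_flag(f_max_n, history, delta2=0.6):
    """epsilon_2: 1 if the current peak falls to or below delta2 * average peak."""
    m = sum(history) / len(history)   # average of historical max responses
    return 0 if f_max_n > delta2 * m else 1
```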

3)基于平均相关峰值比的评估机制3) Evaluation mechanism based on average correlation peak ratio

使用平均相关峰值比APCE进行跟踪可信度评估:在响应矩阵基础上,Wang等人提出了平均相关峰值比(APCE),用于评估核相关滤波跟踪算法的跟踪效果,APCE计算公式如下,其中Fmax表示当前帧的最大响应值,Fmin表示当前帧的最小响应值。Use the average correlation peak ratio APCE to evaluate the tracking reliability: On the basis of the response matrix, Wang et al. proposed the average correlation peak ratio (APCE) to evaluate the tracking effect of the nuclear correlation filter tracking algorithm. The APCE calculation formula is as follows, where F max represents the maximum response value of the current frame, and F min represents the minimum response value of the current frame.

$$APCE=\frac{\left|F_{max}-F_{min}\right|^{2}}{\mathrm{mean}\left(\sum_{w,h}\left(F_{w,h}-F_{min}\right)^{2}\right)}$$

计算前N-1帧的平均相关峰值比$\overline{APCE}$和阈值th3,用于评估当前帧(第N帧)的跟踪效果,计算公式如下:The average $\overline{APCE}$ of the first N-1 frames and the threshold th3 are computed to evaluate the tracking effect of the current frame (frame N), as follows:

$$\overline{APCE}=\frac{1}{N-1}\sum_{i=1}^{N-1}APCE_{i}$$

$$th_{3}=\delta_{3}*\overline{APCE}\quad(0<\delta_{3}<1)$$

其中,δ3表示平均相关峰值比$\overline{APCE}$的阈值系数,一般选取δ3=0.5。记在此特征下是否发生遮挡的指标为ε3:当目标可能被遮挡时,ε3取值为1;若未发生遮挡,ε3取值为0。ε3可表示为:Here, δ3 is the threshold coefficient of the average correlation peak ratio $\overline{APCE}$, typically δ3 = 0.5. The occlusion indicator under this feature is denoted ε3, taking the value 1 when the target may be occluded and 0 otherwise; ε3 can be expressed as:

$$\varepsilon_{3}=\begin{cases}1, & APCE_{N}<th_{3}\\0, & APCE_{N}\ge th_{3}\end{cases}$$

连续帧中,平均相关峰值比APCE和阈值th3的关系如图4所示。In consecutive frames, the relationship between the average correlation peak ratio APCE and the threshold th3 is shown in Fig. 4.
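APCE判据同样可以草图化如下(响应图用二维数组表示;函数名与默认δ3=0.5取自上文,其余均为示例)。The APCE criterion can likewise be sketched as below, with the response map as a 2-D array; the default δ3 = 0.5 mirrors the text, the names are illustrative.

```python
import numpy as np

def apce(response):
    """Average peak-to-correlation energy: |Fmax-Fmin|^2 over the mean
    squared deviation of the whole response map from Fmin."""
    r = np.asarray(response, dtype=float)
    f_max, f_min = r.max(), r.min()
    return (f_max - f_min) ** 2 / np.mean((r - f_min) ** 2)

def apce_flag(apce_n, history, delta3=0.5):
    """epsilon_3: 1 if the current APCE falls to or below delta3 * average APCE."""
    avg = sum(history) / len(history)
    return 0 if apce_n > delta3 * avg else 1
```

单峰尖锐的响应图APCE高(跟踪可信),多峰或平坦的响应图APCE低。A single sharp peak yields a high APCE (confident tracking); multi-peak or flat maps yield a low APCE.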

最后结合上述三种指标,来构造联合指标∈,实现目标遮挡的检测,其公式如下所示:Finally, the above three indicators are combined to construct a joint indicator ∈ to realize the detection of target occlusion. The formula is as follows:

∈=ε1+ε2+ε3

其中ε1表示基于融合特征f反应的跟踪情况,其中ε2表示基于最大响应值Fmax反应的跟踪情况,其中ε3表示基于平均相关峰值比APCE反应的跟踪情况。当∈=0 or 1时,表示目标未被遮挡或者轻微遮挡,不会对KCF滤波器造成太大的污染,滤波器进行正常更新即可,并使用KCF算法的结果作为跟踪结果。当∈=2 or 3时,表示目标被严重遮挡,需要执行其他操作,并使用Kalman算法的结果作为跟踪结果。在连续视频帧中,目标遮挡前后,上述f、Fmax和APCE变化如图5所示。where ε 1 represents the tracking situation based on the fusion feature f response, where ε 2 represents the tracking situation based on the maximum response value F max response, and ε 3 represents the tracking situation based on the average correlation peak ratio APCE response. When ∈=0 or 1, it means that the target is not occluded or slightly occluded, which will not cause too much pollution to the KCF filter. The filter can be updated normally, and the result of the KCF algorithm is used as the tracking result. When ∈=2 or 3, it means that the target is severely occluded, other operations need to be performed, and the result of the Kalman algorithm is used as the tracking result. In consecutive video frames, before and after target occlusion, the above f, Fmax and APCE changes are shown in Figure 5.
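三个指标的联合与跟踪器选择逻辑可示意如下(函数名为示例假设)。Combining the three flags and selecting the tracker can be sketched as:

```python
def joint_indicator(eps1, eps2, eps3):
    # epsilon = eps1 + eps2 + eps3, taking values in {0, 1, 2, 3}
    return eps1 + eps2 + eps3

def choose_tracker(eps):
    # 0 or 1: no/slight occlusion -> trust the KCF result;
    # 2 or 3: severe occlusion   -> fall back to the Kalman prediction
    return "KCF" if eps <= 1 else "Kalman"
```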

步骤3:根据遮挡检测机制的结果,使用步骤2的结果对Kalman滤波器和KCF滤波器进行自适应更新。如果不存在目标遮挡,则使用KCF算法的结果作为Kalman测量值(即将测量向量Y(k)设置为KCF的结果),对Kalman滤波器进行更新,并同时进行KCF滤波器的常规更新;如果出现目标被遮挡的情况,则停止KCF滤波器的更新,并将Kalman滤波器的预测值作为Kalman测量值(即将测量向量Y(k)设置为与先验状态$\hat{X}(k|k-1)$一致),对Kalman滤波器进行更新。Step 3: According to the result of the occlusion detection mechanism, the Kalman filter and the KCF filter are adaptively updated using the result of Step 2. If there is no target occlusion, the result of the KCF algorithm is used as the Kalman measurement (i.e. the measurement vector Y(k) is set to the KCF result), the Kalman filter is updated, and the regular update of the KCF filter is performed at the same time; if the target is occluded, the update of the KCF filter is stopped, and the prediction of the Kalman filter is used as the Kalman measurement (i.e. the measurement vector Y(k) is set equal to the prior state $\hat{X}(k|k-1)$) to update the Kalman filter.

对KCF滤波器而言,自适应更新策略是在原策略基础上,加入了停止机制。θ是判定是否进行更新的指标,∈为遮挡检测机制的结果,其判定过程为:For the KCF filter, the adaptive update strategy is based on the original strategy, adding a stopping mechanism. θ is the index to determine whether to update, ∈ is the result of the occlusion detection mechanism, and the determination process is as follows:

$$\theta=\begin{cases}1, & \in=0\ \text{或}\ 1\\0, & \in=2\ \text{或}\ 3\end{cases}$$

当没有遮挡或者发生轻微遮挡(θ=1),此时的KCF滤波器更新策略和原策略一致。当发生严重遮挡(θ=0),此时KCF滤波器应停止更新,避免受到污染。对应到KCF滤波器中,其具体参数的更新公式如下所示:When there is no occlusion or slight occlusion (θ=1), the KCF filter update strategy at this time is the same as the original strategy. When severe occlusion occurs (θ=0), the KCF filter should stop updating to avoid contamination. Corresponding to the KCF filter, the update formula of its specific parameters is as follows:

αt=θ((1-ρ)αt-1+ραt)+(1-θ)αt-1 α t =θ((1-ρ)α t-1 +ρα t )+(1-θ)α t-1

xt=θ((1-ρ)xt-1+ρxt)+(1-θ)xt-1 x t =θ((1-ρ)x t-1 +ρx t )+(1-θ)x t-1

其中,ρ表示KCF滤波器参数的更新系数,一般为0.8。αt-1和αt表示第t-1帧和第t帧的KCF滤波器系数,xt-1和xt表示第t-1帧和第t帧的KCF滤波器选用的目标模型的参数,可参考现有技术,在此不赘述。Here, ρ is the update coefficient of the KCF filter parameters, generally 0.8. α_{t-1} and α_t are the KCF filter coefficients of frames t-1 and t, and x_{t-1} and x_t are the target-model parameters used by the KCF filter in frames t-1 and t; these follow the prior art and are not detailed here.
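带停止机制的自适应更新可在标量上示意如下(实际的α、x为向量/矩阵;ρ=0.8取自上文,函数名为示例假设)。The adaptive update with the stop switch can be sketched on scalars (the real α and x are vectors/matrices; ρ = 0.8 follows the text, the function name is assumed):

```python
def adaptive_update(old, new, eps, rho=0.8):
    """theta = 1: usual linear interpolation toward the new model;
    theta = 0 (severe occlusion): freeze the model to avoid contamination."""
    theta = 1 if eps <= 1 else 0
    return theta * ((1 - rho) * old + rho * new) + (1 - theta) * old
```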

在完成自适应更新后,方法将读取下一个视频帧,回到步骤2。After the adaptive update is complete, the method will read the next video frame and go back to step 2.

直到所有视频帧都读取完,方法结束。The method ends until all video frames have been read.

基于上述本发明实施例的核相关滤波跟踪方法,如图6所示,本发明实施例还提供面向严重遮挡场景的核相关滤波跟踪系统,包括初始化模块、判断模块、更新模块和执行模块。Based on the above kernel correlation filtering and tracking method according to the embodiment of the present invention, as shown in FIG. 6 , an embodiment of the present invention also provides a kernel correlation filtering and tracking system for severely occluded scenes, including an initialization module, a judgment module, an update module and an execution module.

具体地,初始化模块用于初始化KCF滤波器的KCF算法和Kalman滤波器的Kalman滤波算法,以第一帧的目标位置作为KCF算法和Kalman滤波算法的初始位置;Specifically, the initialization module is used to initialize the KCF algorithm of the KCF filter and the Kalman filter algorithm of the Kalman filter, with the target position of the first frame as the initial position of the KCF algorithm and the Kalman filter algorithm;

判断模块用于使用遮挡检测机制判断当前图像中是否发生目标严重遮挡;若否,则采用KCF算法的跟踪结果作为目标物体在当前图像中的位置;若是,则采用Kalman滤波算法的跟踪结果作为目标物体在当前图像中的位置;The judgment module is used to use the occlusion detection mechanism to judge whether the target is severely occluded in the current image; if not, the tracking result of the KCF algorithm is used as the position of the target object in the current image; if so, the tracking result of the Kalman filtering algorithm is used as the target. the position of the object in the current image;

具体地,判断模块采用联合指标遮挡检测机制处理图像,以判断当前图像中目标是否发生严重遮挡。Specifically, the judging module uses a joint index occlusion detection mechanism to process the image to judge whether the target in the current image is severely occluded.

联合指标遮挡检测机制为:The joint index occlusion detection mechanism is:

(1)基于融合特征的相似性判断(1) Similarity judgment based on fusion features

特征融合指标f使用HSV和LBP特征进行构建,包括:The feature fusion index f is constructed using HSV and LBP features, including:

基于KCF算法的检测区域,分别提取目标和背景区域的HSV和LBP特征,并对提取的(HSVtarget,HSVbackground)和(LBPtarget,LBPbackground)使用下述公式进行归一化处理,记待处理的特征向量为x=(x1,x2,...,xn-1,xn)T,归一化后的特征向量为x*Based on the detection area of the KCF algorithm, the HSV and LBP features of the target and background areas are extracted respectively, and the extracted (HSV target , HSV background ) and (LBP target , LBP background ) are normalized using the following formulas. The processed feature vector is x=(x 1 , x 2 ,..., x n-1 , x n ) T , and the normalized feature vector is x * :

[公式图像:特征向量归一化公式 / equation images: feature-vector normalization formulas]

在得到归一化的特征向量后,分别计算HSVtarget与HSVbackground的欧氏距离DHSV,LBPtarget与LBPbackground的欧氏距离DLBPAfter obtaining the normalized eigenvectors, calculate the Euclidean distance D HSV between the HSV target and the HSV background , and the Euclidean distance D LBP between the LBP target and the LBP background :

$$D_{HSV}=\sqrt{\sum_{i=1}^{n}\left(HSV_{target}^{(i)}-HSV_{background}^{(i)}\right)^{2}}$$

$$D_{LBP}=\sqrt{\sum_{i=1}^{n}\left(LBP_{target}^{(i)}-LBP_{background}^{(i)}\right)^{2}}$$

其中,$HSV_{target}^{(i)}$表示目标的HSV特征向量的第i个分量值,$LBP_{target}^{(i)}$表示目标的LBP特征向量的第i个分量值,$HSV_{background}^{(i)}$表示背景区域的HSV特征向量的第i个分量值,$LBP_{background}^{(i)}$表示背景区域的LBP特征向量的第i个分量值;Here, $HSV_{target}^{(i)}$ and $LBP_{target}^{(i)}$ denote the i-th components of the target's HSV and LBP feature vectors, and $HSV_{background}^{(i)}$ and $LBP_{background}^{(i)}$ the i-th components of the background region's HSV and LBP feature vectors;

指标D用于反应在此特征下,目标和背景之间的区分程度;Index D is used to reflect the degree of distinction between the target and the background under this feature;

HSV和LBP在融合特征中的权值计算方式如下:The weights of HSV and LBP in the fusion feature are calculated as follows:

$$\gamma_{LBP}=\frac{D_{LBP}}{D_{HSV}+D_{LBP}}$$

γHSV=1-γLBP γ HSV = 1 - γ LBP

通过提取相邻两帧目标的HSV和LBP特征,计算得到第k帧目标的特征相似性$f_{k}^{HSV}$和$f_{k}^{LBP}$:By extracting the HSV and LBP features of the target in two adjacent frames, the feature similarities $f_{k}^{HSV}$ and $f_{k}^{LBP}$ of the target in frame k are calculated:

$$f_{k}^{HSV}=\left\|HSV_{k}-HSV_{k-1}\right\|_{2}$$

$$f_{k}^{LBP}=\left\|LBP_{k}-LBP_{k-1}\right\|_{2}$$

其中,HSVk和HSVk-1表示第k帧和第k-1帧中目标的HSV特征,LBPk和LBPk-1表示第k帧和第k-1帧中目标的LBP特征;Among them, HSV_k and HSV_{k-1} represent the HSV features of the target in frames k and k-1, and LBP_k and LBP_{k-1} represent the LBP features of the target in frames k and k-1;

通过权重γLBP和γHSV进行加权求和,得到特征融合f:The feature fusion f is obtained by weighted summation of the weights γ LBP and γ HSV :

$$f=\gamma_{HSV}f_{k}^{HSV}+\gamma_{LBP}f_{k}^{LBP}$$

利用融合特征f进行相似性判断:当执行到第N帧时,首先利用得到的历史相似性距离数据计算平均相似性距离s,并计算相应的阈值th1,计算公式如下:Use fusion feature f for similarity judgment: when the Nth frame is executed, first calculate the average similarity distance s using the obtained historical similarity distance data, and calculate the corresponding threshold th 1 , the calculation formula is as follows:

$$s=\frac{1}{N-1}\sum_{i=1}^{N-1}f_{i}$$

th1=δ1*s(1<δ1<2)th 11 *s (1<δ 1 <2)

其中,δ1表示平均相似性距离s的阈值系数;Among them, δ 1 represents the threshold coefficient of the average similarity distance s;

然后计算第N-1与N帧的目标融合特征的欧式距离fN,并判断欧式距离fN是否大于th1;若否,则并未发生遮挡;若是,则可能发生遮挡;Then calculate the Euclidean distance f N of the target fusion features of the N-1th and N frames, and judge whether the Euclidean distance f N is greater than th 1 ; if not, no occlusion has occurred; if so, occlusion may occur;

记是否发生遮挡的指标为ε1,ε1表示为:The indicator of whether occlusion occurs is ε 1 , and ε 1 is expressed as:

$$\varepsilon_{1}=\begin{cases}1, & f_{N}>th_{1}\\0, & f_{N}\le th_{1}\end{cases}$$

(2)基于最大响应值的跟踪效果判断(2) Judgment of tracking effect based on maximum response value

使用最大响应值Fmax进行跟踪效果判断,包括:当执行到第N帧时,首先,统计前N-1帧的历史最大响应值信息,计算平均最大响应响应值m和阈值th2,m和th2计算公式如下:Use the maximum response value F max to judge the tracking effect, including: when the execution reaches the Nth frame, first, count the historical maximum response value information of the previous N-1 frames, calculate the average maximum response value m and the threshold th 2 , m and The calculation formula of th 2 is as follows:

$$m=\frac{1}{N-1}\sum_{i=1}^{N-1}F_{max}^{i}$$

th2=δ2*m(0<δ2<1)th 22 *m (0<δ 2 <1)

其中,$F_{max}^{i}$表示第i帧的最大响应值,δ2表示平均最大响应值m的阈值系数;然后,判断第N帧的最大响应值$F_{max}^{N}$是否大于th2;若是,则当前并未发生遮挡;若否,则可能发生了遮挡;Here, $F_{max}^{i}$ is the maximum response value of frame i, and δ2 is the threshold coefficient of the average maximum response m; then judge whether the maximum response of frame N, $F_{max}^{N}$, is greater than th2; if so, no occlusion is currently occurring; if not, occlusion may have occurred;

记是否发生遮挡的指标为ε2,ε2表示为:The indicator of whether occlusion occurs is ε 2 , and ε 2 is expressed as:

$$\varepsilon_{2}=\begin{cases}1, & F_{max}^{N}<th_{2}\\0, & F_{max}^{N}\ge th_{2}\end{cases}$$

(3)基于平均相关峰值比的评估机制(3) Evaluation mechanism based on average correlation peak ratio

使用平均相关峰值比APCE进行跟踪可信度评估,包括:在响应矩阵基础上,利用平均相关峰值比评估核相关滤波跟踪方法的跟踪效果;APCE计算公式如下:Using the average correlation peak ratio APCE to evaluate the tracking reliability, including: on the basis of the response matrix, using the average correlation peak ratio to evaluate the tracking effect of the nuclear correlation filter tracking method; the APCE calculation formula is as follows:

$$APCE=\frac{\left|F_{max}-F_{min}\right|^{2}}{\mathrm{mean}\left(\sum_{w,h}\left(F_{w,h}-F_{min}\right)^{2}\right)}$$

其中,Fmax表示当前帧的最大响应值,Fmin表示当前帧的最小响应值;Among them, F max represents the maximum response value of the current frame, and F min represents the minimum response value of the current frame;

计算前N-1帧的平均相关峰值比$\overline{APCE}$和阈值th3,用于评估当前帧(即第N帧)的跟踪效果,计算公式如下:The average $\overline{APCE}$ of the first N-1 frames and the threshold th3 are computed to evaluate the tracking effect of the current frame (frame N), as follows:

$$\overline{APCE}=\frac{1}{N-1}\sum_{i=1}^{N-1}APCE_{i}$$

$$th_{3}=\delta_{3}*\overline{APCE}\quad(0<\delta_{3}<1)$$

其中,δ3表示平均相关峰值比$\overline{APCE}$的阈值系数;Here, δ3 is the threshold coefficient of the average correlation peak ratio $\overline{APCE}$;

判断$APCE_{N}$是否大于th3;若是,则未发生遮挡;若否,则可能发生遮挡;Judge whether $APCE_{N}$ is greater than th3; if so, no occlusion has occurred; if not, occlusion may have occurred;

记是否发生遮挡的指标为ε3,ε3表示为:The indicator of whether occlusion occurs is ε 3 , and ε 3 is expressed as:

$$\varepsilon_{3}=\begin{cases}1, & APCE_{N}<th_{3}\\0, & APCE_{N}\ge th_{3}\end{cases}$$

构建联合指标∈为:The joint index ∈ is constructed as:

∈=ε1+ε2+ε3

当∈=0或1时,表示目标未被遮挡或轻微遮挡,使用KCF算法的结果作为跟踪结果;When ∈=0 or 1, it means that the target is not occluded or slightly occluded, and the result of the KCF algorithm is used as the tracking result;

当∈=2或3时,表示目标被严重遮挡,使用Kalman滤波算法的结果作为跟踪结果。When ∈=2 or 3, it means that the target is severely occluded, and the result of the Kalman filtering algorithm is used as the tracking result.

本发明实施例的更新模块用于根据各自的跟踪结果分别对Kalman滤波器和KCF滤波器进行自适应更新。具体地,更新模块的更新过程,包括:The updating module in the embodiment of the present invention is used to adaptively update the Kalman filter and the KCF filter according to the respective tracking results. Specifically, the update process of the update module includes:

若使用KCF算法的结果作为跟踪结果,则进行KCF滤波器更新,同时还将KCF算法的结果作为Kalman测量值对Kalman滤波器进行更新;Kalman滤波器可以参考现有技术。If the result of the KCF algorithm is used as the tracking result, the KCF filter is updated, and at the same time, the result of the KCF algorithm is used as the Kalman measurement value to update the Kalman filter; the Kalman filter may refer to the prior art.

若使用Kalman滤波算法的结果作为跟踪结果,则对Kalman滤波器进行更新,KCF滤波器停止更新。If the result of the Kalman filter algorithm is used as the tracking result, the Kalman filter is updated, and the KCF filter is stopped.

其中,KCF滤波器的更新,包括:Among them, the update of the KCF filter, including:

αt=θ((1-ρ)αt-1+ραt)+(1-θ)αt-1 α t =θ((1-ρ)α t-1 +ρα t )+(1-θ)α t-1

xt=θ((1-ρ)xt-1+ρxt)+(1-θ)xt-1 x t =θ((1-ρ)x t-1 +ρx t )+(1-θ)x t-1

$$\theta=\begin{cases}1, & \in=0\ \text{或}\ 1\\0, & \in=2\ \text{或}\ 3\end{cases}$$

其中,ρ表示KCF滤波器参数的更新系数;αt-1和αt表示第t-1帧和第t帧的KCF滤波器系数,xt-1和xt表示第t-1帧和第t帧的KCF滤波器选用的目标模型的参数。Here, ρ is the update coefficient of the KCF filter parameters; α_{t-1} and α_t are the KCF filter coefficients of frames t-1 and t, and x_{t-1} and x_t are the target-model parameters used by the KCF filter in frames t-1 and t.

本发明实施例的执行模块用于读取下一个视频帧,返回执行上述过程,直至跟踪过程结束。The execution module in the embodiment of the present invention is configured to read the next video frame, and return to execute the above process until the tracking process ends.

以上所述仅是对本发明的优选实施例及原理进行了详细说明,对本领域的普通技术人员而言,依据本发明提供的思想,在具体实施方式上会有改变之处,而这些改变也应视为本发明的保护范围。The above is only a detailed description of the preferred embodiments and principles of the present invention. For those of ordinary skill in the art, changes may be made to the specific implementation according to the ideas provided by the present invention, and these changes shall also be regarded as falling within the protection scope of the present invention.

Claims (10)

1.一种面向严重遮挡场景的核相关滤波跟踪方法,其特征在于,包括以下步骤:1. A kernel correlation filter tracking method for severely occluded scenes, characterized by comprising the following steps: 步骤1、初始化KCF滤波器的KCF算法和Kalman滤波器的Kalman滤波算法,以第一帧的目标位置作为KCF算法和Kalman滤波算法的初始位置;Step 1: initialize the KCF algorithm of the KCF filter and the Kalman filtering algorithm of the Kalman filter, taking the target position in the first frame as the initial position of both the KCF algorithm and the Kalman filtering algorithm; 步骤2、使用遮挡检测机制判断当前图像中是否发生目标严重遮挡;若否,则采用KCF算法的跟踪结果作为目标物体在当前图像中的位置;若是,则采用Kalman滤波算法的跟踪结果作为目标物体在当前图像中的位置;Step 2: use the occlusion detection mechanism to judge whether the target is severely occluded in the current image; if not, take the tracking result of the KCF algorithm as the position of the target object in the current image; if so, take the tracking result of the Kalman filtering algorithm as the position of the target object in the current image; 步骤3、根据各自的跟踪结果分别对Kalman滤波器和KCF滤波器进行自适应更新;Step 3: adaptively update the Kalman filter and the KCF filter according to their respective tracking results; 步骤4、读取下一个视频帧,返回执行步骤2;直至跟踪过程结束。Step 4: read the next video frame and return to step 2, until the tracking process ends. 2.根据权利要求1所述的一种面向严重遮挡场景的核相关滤波跟踪方法,其特征在于,所述步骤2中,采用联合指标遮挡检测机制处理图像,以判断当前图像中目标是否发生严重遮挡。2. The kernel correlation filter tracking method for severely occluded scenes according to claim 1, characterized in that, in step 2, a joint-index occlusion detection mechanism is used to process the image to judge whether the target in the current image is severely occluded. 3.根据权利要求1所述的一种面向严重遮挡场景的核相关滤波跟踪方法,其特征在于,所述联合指标遮挡检测机制为:3.
A kind of kernel correlation filter tracking method for severe occlusion scene according to claim 1, is characterized in that, described joint index occlusion detection mechanism is: (1)基于融合特征的相似性判断(1) Similarity judgment based on fusion features 特征融合指标f使用HSV和LBP特征进行构建,包括:The feature fusion index f is constructed using HSV and LBP features, including: 基于KCF算法的检测区域,分别提取目标和背景区域的HSV和LBP特征,并对提取的(HSVtarget,HSVbackground)和(LBPtarget,LBPbackground)使用下述公式进行归一化处理,记待处理的特征向量为x=(x1,x2,...,xn-1,xn)T,归一化后的特征向量为x*Based on the detection area of the KCF algorithm, the HSV and LBP features of the target and background areas are extracted respectively, and the extracted (HSV target , HSV background ) and (LBP target , LBP background ) are normalized using the following formulas. The processed feature vector is x=(x 1 , x 2 ,..., x n-1 , x n ) T , and the normalized feature vector is x * :
Figure FDA0003768467170000011
Figure FDA0003768467170000021
在得到归一化的特征向量后,分别计算HSVtarget与HSVbackground的欧氏距离DHSV,LBPtarget与LBPbackground的欧氏距离DLBPAfter obtaining the normalized eigenvectors, calculate the Euclidean distance D HSV between the HSV target and the HSV background , and the Euclidean distance D LBP between the LBP target and the LBP background :
D_HSV = sqrt( Σ_{i=1..n} (HSV*_{target,i} - HSV*_{background,i})² )
D_LBP = sqrt( Σ_{i=1..n} (LBP*_{target,i} - LBP*_{background,i})² )
其中,HSV*_{target,i}表示目标的HSV特征向量的第i个分量值,LBP*_{target,i}表示目标的LBP特征向量的第i个分量值,HSV*_{background,i}表示背景区域的HSV特征向量的第i个分量值,LBP*_{background,i}表示背景区域的LBP特征向量的第i个分量值;where HSV*_{target,i} denotes the i-th component of the target's HSV feature vector, LBP*_{target,i} the i-th component of the target's LBP feature vector, HSV*_{background,i} the i-th component of the background region's HSV feature vector, and LBP*_{background,i} the i-th component of the background region's LBP feature vector;
指标D用于反映在此特征下目标和背景之间的区分程度;Index D reflects the degree of discrimination between the target and the background under the given feature; HSV和LBP在融合特征中的权值计算方式如下:The weights of HSV and LBP in the fused feature are calculated as follows:
γ_LBP = D_LBP / (D_HSV + D_LBP)
γ_HSV = 1 - γ_LBP
通过提取相邻两帧目标的HSV和LBP特征,计算得到第k帧目标的特征相似性f_k^HSV和f_k^LBP:By extracting the HSV and LBP features of the target in two adjacent frames, the feature similarities f_k^HSV and f_k^LBP of the target in frame k are calculated:
f_k^HSV = sqrt( Σ_{i=1..n} (HSV_{k,i} - HSV_{k-1,i})² )
f_k^LBP = sqrt( Σ_{i=1..n} (LBP_{k,i} - LBP_{k-1,i})² )
其中,HSVk和HSVk-1表示第k帧和第k-1帧中目标的HSV特征,LBPk和LBPk-1表示第k帧和第k-1帧中目标的LBP特征;Here, HSV_k and HSV_{k-1} denote the HSV features of the target in frame k and frame k-1, and LBP_k and LBP_{k-1} denote the LBP features of the target in frame k and frame k-1; 通过权重γLBP和γHSV进行加权求和,得到特征融合f:The fused feature f is obtained by a weighted sum with weights γ_LBP and γ_HSV:
f_k = γ_HSV * f_k^HSV + γ_LBP * f_k^LBP
利用融合特征f进行相似性判断:当执行到第N帧时,首先利用得到的历史相似性距离数据计算平均相似性距离s,并计算相应的阈值th1,计算公式如下:Use fusion feature f for similarity judgment: when the Nth frame is executed, first calculate the average similarity distance s using the obtained historical similarity distance data, and calculate the corresponding threshold th 1 , the calculation formula is as follows:
s = (1/(N-1)) * Σ_{i=1..N-1} f_i
th1=δ1*s(1<δ1<2)th 11 *s (1<δ 1 <2) 其中,δ1表示平均相似性距离s的阈值系数;Among them, δ 1 represents the threshold coefficient of the average similarity distance s; 然后计算第N-1与N帧的目标融合特征的欧式距离fN,并判断欧式距离fN是否大于th1;若否,则并未发生遮挡;若是,则可能发生遮挡;Then calculate the Euclidean distance f N of the target fusion features of the N-1th and N frames, and judge whether the Euclidean distance f N is greater than th 1 ; if not, no occlusion has occurred; if so, occlusion may occur; 记是否发生遮挡的指标为ε1,ε1表示为:The indicator of whether occlusion occurs is ε 1 , and ε 1 is expressed as:
ε1 = { 1, f_N > th1 ; 0, f_N ≤ th1 }
(2)基于最大响应值的跟踪效果判断(2) Judgment of tracking effect based on the maximum response value 使用最大响应值Fmax进行跟踪效果判断,包括:当执行到第N帧时,首先,统计前N-1帧的历史最大响应值信息,计算平均最大响应值m和阈值th2,m和th2计算公式如下:The maximum response value F_max is used to judge the tracking effect: when frame N is reached, first collect the historical maximum response values of the first N-1 frames, and compute the average maximum response value m and the threshold th2 as follows:
m = (1/(N-1)) * Σ_{i=1..N-1} F_max^i
th2=δ2*m(0<δ2<1)th 22 *m (0<δ 2 <1) 其中,
F_max^i表示第i帧的最大响应值,δ2表示平均最大响应值m的阈值系数;where F_max^i denotes the maximum response value of the i-th frame, and δ2 denotes the threshold coefficient of the average maximum response value m;
然后,判断第N帧的最大响应值F_max^N是否大于th2;若是,则当前并未发生遮挡;若否,则可能发生了遮挡;Then, judge whether the maximum response value F_max^N of frame N is greater than th2; if so, no occlusion has occurred; if not, occlusion may have occurred;
记是否发生遮挡的指标为ε2,ε2表示为:The indicator of whether occlusion occurs is ε 2 , and ε 2 is expressed as:
ε2 = { 0, F_max^N > th2 ; 1, F_max^N ≤ th2 }
(3)基于平均相关峰值比的评估机制(3) Evaluation mechanism based on the average correlation peak ratio 使用平均相关峰值比APCE进行跟踪可信度评估,包括:在响应矩阵基础上,利用平均相关峰值比评估核相关滤波跟踪方法的跟踪效果;APCE计算公式如下:The average correlation peak ratio APCE is used to evaluate the tracking reliability: on the basis of the response matrix, the average correlation peak ratio evaluates the tracking effect of the kernel correlation filter tracking method; APCE is calculated as follows:
APCE = |F_max - F_min|² / mean( Σ_{w,h} (F_{w,h} - F_min)² )
其中F_{w,h}表示响应矩阵中(w,h)处的响应值;where F_{w,h} denotes the response at position (w,h) of the response matrix;
其中,Fmax表示当前帧的最大响应值,Fmin表示当前帧的最小响应值;Among them, F_max represents the maximum response value of the current frame, and F_min represents the minimum response value of the current frame;
计算前N-1帧的平均相关峰值比APCE_avg和阈值th3,用于评估当前帧(即第N帧)的跟踪效果,计算公式如下:The average correlation peak ratio APCE_avg of the first N-1 frames and the threshold th3 are calculated to evaluate the tracking effect of the current frame, i.e. frame N:
APCE_avg = (1/(N-1)) * Σ_{i=1..N-1} APCE_i
th3 = δ3 * APCE_avg
其中,δ3表示平均相关峰值比APCE_avg的阈值系数;where δ3 denotes the threshold coefficient of the average correlation peak ratio APCE_avg;
判断第N帧的APCE_N是否大于th3;若是,则未发生遮挡;若否,则可能发生遮挡;Judge whether APCE_N of frame N is greater than th3; if so, no occlusion has occurred; if not, occlusion may have occurred;
记是否发生遮挡的指标为ε3,ε3表示为:The indicator of whether occlusion occurs is ε 3 , and ε 3 is expressed as:
ε3 = { 0, APCE_N > th3 ; 1, APCE_N ≤ th3 }
构建联合指标∈为:The joint index ∈ is constructed as: ∈=ε123 ∈=ε 123 当∈=0或1时,表示目标未被遮挡或轻微遮挡,使用KCF算法的结果作为跟踪结果;When ∈=0 or 1, it means that the target is not occluded or slightly occluded, and the result of the KCF algorithm is used as the tracking result; 当∈=2或3时,表示目标被严重遮挡,使用Kalman滤波算法的结果作为跟踪结果。When ∈=2 or 3, it means that the target is severely occluded, and the result of the Kalman filtering algorithm is used as the tracking result.
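The joint occlusion-detection indicator defined in this claim can be summarized in a short sketch. This is illustrative only: the thresholds th1, th2, th3 are assumed to be precomputed from the per-frame histories as described above, and the exact form of the weight γ_LBP is an assumption, since the original formula is rendered as an image:

```python
def fusion_weights(d_hsv, d_lbp):
    """Weight each feature by its target/background separability distance D.
    Assumed form: the feature with the larger distance gets the larger weight."""
    gamma_lbp = d_lbp / (d_hsv + d_lbp)
    return 1.0 - gamma_lbp, gamma_lbp  # (gamma_hsv, gamma_lbp)

def joint_indicator(f_n, th1, fmax_n, th2, apce_n, th3):
    """Joint occlusion indicator ∈ = ε1 + ε2 + ε3."""
    eps1 = 1 if f_n > th1 else 0       # fused-feature distance jumped between frames
    eps2 = 0 if fmax_n > th2 else 1    # response peak dropped below its threshold
    eps3 = 0 if apce_n > th3 else 1    # APCE dropped below its threshold
    return eps1 + eps2 + eps3
```

For example, `joint_indicator(0.9, 0.5, 0.2, 0.4, 10.0, 15.0)` gives ε1 = ε2 = ε3 = 1, so ∈ = 3 and the tracker would fall back to the Kalman prediction.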
4.根据权利要求3所述的一种面向严重遮挡场景的核相关滤波跟踪方法,其特征在于,所述步骤3,包括:4. The kernel correlation filter tracking method for severely occluded scenes according to claim 3, characterized in that step 3 comprises: 若使用KCF算法的结果作为跟踪结果,则进行KCF滤波器更新,同时还将KCF算法的结果作为Kalman测量值对Kalman滤波器进行更新;If the result of the KCF algorithm is used as the tracking result, the KCF filter is updated, and the Kalman filter is also updated with the result of the KCF algorithm as the Kalman measurement; 若使用Kalman滤波算法的结果作为跟踪结果,则对Kalman滤波器进行更新,KCF滤波器停止更新。If the result of the Kalman filtering algorithm is used as the tracking result, the Kalman filter is updated and the KCF filter stops updating. 5.根据权利要求4所述的一种面向严重遮挡场景的核相关滤波跟踪方法,其特征在于,所述KCF滤波器的更新,包括:5. The kernel correlation filter tracking method for severely occluded scenes according to claim 4, characterized in that the update of the KCF filter comprises: αt=θ((1-ρ)αt-1+ραt)+(1-θ)αt-1 xt=θ((1-ρ)xt-1+ρxt)+(1-θ)xt-1
Figure FDA0003768467170000051
其中,ρ表示KCF滤波器参数的更新系数;αt-1和αt表示第t-1帧和第t帧的KCF滤波器系数,xt-1和xt表示第t-1帧和第t帧的KCF滤波器选用的目标模型的参数。Here, ρ denotes the update coefficient of the KCF filter parameters; α_{t-1} and α_t denote the KCF filter coefficients of frame t-1 and frame t, and x_{t-1} and x_t denote the parameters of the target model used by the KCF filter in frame t-1 and frame t.
6.一种面向严重遮挡场景的核相关滤波跟踪系统,其特征在于,包括:6. A kernel correlation filter tracking system for severely occluded scenes, characterized in that it comprises: 初始化模块,用于初始化KCF滤波器的KCF算法和Kalman滤波器的Kalman滤波算法,以第一帧的目标位置作为KCF算法和Kalman滤波算法的初始位置;The initialization module is used to initialize the KCF algorithm of the KCF filter and the Kalman filter algorithm of the Kalman filter, and the target position of the first frame is used as the initial position of the KCF algorithm and the Kalman filter algorithm; 判断模块,用于使用遮挡检测机制判断当前图像中是否发生目标严重遮挡;若否,则采用KCF算法的跟踪结果作为目标物体在当前图像中的位置;若是,则采用Kalman滤波算法的跟踪结果作为目标物体在当前图像中的位置;The judgment module is used to use the occlusion detection mechanism to judge whether the target is severely occluded in the current image; if not, the tracking result of the KCF algorithm is used as the position of the target object in the current image; if so, the tracking result of the Kalman filtering algorithm is used as the The position of the target object in the current image; 更新模块,用于根据各自的跟踪结果分别对Kalman滤波器和KCF滤波器进行自适应更新;The update module is used to adaptively update the Kalman filter and the KCF filter according to the respective tracking results; 执行模块,用于读取下一个视频帧,返回执行步骤2;直至跟踪过程结束。The execution module is used for reading the next video frame, and returns to step 2 until the tracking process ends. 7.根据权利要求6所述的一种面向严重遮挡场景的核相关滤波跟踪系统,其特征在于,所述判断模块采用联合指标遮挡检测机制处理图像,以判断当前图像中目标是否发生严重遮挡。7 . The kernel correlation filtering tracking system for severely occluded scenes according to claim 6 , wherein the judging module uses a joint index occlusion detection mechanism to process the image to judge whether the target in the current image is severely occluded. 8 . 8.根据权利要求7所述的一种面向严重遮挡场景的核相关滤波跟踪方法,其特征在于,所述联合指标遮挡检测机制为:8. 
A kind of kernel correlation filter tracking method for severe occlusion scene according to claim 7, it is characterized in that, described joint index occlusion detection mechanism is: (1)基于融合特征的相似性判断(1) Similarity judgment based on fusion features 特征融合指标f使用HSV和LBP特征进行构建,包括:The feature fusion index f is constructed using HSV and LBP features, including: 基于KCF算法的检测区域,分别提取目标和背景区域的HSV和LBP特征,并对提取的(HSVtarget,HSVbackground)和(LBPtarget,LBPbackground)使用下述公式进行归一化处理,记待处理的特征向量为x=(x1,x2,...,xn-1,xn)T,归一化后的特征向量为x*Based on the detection area of the KCF algorithm, the HSV and LBP features of the target and background areas are extracted respectively, and the extracted (HSV target , HSV background ) and (LBP target , LBP background ) are normalized using the following formulas. The processed feature vector is x=(x 1 , x 2 ,..., x n-1 , x n ) T , and the normalized feature vector is x * :
Figure FDA0003768467170000061
Figure FDA0003768467170000062
在得到归一化的特征向量后,分别计算HSVtarget与HSVbackground的欧氏距离DHSV,LBPtarget与LBPbackground的欧氏距离DLBPAfter obtaining the normalized eigenvectors, calculate the Euclidean distance D HSV between the HSV target and the HSV background , and the Euclidean distance D LBP between the LBP target and the LBP background :
D_HSV = sqrt( Σ_{i=1..n} (HSV*_{target,i} - HSV*_{background,i})² )
D_LBP = sqrt( Σ_{i=1..n} (LBP*_{target,i} - LBP*_{background,i})² )
其中,HSV*_{target,i}表示目标的HSV特征向量的第i个分量值,LBP*_{target,i}表示目标的LBP特征向量的第i个分量值,HSV*_{background,i}表示背景区域的HSV特征向量的第i个分量值,LBP*_{background,i}表示背景区域的LBP特征向量的第i个分量值;where HSV*_{target,i} denotes the i-th component of the target's HSV feature vector, LBP*_{target,i} the i-th component of the target's LBP feature vector, HSV*_{background,i} the i-th component of the background region's HSV feature vector, and LBP*_{background,i} the i-th component of the background region's LBP feature vector;
指标D用于反映在此特征下目标和背景之间的区分程度;Index D reflects the degree of discrimination between the target and the background under the given feature; HSV和LBP在融合特征中的权值计算方式如下:The weights of HSV and LBP in the fused feature are calculated as follows:
γ_LBP = D_LBP / (D_HSV + D_LBP)
γ_HSV = 1 - γ_LBP
通过提取相邻两帧目标的HSV和LBP特征,计算得到第k帧目标的特征相似性f_k^HSV和f_k^LBP:By extracting the HSV and LBP features of the target in two adjacent frames, the feature similarities f_k^HSV and f_k^LBP of the target in frame k are calculated:
f_k^HSV = sqrt( Σ_{i=1..n} (HSV_{k,i} - HSV_{k-1,i})² )
f_k^LBP = sqrt( Σ_{i=1..n} (LBP_{k,i} - LBP_{k-1,i})² )
其中,HSVk和HSVk-1表示第k帧和第k-1帧中目标的HSV特征,LBPk和LBPk-1表示第k帧和第k-1帧中目标的LBP特征;Here, HSV_k and HSV_{k-1} denote the HSV features of the target in frame k and frame k-1, and LBP_k and LBP_{k-1} denote the LBP features of the target in frame k and frame k-1; 通过权重γLBP和γHSV进行加权求和,得到特征融合f:The fused feature f is obtained by a weighted sum with weights γ_LBP and γ_HSV:
f_k = γ_HSV * f_k^HSV + γ_LBP * f_k^LBP
利用融合特征f进行相似性判断:当执行到第N帧时,首先利用得到的历史相似性距离数据计算平均相似性距离s,并计算相应的阈值th1,计算公式如下:Use fusion feature f for similarity judgment: when the Nth frame is executed, first calculate the average similarity distance s using the obtained historical similarity distance data, and calculate the corresponding threshold th 1 , the calculation formula is as follows:
s = (1/(N-1)) * Σ_{i=1..N-1} f_i
th1=δ1*s(1<δ1<2)th 11 *s (1<δ 1 <2) 其中,δ1表示平均相似性距离s的阈值系数;Among them, δ 1 represents the threshold coefficient of the average similarity distance s; 然后计算第N-1与N帧的目标融合特征的欧式距离fN,并判断欧式距离fN是否大于th1;若否,则并未发生遮挡;若是,则可能发生遮挡;Then calculate the Euclidean distance f N of the target fusion features of the N-1th and N frames, and judge whether the Euclidean distance f N is greater than th 1 ; if not, no occlusion has occurred; if so, occlusion may occur; 记是否发生遮挡的指标为ε1,ε1表示为:The indicator of whether occlusion occurs is ε 1 , and ε 1 is expressed as:
ε1 = { 1, f_N > th1 ; 0, f_N ≤ th1 }
(2)基于最大响应值的跟踪效果判断(2) Judgment of tracking effect based on the maximum response value 使用最大响应值Fmax进行跟踪效果判断,包括:当执行到第N帧时,首先,统计前N-1帧的历史最大响应值信息,计算平均最大响应值m和阈值th2,m和th2计算公式如下:The maximum response value F_max is used to judge the tracking effect: when frame N is reached, first collect the historical maximum response values of the first N-1 frames, and compute the average maximum response value m and the threshold th2 as follows:
m = (1/(N-1)) * Σ_{i=1..N-1} F_max^i
th2=δ2*m(0<δ2<1)th 22 *m (0<δ 2 <1) 其中,
F_max^i表示第i帧的最大响应值,δ2表示平均最大响应值m的阈值系数;where F_max^i denotes the maximum response value of the i-th frame, and δ2 denotes the threshold coefficient of the average maximum response value m;
然后,判断第N帧的最大响应值F_max^N是否大于th2;若是,则当前并未发生遮挡;若否,则可能发生了遮挡;Then, judge whether the maximum response value F_max^N of frame N is greater than th2; if so, no occlusion has occurred; if not, occlusion may have occurred;
记是否发生遮挡的指标为ε2,ε2表示为:The indicator of whether occlusion occurs is ε 2 , and ε 2 is expressed as:
ε2 = { 0, F_max^N > th2 ; 1, F_max^N ≤ th2 }
(3)基于平均相关峰值比的评估机制(3) Evaluation mechanism based on the average correlation peak ratio 使用平均相关峰值比APCE进行跟踪可信度评估,包括:在响应矩阵基础上,利用平均相关峰值比评估核相关滤波跟踪方法的跟踪效果;APCE计算公式如下:The average correlation peak ratio APCE is used to evaluate the tracking reliability: on the basis of the response matrix, the average correlation peak ratio evaluates the tracking effect of the kernel correlation filter tracking method; APCE is calculated as follows:
APCE = |F_max - F_min|² / mean( Σ_{w,h} (F_{w,h} - F_min)² )
其中F_{w,h}表示响应矩阵中(w,h)处的响应值;where F_{w,h} denotes the response at position (w,h) of the response matrix;
其中,Fmax表示当前帧的最大响应值,Fmin表示当前帧的最小响应值;Among them, F_max represents the maximum response value of the current frame, and F_min represents the minimum response value of the current frame;
计算前N-1帧的平均相关峰值比APCE_avg和阈值th3,用于评估当前帧(即第N帧)的跟踪效果,计算公式如下:The average correlation peak ratio APCE_avg of the first N-1 frames and the threshold th3 are calculated to evaluate the tracking effect of the current frame, i.e. frame N:
APCE_avg = (1/(N-1)) * Σ_{i=1..N-1} APCE_i
th3 = δ3 * APCE_avg
其中,δ3表示平均相关峰值比APCE_avg的阈值系数;where δ3 denotes the threshold coefficient of the average correlation peak ratio APCE_avg;
判断第N帧的APCE_N是否大于th3;若是,则未发生遮挡;若否,则可能发生遮挡;Judge whether APCE_N of frame N is greater than th3; if so, no occlusion has occurred; if not, occlusion may have occurred;
记是否发生遮挡的指标为ε3,ε3表示为:The indicator of whether occlusion occurs is ε 3 , and ε 3 is expressed as:
ε3 = { 0, APCE_N > th3 ; 1, APCE_N ≤ th3 }
构建联合指标∈为:The joint index ∈ is constructed as: ∈=ε123 ∈=ε 123 当∈=0或1时,表示目标未被遮挡或轻微遮挡,使用KCF算法的结果作为跟踪结果;When ∈=0 or 1, it means that the target is not occluded or slightly occluded, and the result of the KCF algorithm is used as the tracking result; 当∈=2或3时,表示目标被严重遮挡,使用Kalman滤波算法的结果作为跟踪结果。When ∈=2 or 3, it means that the target is severely occluded, and the result of the Kalman filtering algorithm is used as the tracking result.
9.根据权利要求8所述的一种面向严重遮挡场景的核相关滤波跟踪系统,其特征在于,所述更新模块的更新过程,包括:9. The kernel correlation filter tracking system for severely occluded scenes according to claim 8, characterized in that the update process of the update module comprises: 若使用KCF算法的结果作为跟踪结果,则进行KCF滤波器更新,同时还将KCF算法的结果作为Kalman测量值对Kalman滤波器进行更新;If the result of the KCF algorithm is used as the tracking result, the KCF filter is updated, and the Kalman filter is also updated with the result of the KCF algorithm as the Kalman measurement; 若使用Kalman滤波算法的结果作为跟踪结果,则对Kalman滤波器进行更新,KCF滤波器停止更新。If the result of the Kalman filtering algorithm is used as the tracking result, the Kalman filter is updated and the KCF filter stops updating. 10.根据权利要求9所述的一种面向严重遮挡场景的核相关滤波跟踪系统,其特征在于,所述KCF滤波器的更新,包括:10. The kernel correlation filter tracking system for severely occluded scenes according to claim 9, characterized in that the update of the KCF filter comprises: αt=θ((1-ρ)αt-1+ραt)+(1-θ)αt-1 xt=θ((1-ρ)xt-1+ρxt)+(1-θ)xt-1
Figure FDA0003768467170000101
其中,ρ表示KCF滤波器参数的更新系数;αt-1和αt表示第t-1帧和第t帧的KCF滤波器系数,xt-1和xt表示第t-1帧和第t帧的KCF滤波器选用的目标模型的参数。Here, ρ denotes the update coefficient of the KCF filter parameters; α_{t-1} and α_t denote the KCF filter coefficients of frame t-1 and frame t, and x_{t-1} and x_t denote the parameters of the target model used by the KCF filter in frame t-1 and frame t.
CN202210893435.8A 2022-07-27 2022-07-27 A kernel correlation filter target tracking method and system for severely occluded scenes Active CN115239770B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210893435.8A CN115239770B (en) 2022-07-27 2022-07-27 A kernel correlation filter target tracking method and system for severely occluded scenes

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210893435.8A CN115239770B (en) 2022-07-27 2022-07-27 A kernel correlation filter target tracking method and system for severely occluded scenes

Publications (2)

Publication Number Publication Date
CN115239770A true CN115239770A (en) 2022-10-25
CN115239770B CN115239770B (en) 2025-09-12

Family

ID=83676421

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210893435.8A Active CN115239770B (en) 2022-07-27 2022-07-27 A kernel correlation filter target tracking method and system for severely occluded scenes

Country Status (1)

Country Link
CN (1) CN115239770B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115908494A (en) * 2022-11-29 2023-04-04 云南大学 Correlation Filter Tracking Method Based on Reverse Detection Mechanism Adaptive Template Update
CN120339336A (en) * 2025-06-18 2025-07-18 吉林建筑科技学院 Intelligent target tracking data analysis system and method based on DSP


Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080181453A1 (en) * 2005-03-17 2008-07-31 Li-Qun Xu Method of Tracking Objects in a Video Sequence
CN112633105A (en) * 2020-12-15 2021-04-09 重庆电子工程职业学院 Target tracking and counting system and method

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
居超;黄影平;胡兴;: "一种抗遮挡尺度自适应核相关滤波器跟踪算法", 上海理工大学学报, no. 05, 15 October 2018 (2018-10-15) *


Also Published As

Publication number Publication date
CN115239770B (en) 2025-09-12

Similar Documents

Publication Publication Date Title
CN110084831B (en) Multi-target detection and tracking method based on YOLOv3 multi-Bernoulli video
CN108010067B (en) A kind of visual target tracking method based on combination determination strategy
CN111127518B (en) Target tracking method and device based on unmanned aerial vehicle
KR100851981B1 (en) Method and apparatus for discriminating real object from video image
JP4964159B2 (en) Computer-implemented method for tracking an object in a sequence of video frames
CN104091349B (en) robust target tracking method based on support vector machine
CN110555870B (en) A Neural Network-based DCF Tracking Confidence Evaluation and Classifier Update Method
CN105405151A (en) Anti-occlusion target tracking method based on particle filtering and weighting Surf
CN106204638A (en) A kind of based on dimension self-adaption with the method for tracking target of taking photo by plane blocking process
CN111476817A (en) A multi-target pedestrian detection and tracking method based on yolov3
CN101853511A (en) An anti-occlusion target trajectory prediction and tracking method
CN101251928A (en) Kernel-Based Object Tracking Method
CN110796679A (en) A target tracking method for aerial imagery
CN110458862A (en) A Tracking Method for Moving Objects in Occluded Background
CN111986225A (en) Multi-target tracking method and device based on angular point detection and twin network
CN102129695A (en) Target tracking method based on modeling of occluder under condition of having occlusion
CN110136171B (en) Method for judging occlusion in target tracking process
CN115239770A (en) A Kernel Correlation Filtering Target Tracking Method and System for Severely Occluded Scenes
CN110276784B (en) Correlation filtering moving target tracking method based on memory mechanism and convolution characteristics
CN111583294A (en) A Target Tracking Method Combining Scale Adaptation and Model Update
CN109448027A (en) A kind of adaptive, lasting motion estimate method based on algorithm fusion
CN110717934A (en) An anti-occlusion target tracking method based on STRCF
CN114092512A (en) Radar target detection and tracking method based on self-adaptive multi-core correlation filtering
CN106887012A (en) A kind of quick self-adapted multiscale target tracking based on circular matrix
CN102663773A (en) Dual-core type adaptive fusion tracking method of video object

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant