
CN102930558A - Real-time tracking method for infrared image target with multi-feature fusion - Google Patents

Real-time tracking method for infrared image target with multi-feature fusion

Info

Publication number
CN102930558A
CN102930558A, CN2012103976863A, CN201210397686A
Authority
CN
China
Prior art keywords
feature
prime
rho
target
coefficient
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN2012103976863A
Other languages
Chinese (zh)
Other versions
CN102930558B (en)
Inventor
白俊奇
赵春光
王寿峰
翟尚礼
汪洋
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
CETC 28 Research Institute
Original Assignee
CETC 28 Research Institute
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by CETC 28 Research Institute filed Critical CETC 28 Research Institute
Priority to CN201210397686.3A priority Critical patent/CN102930558B/en
Publication of CN102930558A publication Critical patent/CN102930558A/en
Application granted granted Critical
Publication of CN102930558B publication Critical patent/CN102930558B/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Image Analysis (AREA)

Abstract

The invention discloses a real-time tracking method for infrared image targets based on multi-feature fusion, comprising the following steps: initializing the target tracking point position; initializing the target model; calculating the target candidate model; calculating the joint-feature Bhattacharyya coefficient and the inter-feature weight coefficients; calculating the new target tracking position in the current frame; estimating the joint-feature Bhattacharyya coefficient at the new position; and comparing the two joint-feature Bhattacharyya coefficients and outputting the result. The invention adaptively computes the weight coefficients among multiple features, which enhances the robustness of target tracking, guarantees tracking stability, solves the tracking-point drift caused by the instability of any single feature, and effectively improves tracking accuracy.

Description

A real-time tracking method for infrared image targets based on multi-feature fusion

Technical Field

The invention provides a multi-feature fusion tracking method for infrared image targets, and in particular relates to an infrared image target tracking method suitable for real-time hardware implementation.

Background Art

In recent years, with the development of integrated-circuit technology and infrared materials, infrared imaging has made great progress and is widely used in national defense and the national economy. Compared with visible-light images, however, infrared images have a relatively low signal-to-noise ratio and therefore provide only limited information for target detection and tracking. Because target features are inconspicuous and background clutter is strong in infrared images, accurate tracking of infrared targets is especially difficult.

At present, target tracking algorithms fall into two categories: model-based methods and appearance-based methods. Compared with model-based tracking, appearance-based tracking avoids the complicated process of building a model and has wider practical engineering value. Among appearance-based methods, the mean-shift tracking algorithm is widely used because it is simple, robust, and runs in real time. Mean shift is a non-parametric density estimation method that searches, through repeated iterations, for the distribution mode most similar to the sample distribution. Comaniciu et al. proposed a mean-shift target tracking algorithm that seeks the maximum similarity between the target color histogram and the candidate color histogram. Chu et al. used a Kalman filter to predict the initial iteration position of mean shift, but when the target is severely occluded the position found by the mean-shift algorithm is inaccurate and a certain deviation remains. Collins et al. proposed an adaptive tracking method that selects easily discriminable color features, in which the candidate feature set contains 49 features computed as linear combinations of the pixel R, G, and B values; because the candidate set is large, feature selection carries a heavy computational cost.

Existing target tracking algorithms therefore have the following shortcomings: (1) classic algorithms describe the target with a single feature and have poor resistance to interference; (2) most existing multi-feature algorithms compute the inter-feature weight coefficients from the current frame only, so tracking is not robust when the target undergoes complex changes; (3) when the target undergoes non-rigid deformation, partial occlusion, or overlap, the accuracy of most existing algorithms degrades and the target may even be lost; (4) most existing algorithms greatly increase algorithmic complexity while improving accuracy, making real-time hardware implementation difficult.

Summary of the Invention

Purpose of the invention: the technical problem to be solved by the present invention is to address the deficiencies of the prior art by providing a real-time tracking method for infrared image targets based on multi-feature fusion.

To solve the above technical problem, the invention discloses a real-time multi-feature fusion tracking method for infrared image targets, comprising the following steps:

(1) Initialize the target tracking point position y0; the initial tracking point is specified manually.

(2) Initialize the target model: centered on the initial tracking point y0, build the target gray-level model q1 and the target LBP (local binary pattern) texture model q2.

(3) Compute the target candidate model: from the target tracking point position y0, compute the candidate gray-level model p1(y0) and the candidate LBP texture model p2(y0).

(4) Using the Bhattacharyya coefficient ρ1 of the gray-level feature and the Bhattacharyya coefficient ρ2 of the LBP texture feature (for the Bhattacharyya coefficient, see Visual C++ Digital Image Processing, p. 466, Xie Fengying, 1st ed., 2008, Publishing House of Electronics Industry), together with the weight coefficient α1 of the gray-level feature and the weight coefficient α2 of the LBP texture feature, compute the joint-feature Bhattacharyya coefficient ρ at position y0:

ρ = α1·ρ1 + α2·ρ2;

(5) Compute the new target position y1 in the current frame.

(6) Using the gray-level Bhattacharyya coefficient ρ′1 and the LBP texture Bhattacharyya coefficient ρ′2, together with the weight coefficients α′1 and α′2, compute the joint-feature Bhattacharyya coefficient ρ′ at position y1:

ρ′ = α′1·ρ′1 + α′2·ρ′2.

(7) When ρ′ < ρ, update y1 according to the formula shown in Figure BDA00002272056400021; otherwise y1 remains unchanged.

(8) If |y0 − y1| < ε, stop; otherwise assign y1 to y0 and return to step (3), where ε is a constant error tolerance.
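The control flow of steps (3) to (8) can be sketched as the following minimal loop. This is an illustration only: the full position update of steps (3) to (6) is replaced by a stand-in `step` function, the step-(7) adjustment is omitted, and all names are assumptions, not from the patent.

```python
# Hedged sketch of the iterative structure of steps (3)-(8): repeat the
# position update until |y0 - y1| < eps, as in step (8).
def track(y0, step, eps=0.01, max_iter=20):
    for _ in range(max_iter):
        y1 = step(y0)             # stand-in for steps (3)-(6)
        if abs(y0 - y1) < eps:    # step (8): convergence test
            return y1
        y0 = y1                   # otherwise go back to step (3)
    return y0

# Toy update that moves halfway toward a fixed point at 10.0
print(round(track(0.0, lambda y: y + 0.5 * (10.0 - y)), 2))  # 9.99
```

In the real method, `step` would rebuild the candidate models at y0 and return the mean-shift estimate of step (5).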

In step (2), the target gray-level model q1 is:

q1 = {qu}_{u=1…m1},  Σ_{u=1}^{m1} qu = 1,

and the target LBP texture model q2 is:

q2 = {qv}_{v=1…m2},  Σ_{v=1}^{m2} qv = 1,

where qu is the probability density of each gray level of the target gray-level model, qv is the probability density of each level of the LBP texture feature, m1 is the number of quantization levels of the gray-level feature, m2 is the number of quantization levels of the LBP texture feature, u indexes the gray quantization levels, and v indexes the texture quantization levels.

In step (3), the candidate gray-level model p1 is:

p1 = {pu}_{u=1…m1},  Σ_{u=1}^{m1} pu = 1,

and the candidate LBP texture model p2 is:

p2 = {pv}_{v=1…m2},  Σ_{v=1}^{m2} pv = 1,

where pu is the probability density of each gray level of the candidate model, pv is the probability density of each level of the candidate LBP texture feature, m1 is the number of quantization levels of the gray-level feature, m2 is the number of quantization levels of the LBP texture feature, u indexes the gray quantization levels, and v indexes the texture quantization levels.

The weight coefficients α1 and α2 in step (4), and α′1 and α′2 in step (6), are updated iteratively:

α1 = (1 − λ)·α1,old + λ·α1,cur,

α2 = (1 − λ)·α2,old + λ·α2,cur,

α′1 = (1 − λ)·α′1,old + λ·α′1,cur,

α′2 = (1 − λ)·α′2,old + λ·α′2,cur,

where α1,old and α2,old are the previous-frame weight coefficients of the gray-level and LBP texture features in step (4), α1,cur and α2,cur are the current-frame weight coefficients in step (4), α′1,old and α′2,old are the previous-frame weight coefficients in step (6), α′1,cur and α′2,cur are the current-frame weight coefficients in step (6), and λ is a proportional coefficient with 0 ≤ λ ≤ 1 that determines the convergence speed of the weight coefficients: the larger λ is, the faster the convergence and the more maneuverable the tracking; the smaller λ is, the slower the convergence and the more stable the tracking.

The current-frame weight coefficients α1,cur and α2,cur in step (4), and α′1,cur and α′2,cur in step (6), are computed as:

α1,cur = ρ1 / (ρ1^2 + ρ2^2),

α2,cur = ρ2 / (ρ1^2 + ρ2^2),

α′1,cur = ρ′1 / (ρ′1^2 + ρ′2^2),

α′2,cur = ρ′2 / (ρ′1^2 + ρ′2^2),

where ρ1 and ρ2 are the Bhattacharyya coefficients of the gray-level and LBP texture features in step (4), and ρ′1 and ρ′2 are the corresponding coefficients in step (6).

In step (6), the Bhattacharyya coefficients ρ′1 and ρ′2 and the weight coefficients α′1 and α′2 are obtained from the following formulas:

ρ′1 = Σ_{u=1}^{m1} √(p′u · qu),

ρ′2 = Σ_{v=1}^{m2} √(p′v · qv),

where p′u is the probability density of each gray level of the target gray-level model at position y1, and p′v is the probability density of each LBP texture level at position y1;

α′1 = (1 − λ)·α′1,old + λ·ρ′1 / (ρ′1^2 + ρ′2^2),

α′2 = (1 − λ)·α′2,old + λ·ρ′2 / (ρ′1^2 + ρ′2^2),

where α′1,old is the previous-frame weight coefficient of the gray-level feature, α′2,old is the previous-frame weight coefficient of the LBP texture feature, and λ is a proportional coefficient with 0 ≤ λ ≤ 1 that determines the convergence speed of the weight coefficients, as defined above.

In the multi-feature fusion real-time tracking method of the invention, the Epanechnikov kernel is used to compute the gray-level and LBP texture probability histograms.

Compared with the prior art, the present invention has the following significant advantages: (1) the weight coefficients among multiple features are computed adaptively from the saliency and similarity of the target and background features, enhancing tracking robustness; (2) the inter-feature weight coefficients are updated iteratively, guaranteeing tracking stability; (3) tracking infrared targets with the multi-feature fusion method solves the tracking-point drift caused by the instability of a single feature and effectively improves tracking accuracy; (4) the proposed method involves no high-order operations or complex structures, has a small computational load, and is easy to implement in real time in hardware.

Brief Description of the Drawings

The above and other advantages of the present invention will become clearer from the following further description taken in conjunction with the accompanying drawings and specific embodiments.

Fig. 1 is the flow chart of the present invention.

Figs. 2a–2d show the tracking results of a conventional single-feature (gray-level) infrared image tracker.

Figs. 3a–3d show the tracking results of the multi-feature fusion method of the present invention.

Detailed Description

In the multi-feature fusion real-time tracking method of the invention, gray-level features and LBP texture features are used to describe the infrared image target.

The eight-neighborhood LBP texture feature LBP_{8,1} is given by:

LBP_{8,1} = Σ_{n=0}^{7} s(gn − gc) · 2^n,

s(x) = 1 if x ≥ 0, and s(x) = 0 if x < 0,

where gc is the current pixel and gn (n = 0, …, 7) are its eight surrounding neighbors.
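The LBP_{8,1} operator above can be sketched as follows. The function name, the clockwise neighbor ordering, and the toy patches are assumptions for illustration; the patent does not fix a neighbor ordering.

```python
# Hedged sketch of LBP_{8,1}: threshold the 8 neighbours g_n against the
# centre g_c with s(x) and pack the results into an 8-bit code.
def lbp_8_1(img, r, c):
    gc = img[r][c]
    # One possible clockwise ordering of the 8 neighbours, n = 0..7
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    code = 0
    for n, (dr, dc) in enumerate(offsets):
        gn = img[r + dr][c + dc]
        if gn - gc >= 0:          # s(x) = 1 when x >= 0, else 0
            code += 2 ** n
    return code

# Uniform patch: every neighbour equals the centre, so all 8 bits are set
flat = [[5, 5, 5], [5, 5, 5], [5, 5, 5]]
print(lbp_8_1(flat, 1, 1))  # 255
# Centre brighter than all neighbours: no bit is set
peak = [[1, 1, 1], [1, 9, 1], [1, 1, 1]]
print(lbp_8_1(peak, 1, 1))  # 0
```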

In the method of the invention, the Bhattacharyya coefficient is used to describe the similarity between the target model and the target candidate model.

The Bhattacharyya coefficient ρ_Bha is given by:

ρ_Bha = Σ_{u=1}^{m} √(pu · qu),

where p is the target candidate model and q is the target model.
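A minimal sketch of this similarity measure, assuming the standard Bhattacharyya definition with the square root (the square root is lost in the extracted formula image) and that both histograms are normalized to sum to 1; names are illustrative.

```python
import math

# Bhattacharyya coefficient between a candidate histogram p and a model
# histogram q: sum over bins of sqrt(p_u * q_u).
def bhattacharyya(p, q):
    return sum(math.sqrt(pu * qu) for pu, qu in zip(p, q))

# Identical distributions give the maximum similarity of 1.0
q = [0.25, 0.25, 0.25, 0.25]
print(round(bhattacharyya(q, q), 6))  # 1.0
# Disjoint distributions give 0.0
print(bhattacharyya([1.0, 0.0], [0.0, 1.0]))  # 0.0
```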

In the method of the invention, the weight coefficient α1 of the gray-level feature and the weight coefficient α2 of the LBP texture feature are updated iteratively:

α1 = (1 − λ)·α1,old + λ·α1,cur,

α2 = (1 − λ)·α2,old + λ·α2,cur,

where α1,old and α2,old are the previous-frame weight coefficients of the gray-level and LBP texture features, α1,cur and α2,cur are the current-frame weight coefficients, and λ is a proportional coefficient.

In the method of the invention, the current-frame weight coefficients α1,cur and α2,cur are given by:

α1,cur = ρ1 / (ρ1^2 + ρ2^2),

α2,cur = ρ2 / (ρ1^2 + ρ2^2),

where ρ1 is the Bhattacharyya coefficient of the gray-level feature and ρ2 is the Bhattacharyya coefficient of the LBP texture feature.
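The current-frame weights and the iterative λ-update above can be sketched as follows; the function names and sample values are illustrative, not from the patent.

```python
# alpha_{1,cur} = rho1 / (rho1^2 + rho2^2), and analogously alpha_{2,cur}
def current_weights(rho1, rho2):
    d = rho1 ** 2 + rho2 ** 2
    return rho1 / d, rho2 / d

# Iterative update: alpha = (1 - lambda)*alpha_old + lambda*alpha_cur
def update_weight(alpha_old, alpha_cur, lam):
    return (1 - lam) * alpha_old + lam * alpha_cur

a1_cur, a2_cur = current_weights(0.8, 0.6)
print(round(a1_cur, 3), round(a2_cur, 3))  # 0.8 0.6
print(round(update_weight(0.5, a1_cur, 0.5), 3))  # 0.65
```

A larger λ lets the weights follow the current frame quickly (maneuverability), while a smaller λ keeps them close to their history (stability), matching the trade-off described above.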

Example 1

As shown in Fig. 1, an example is given below to illustrate the multi-feature fusion real-time tracking method of the invention. The infrared image is 320 × 240 pixels at a frame rate of 25 Hz. The thermal imager output is transmitted over optical fiber to a dedicated DSP + FPGA image processing board; the multi-feature fusion tracking is implemented on the DSP processor and meets real-time processing requirements. The specific implementation steps are as follows:

(1) Initialize the target tracking point position y0; the initial tracking point is specified manually.

The initial target tracking point position (i, j), with i = 80 and j = 100 (shown in Fig. 2), is specified manually, and the Epanechnikov kernel bandwidth is set to h = 10.

(2) Initialize the target model: build the target gray-level model q1 from the gray-level feature, compute the target's LBP texture feature, and build the target texture model q2 from it.

Compute the LBP texture feature I_LBP of the image over the region centered on the initial tracking point (80, 100) with bandwidth h = 10:

I_LBP = Σ_{k1=75}^{85} Σ_{k2=95}^{105} LBP_{8,1}(k1, k2)

The eight-neighborhood LBP texture feature LBP_{8,1} is:

LBP_{8,1}(k1, k2) = Σ_{n=0}^{7} s(gn − gc) · 2^n,

s(x) = 1 if x ≥ 0, and s(x) = 0 if x < 0,

where gc is the current target pixel, c = k2·320 + k1, and gn (n = 0, …, 7) are the neighbors of gc.

The target gray-level model q1 is:

q1 = {qu}_{u=1…m1},  Σ_{u=1}^{m1} qu = 1,

qu = C · Σ_{i=1}^{n} k(‖(y0 − xi)/h‖^2) · δ[b1(xi) − μ],

The target LBP texture model q2 is:

q2 = {qv}_{v=1…m2},  Σ_{v=1}^{m2} qv = 1,

qv = C · Σ_{i=1}^{n′} k(‖(y0 − xi)/h‖^2) · δ[b2(xi) − v],

where qu and qv are the probability densities of each level of the target model's gray-level and LBP texture features, m1 = 255 and m2 = 255 are the numbers of quantization levels of the gray-level and LBP texture features, the function b1(·) maps the pixel at xi to its gray-level feature index, the function b2(·) maps the pixel at xi to its LBP texture feature index, δ is the Delta function, C is a normalization coefficient, μ = 1, …, 255, and v = 1, …, 255.
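The kernel-weighted histogram construction for qu above can be sketched as follows, assuming the Epanechnikov profile k(x) = 1 − x for x < 1 (and 0 otherwise) named later in the description; the function names, the bin-mapping b1 as uniform quantization, and the toy image are assumptions for illustration.

```python
# Hedged sketch of q_u = C * sum_i k(||(y0 - x_i)/h||^2) * delta[b1(x_i) - u]
def epanechnikov_profile(x):
    return 1.0 - x if x < 1.0 else 0.0

def grey_model(img, y0, h, m=8):
    """img: 2-D list of grey levels in [0, 255]; y0 = (row, col) centre;
    h: kernel bandwidth (assumed to cover at least one pixel); m: bins."""
    r0, c0 = y0
    hist = [0.0] * m
    for r in range(len(img)):
        for c in range(len(img[0])):
            d2 = ((r - r0) ** 2 + (c - c0) ** 2) / (h * h)
            w = epanechnikov_profile(d2)      # kernel weight of pixel x_i
            if w > 0.0:
                u = img[r][c] * m // 256      # b1(x_i): pixel -> bin index
                hist[u] += w
    total = sum(hist)                         # normalisation constant C
    return [v / total for v in hist]

img = [[0, 64, 128], [64, 128, 192], [128, 192, 255]]
q = grey_model(img, (1, 1), h=2.0)
print(round(sum(q), 6))  # 1.0
```

The candidate models pu and pv of step (3) are built the same way, just centered on the current tracking point.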

(3) Compute the target candidate model: from the tracking point position y0, compute the candidate gray-level model p1(y0) and the candidate texture model p2(y0).

The candidate gray-level model p1 is:

p1 = {pu}_{u=1…m1},  Σ_{u=1}^{m1} pu = 1,

pu = C · Σ_{i=1}^{n} k(‖(y0 − xi)/h‖^2) · δ[b1(xi) − μ],

The candidate LBP texture model p2 is:

p2 = {pv}_{v=1…m2},  Σ_{v=1}^{m2} pv = 1,

pv = C · Σ_{i=1}^{n′} k(‖(y0 − xi)/h‖^2) · δ[b2(xi) − v],

where pu and pv are the probability densities of each level of the candidate model's gray-level and LBP texture features, m1 = 255 and m2 = 255 are the numbers of quantization levels, b1(·) maps the pixel at xi to its gray-level feature index, b2(·) maps the pixel at xi to its LBP texture feature index, δ is the Delta function, C is a normalization coefficient, μ = 1, …, 255, and v = 1, …, 255.

(4) Compute the Bhattacharyya coefficients ρ1, ρ2 and the weight coefficients α1, α2 of the gray-level and LBP texture features, and use ρ1, α1, ρ2, α2 to compute the joint-feature Bhattacharyya coefficient ρ at position y0.

The joint-feature Bhattacharyya coefficient ρ is:

ρ = α1·ρ1 + α2·ρ2,

ρ1 = Σ_{u=1}^{m1} √(pu · qu),

ρ2 = Σ_{v=1}^{m2} √(pv · qv),

The update expressions for the gray-level weight coefficient α1 and the LBP texture weight coefficient α2 are:

α1 = (1 − λ)·α1,old + λ·α1,cur = (1 − λ)·α1,old + λ·ρ1 / (ρ1^2 + ρ2^2),

α2 = (1 − λ)·α2,old + λ·α2,cur = (1 − λ)·α2,old + λ·ρ2 / (ρ1^2 + ρ2^2),

where α1,old and α2,old are the previous-frame weight coefficients, α1,cur and α2,cur are the current-frame weight coefficients, and λ is a proportional coefficient.

(5) Compute the new target tracking position y1 in the current frame:

y1 = α1 · (Σ_{i=1}^{n′} xi·wi,1) / (Σ_{i=1}^{n′} wi,1) + α2 · (Σ_{i=1}^{n′} xi·wi,2) / (Σ_{i=1}^{n′} wi,2),

wi,1 = Σ_{u=1}^{m1} √(qu / pu(y0)) · δ[b1(xi) − u],

wi,2 = Σ_{v=1}^{m2} √(qv / pv(y0)) · δ[b2(xi) − v],

where n′ is the number of candidate target pixels, h is the kernel bandwidth, and qu, qv, pu, pv, α1, α2, xi, b1(·), b2(·) have the same meanings as in steps (2), (3), and (4).
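One position update of step (5) can be sketched as follows, assuming the standard mean-shift weight form √(q/p) (the square root is lost in the extracted formula images); names, the 1-D positions, and the toy histograms are illustrative, not from the patent.

```python
import math

# Hedged sketch of one mean-shift step fusing two features: each pixel gets
# weight w_{i,f} = sqrt(q[b_f(x_i)] / p[b_f(x_i)]), and the new position is
# the alpha-weighted sum of the two per-feature weighted centroids.
def mean_shift_step(xs, bins1, bins2, q1, p1, q2, p2, a1, a2):
    """xs: pixel positions; bins1/bins2: feature bin index of each pixel;
    q*/p*: model and candidate histograms; a1/a2: feature weights."""
    def centroid(bins, q, p):
        num = den = 0.0
        for x, b in zip(xs, bins):
            w = math.sqrt(q[b] / p[b]) if p[b] > 0 else 0.0
            num += x * w
            den += w
        return num / den
    return a1 * centroid(bins1, q1, p1) + a2 * centroid(bins2, q2, p2)

# With p == q every weight is 1, so the step returns the plain centroid
xs = [1.0, 2.0, 3.0]
q = p = [0.5, 0.5]
print(mean_shift_step(xs, [0, 1, 0], [1, 0, 1], q, p, q, p, 0.5, 0.5))  # 2.0
```

In the real method the positions are 2-D and the bin indices come from b1(·) and b2(·), but the weighting structure is the same.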

(6) Using the Bhattacharyya coefficients ρ′1, ρ′2 and the weight coefficients α′1, α′2 of the gray-level and LBP texture features, compute the joint-feature Bhattacharyya coefficient ρ′ at position y1:

ρ′ = α′1·ρ′1 + α′2·ρ′2,

ρ′1 = Σ_{u=1}^{m1} √(p′u · qu),

ρ′2 = Σ_{v=1}^{m2} √(p′v · qv),

The update expressions for the gray-level weight coefficient α′1 and the LBP texture weight coefficient α′2 are:

α′1 = (1 − λ)·α′1,old + λ·ρ′1 / (ρ′1^2 + ρ′2^2),

α′2 = (1 − λ)·α′2,old + λ·ρ′2 / (ρ′1^2 + ρ′2^2),

where α′1,old and α′2,old are the previous-frame weight coefficients and λ is a proportional coefficient.

(7) When ρ′ < ρ, update y1 as in step (7) of the method above; otherwise y1 remains unchanged.

(8) If abs(y0 − y1) < 0.01, stop; otherwise set y0 ← y1 and return to step (3).

Fig. 2 shows the result of the conventional technique using only a single feature (gray level), and Fig. 3 shows the multi-feature fusion result obtained with this embodiment; being infrared imagery, the images are necessarily gray-scale. Figs. 2a, 2b, 2c, and 2d show frames 20, 80, 140, and 200, respectively, and Figs. 3a, 3b, 3c, and 3d likewise show frames 20, 80, 140, and 200. Comparing Fig. 2 with Fig. 3 shows that tracking with a single feature makes the tracking process unstable and the accuracy poor: in Fig. 2 the tracking gate swings randomly. The multi-feature fusion tracking method effectively improves tracking accuracy: in Fig. 3 the tracking gate stays near the target centroid throughout.

The present invention provides a real-time multi-feature-fusion infrared image target tracking method; there are many ways to implement this technical solution, and the above is only a preferred embodiment. It should be noted that a person of ordinary skill in the art may make several improvements and refinements without departing from the principle of the invention, and such improvements and refinements should also be regarded as falling within the protection scope of the invention. Any component not specified in this embodiment can be implemented with existing technology.

Claims (6)

1. A real-time multi-feature-fusion infrared image target tracking method, characterized in that it comprises the following steps:

(1) Given an initial target tracking point position y<sub>0</sub>;

(2) Initialize the target model: centered on the initial tracking point y<sub>0</sub>, build the target grayscale model q<sub>1</sub> and the target LBP texture model q<sub>2</sub>;

(3) Compute the candidate target model: from the target tracking point position y<sub>0</sub>, compute the candidate target grayscale model p<sub>1</sub>(y<sub>0</sub>) and the candidate target LBP texture model p<sub>2</sub>(y<sub>0</sub>);

(4) Using the Bhattacharyya coefficient ρ<sub>1</sub> of the grayscale feature and ρ<sub>2</sub> of the LBP texture feature, together with the weight coefficients α<sub>1</sub> (grayscale) and α<sub>2</sub> (LBP texture), compute the joint-feature Bhattacharyya coefficient ρ at position y<sub>0</sub>:

$$\rho = \alpha_1\cdot\rho_1 + \alpha_2\cdot\rho_2;$$

(5) Compute the new target position y<sub>1</sub> in the current frame;

(6) Using the Bhattacharyya coefficients ρ′<sub>1</sub> and ρ′<sub>2</sub> and the weight coefficients α′<sub>1</sub> and α′<sub>2</sub> of the grayscale and LBP texture features, compute the joint-feature Bhattacharyya coefficient ρ′ at position y<sub>1</sub>:

$$\rho' = \alpha'_1\cdot\rho'_1 + \alpha'_2\cdot\rho'_2;$$

(7) When ρ′ &lt; ρ, update y<sub>1</sub> according to the formula given in Figure FDA00002272056300011; otherwise y<sub>1</sub> remains unchanged;

(8) If |y<sub>0</sub> − y<sub>1</sub>| &lt; ε, stop; otherwise assign y<sub>1</sub> to y<sub>0</sub> and go to step (3), where ε is a constant error threshold.
2. The method according to claim 1, characterized in that in step (2) the target grayscale model q<sub>1</sub> is

$$q_1 = \{q_u\}_{u=1\ldots m_1}, \qquad \sum_{u=1}^{m_1} q_u = 1,$$

and the target LBP texture model q<sub>2</sub> is

$$q_2 = \{q_v\}_{v=1\ldots m_2}, \qquad \sum_{v=1}^{m_2} q_v = 1,$$

where q<sub>u</sub> is the probability density of each grayscale level of the target grayscale model, q<sub>v</sub> is the probability density of each level of the LBP texture feature, m<sub>1</sub> is the maximum number of grayscale quantization levels, m<sub>2</sub> is the maximum number of LBP texture quantization levels, u indexes the grayscale quantization levels, and v indexes the texture quantization levels.

3. The method according to claim 1, characterized in that in step (3) the candidate target grayscale model p<sub>1</sub> is

$$p_1 = \{p_u\}_{u=1\ldots m_1}, \qquad \sum_{u=1}^{m_1} p_u = 1,$$

and the candidate target LBP texture model p<sub>2</sub> is

$$p_2 = \{p_v\}_{v=1\ldots m_2}, \qquad \sum_{v=1}^{m_2} p_v = 1,$$

where p<sub>u</sub> is the probability density of each grayscale level of the candidate model, p<sub>v</sub> is the probability density of each LBP texture level, and m<sub>1</sub>, m<sub>2</sub>, u, and v are as defined in claim 2.

4. The method according to claim 1, characterized in that the weight coefficients α<sub>1</sub>, α<sub>2</sub> of step (4) and α′<sub>1</sub>, α′<sub>2</sub> of step (6) are updated iteratively:

$$\alpha_1 = (1-\lambda)\cdot\alpha_{1,old} + \lambda\cdot\alpha_{1,cur},$$
$$\alpha_2 = (1-\lambda)\cdot\alpha_{2,old} + \lambda\cdot\alpha_{2,cur},$$
$$\alpha'_1 = (1-\lambda)\cdot\alpha'_{1,old} + \lambda\cdot\alpha'_{1,cur},$$
$$\alpha'_2 = (1-\lambda)\cdot\alpha'_{2,old} + \lambda\cdot\alpha'_{2,cur},$$

where α<sub>1,old</sub> and α<sub>2,old</sub> are the previous-frame weights of the grayscale and LBP texture features in step (4), α<sub>1,cur</sub> and α<sub>2,cur</sub> are the current-frame weights in step (4), α′<sub>1,old</sub> and α′<sub>2,old</sub> are the previous-frame weights in step (6), α′<sub>1,cur</sub> and α′<sub>2,cur</sub> are the current-frame weights in step (6), and λ is the proportional coefficient.

5. The method according to claim 4, characterized in that the current-frame weight coefficients are computed as

$$\alpha_{1,cur} = \frac{\rho_1}{\rho_1^2+\rho_2^2}, \qquad \alpha_{2,cur} = \frac{\rho_2}{\rho_1^2+\rho_2^2},$$
$$\alpha'_{1,cur} = \frac{\rho'_1}{{\rho'_1}^2+{\rho'_2}^2}, \qquad \alpha'_{2,cur} = \frac{\rho'_2}{{\rho'_1}^2+{\rho'_2}^2},$$

where ρ<sub>1</sub> and ρ<sub>2</sub> are the Bhattacharyya coefficients of the grayscale and LBP texture features in step (4), and ρ′<sub>1</sub> and ρ′<sub>2</sub> are those in step (6).
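The grayscale and LBP histogram models of claims 2 and 3 can be sketched as plain normalized histograms (a minimal sketch: the patent may additionally apply kernel weighting around the tracking point, and all function names here are mine):

```python
import numpy as np

def gray_model(patch, m1=32):
    """Normalized grayscale histogram q1 = {q_u}, with sum(q_u) = 1."""
    levels = (patch.astype(np.uint32) * m1) // 256   # quantize 8-bit gray to m1 levels
    hist = np.bincount(levels.ravel(), minlength=m1).astype(float)
    return hist / hist.sum()

def lbp_model(patch, m2=256):
    """Normalized histogram q2 = {q_v} of basic 8-neighbour LBP codes."""
    c = patch[1:-1, 1:-1]                            # interior pixels
    codes = np.zeros_like(c, dtype=np.uint32)
    shifts = [(-1,-1), (-1,0), (-1,1), (0,1), (1,1), (1,0), (1,-1), (0,-1)]
    for bit, (dy, dx) in enumerate(shifts):
        nb = patch[1+dy:patch.shape[0]-1+dy, 1+dx:patch.shape[1]-1+dx]
        codes |= ((nb >= c).astype(np.uint32) << bit)  # set bit if neighbour >= centre
    hist = np.bincount(codes.ravel(), minlength=m2).astype(float)
    return hist / hist.sum()
```

Both histograms sum to 1, matching the normalization constraints in claims 2 and 3.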
6. The method according to claim 5, characterized in that in step (6) the Bhattacharyya coefficients ρ′<sub>1</sub> and ρ′<sub>2</sub> and the weight coefficients α′<sub>1</sub> and α′<sub>2</sub> are obtained from

$$\rho'_1 = \sum_{u=1}^{m_1}\sqrt{p'_u\cdot q_u}, \qquad \rho'_2 = \sum_{v=1}^{m_2}\sqrt{p'_v\cdot q_v},$$

where p′<sub>u</sub> is the probability density of each grayscale level of the target grayscale model at position y<sub>1</sub> and p′<sub>v</sub> is that of the LBP texture feature at y<sub>1</sub>; and

$$\alpha'_1 = (1-\lambda)\cdot\alpha'_{1,old} + \lambda\cdot\frac{\rho'_1}{{\rho'_1}^2+{\rho'_2}^2}, \qquad \alpha'_2 = (1-\lambda)\cdot\alpha'_{2,old} + \lambda\cdot\frac{\rho'_2}{{\rho'_1}^2+{\rho'_2}^2},$$

where α′<sub>1,old</sub> is the previous-frame weight of the grayscale feature, α′<sub>2,old</sub> is that of the LBP texture feature, and λ is the proportional coefficient.
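A hedged sketch of the Bhattacharyya coefficient and the joint-feature coefficient used throughout the claims (function names are mine; both histogram arguments are assumed normalized):

```python
import numpy as np

def bhattacharyya(p, q):
    """rho = sum_u sqrt(p_u * q_u) for two normalized histograms."""
    return float(np.sum(np.sqrt(np.asarray(p) * np.asarray(q))))

def joint_coefficient(p1, q1, p2, q2, a1, a2):
    """Joint-feature coefficient rho = a1*rho1 + a2*rho2, per claim 1:
    a weighted sum of the per-feature similarities."""
    return a1 * bhattacharyya(p1, q1) + a2 * bhattacharyya(p2, q2)
```

The coefficient is 1 for identical histograms and 0 for histograms with disjoint support, so a larger joint coefficient means a better candidate position.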
CN201210397686.3A 2012-10-18 2012-10-18 Real-time tracking method for infrared image target with multi-feature fusion Expired - Fee Related CN102930558B (en)


Publications (2)

Publication Number Publication Date
CN102930558A (en) 2013-02-13
CN102930558B CN102930558B (en) 2015-04-01





Legal Events

Publication (C06/PB01); entry into substantive examination (C10/SE01); patent grant (C14/GR01), granted publication date 2015-04-01; termination of patent right due to non-payment of annual fee (CF01).