
CN102930558A - Real-time tracking method for infrared image target with multi-feature fusion - Google Patents


Info

Publication number
CN102930558A
Authority
CN
China
Prior art keywords
target
textural characteristics
weight coefficient
Prior art date
Legal status
Granted
Application number
CN2012103976863A
Other languages
Chinese (zh)
Other versions
CN102930558B (en)
Inventor
白俊奇
赵春光
王寿峰
翟尚礼
汪洋
Current Assignee
CETC 28 Research Institute
Original Assignee
CETC 28 Research Institute
Priority date
Filing date
Publication date
Application filed by CETC 28 Research Institute filed Critical CETC 28 Research Institute
Priority to CN201210397686.3A priority Critical patent/CN102930558B/en
Publication of CN102930558A publication Critical patent/CN102930558A/en
Application granted granted Critical
Publication of CN102930558B publication Critical patent/CN102930558B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Image Analysis (AREA)

Abstract



The invention discloses a real-time multi-feature-fusion tracking method for infrared image targets, comprising the following steps: initializing the target tracking point position; initializing the target model; computing the target candidate model; computing the joint-feature Bhattacharyya coefficient and the inter-feature weight coefficients; computing the new target tracking position in the current frame; estimating the joint-feature Bhattacharyya coefficient at the new position; comparing the two joint-feature Bhattacharyya coefficients and outputting the result. The invention adaptively computes the weight coefficients among multiple features, which strengthens tracking robustness, ensures tracking stability, solves the tracking-point drift caused by the instability of any single feature, and effectively improves tracking accuracy.


Description

Real-time tracking method for infrared image targets with multi-feature fusion
Technical field
The present invention relates to a multi-feature-fusion method for tracking targets in infrared images, and in particular to an infrared image target tracking method suited to real-time hardware implementation.
Background technology
In recent years, with the development of integrated circuit technology and infrared materials, infrared imaging technology has made great progress and is widely applied in national defense and the national economy. However, compared with visible-light images, infrared images have a relatively low signal-to-noise ratio and therefore provide only limited information for target detection and tracking. Because target features are not salient in infrared images and background clutter is strong, accurate tracking of infrared image targets is difficult.
At present, target tracking algorithms fall into two broad classes: model-based tracking and appearance-based tracking. Compared with model-based methods, appearance-based tracking avoids the complex process of building a model and has wider engineering practicality. Among appearance-based methods, the mean-shift tracking algorithm is widely used in target tracking because it is simple, robust, and fast. Mean shift is a non-parametric density estimation method that iteratively searches for the mode most similar to the sample distribution. Comaniciu et al. proposed a mean-shift tracking algorithm that seeks the maximum similarity between the target color histogram and the candidate color histogram. Chu et al. used a Kalman filter to predict the initial iteration position of mean shift, but when the target is severely occluded, the position found by the mean-shift search is inaccurate and exhibits a certain bias. Collins et al. proposed an adaptive tracking method that selects easily discriminable color features, where the candidate feature set comprises 49 features computed from linear combinations of pixel R, G, and B values.
Because the candidate set is large, the computational cost of feature selection is also high. Existing target tracking algorithms therefore have the following shortcomings: (1) classical tracking algorithms describe the target with a single feature and have poor anti-interference capability; (2) most existing multi-feature tracking algorithms use only the current frame to compute the inter-feature weight coefficients, so robustness is poor when the target undergoes complex changes; (3) most existing tracking algorithms lose accuracy, or even lose the target, under non-rigid deformation, partial occlusion, and overlap; (4) most existing tracking algorithms greatly increase algorithmic complexity while improving accuracy, which makes real-time hardware implementation difficult.
Summary of the invention
Object of the invention: the technical problem to be solved by the present invention is, in view of the deficiencies of the prior art, to provide a real-time multi-feature-fusion tracking method for infrared image targets.
To solve the above technical problem, the invention discloses a real-time multi-feature-fusion tracking method for infrared image targets, comprising the following steps:
(1) Initialize the tracking point position $y_0$; the initial tracking point is specified manually;
(2) Initialize the target model: centered on the initial tracking point $y_0$, build the target gray-level model $q_1$ and the target LBP (local binary pattern) texture model $q_2$;
(3) Compute the candidate target model: from the target tracking point position $y_0$, compute the candidate gray-level model $p_1(y_0)$ and the candidate LBP texture model $p_2(y_0)$;
(4) Using the gray-feature Bhattacharyya coefficient $\rho_1$ (for the Bhattacharyya coefficient, see Visual C++ Digital Image Processing, p. 466, Xie Fengying, 1st ed., 2008, Electronic Industry Press) and the LBP texture-feature Bhattacharyya coefficient $\rho_2$, together with the gray-feature weight coefficient $\alpha_1$ and the LBP texture-feature weight coefficient $\alpha_2$, compute the joint-feature Bhattacharyya coefficient $\rho$ at position $y_0$:
$\rho = \alpha_1\cdot\rho_1 + \alpha_2\cdot\rho_2$;
(5) Compute the new target position $y_1$ in the current frame;
(6) Using the gray-feature Bhattacharyya coefficient $\rho'_1$ and the LBP texture-feature Bhattacharyya coefficient $\rho'_2$, together with the weight coefficients $\alpha'_1$ and $\alpha'_2$, compute the joint-feature Bhattacharyya coefficient $\rho'$ at position $y_1$:
$\rho' = \alpha'_1\cdot\rho'_1 + \alpha'_2\cdot\rho'_2$;
(7) When $\rho' < \rho$, update $y_1 \leftarrow (y_0 + y_1)/2$; otherwise $y_1$ remains unchanged;
(8) If $|y_0 - y_1| < \varepsilon$, stop the computation; otherwise assign $y_1$ to $y_0$ and return to step (3), where $\varepsilon$ is an error constant.
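To make the control flow of steps (4), (7), and (8) concrete, the following minimal Python sketch implements the joint-coefficient fusion, the step-halving test, and the convergence test on scalar positions; the function names are illustrative and not part of the patent.

```python
def joint_coefficient(rho1, rho2, a1, a2):
    # Step (4)/(6): joint-feature Bhattacharyya coefficient
    # rho = alpha1*rho1 + alpha2*rho2.
    return a1 * rho1 + a2 * rho2

def refine_position(y0, y1, rho, rho_new):
    # Step (7): if the joint similarity decreased at the new position,
    # move y1 halfway back toward y0; otherwise keep y1.
    return (y0 + y1) / 2.0 if rho_new < rho else y1

def converged(y0, y1, eps=0.01):
    # Step (8): stop once the displacement falls below the error constant.
    return abs(y0 - y1) < eps
```

For example, if the joint coefficient falls from 0.9 to 0.8 when moving from 0.0 to 1.0, `refine_position(0.0, 1.0, 0.9, 0.8)` pulls the candidate back to 0.5.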
In step (2), the target gray-level model $q_1$ is:
$q_1 = \{q_u\}_{u=1\ldots m_1}, \quad \sum_{u=1}^{m_1} q_u = 1,$
and the target LBP texture model $q_2$ is:
$q_2 = \{q_v\}_{v=1\ldots m_2}, \quad \sum_{v=1}^{m_2} q_v = 1,$
where $q_u$ are the bin probabilities of the gray feature of the target gray-level model, $q_v$ are the bin probabilities of the LBP texture feature, $m_1$ is the number of quantization levels of the gray feature, $m_2$ is the number of quantization levels of the LBP texture feature, $u$ indexes the gray quantization levels, and $v$ indexes the texture quantization levels.
In step (3), the candidate gray-level model $p_1$ is:
$p_1 = \{p_u\}_{u=1\ldots m_1}, \quad \sum_{u=1}^{m_1} p_u = 1,$
and the candidate LBP texture model $p_2$ is:
$p_2 = \{p_v\}_{v=1\ldots m_2}, \quad \sum_{v=1}^{m_2} p_v = 1,$
where $p_u$ are the bin probabilities of the candidate gray feature, $p_v$ are the bin probabilities of the candidate LBP texture feature, and $m_1$, $m_2$, $u$, and $v$ are as defined above.
The gray-feature weight coefficient $\alpha_1$ and LBP texture-feature weight coefficient $\alpha_2$ of step (4), and the corresponding coefficients $\alpha'_1$ and $\alpha'_2$ of step (6), are updated iteratively:
$\alpha_1 = (1-\lambda)\cdot\alpha_{1,old} + \lambda\cdot\alpha_{1,cur},$
$\alpha_2 = (1-\lambda)\cdot\alpha_{2,old} + \lambda\cdot\alpha_{2,cur},$
$\alpha'_1 = (1-\lambda)\cdot\alpha'_{1,old} + \lambda\cdot\alpha'_{1,cur},$
$\alpha'_2 = (1-\lambda)\cdot\alpha'_{2,old} + \lambda\cdot\alpha'_{2,cur},$
where $\alpha_{1,old}$ and $\alpha_{2,old}$ are the previous-frame weight coefficients of the gray and LBP texture features in step (4), $\alpha_{1,cur}$ and $\alpha_{2,cur}$ are the current-frame weight coefficients in step (4), $\alpha'_{1,old}$ and $\alpha'_{2,old}$ are the previous-frame weight coefficients in step (6), $\alpha'_{1,cur}$ and $\alpha'_{2,cur}$ are the current-frame weight coefficients in step (6), and $\lambda$ is a proportionality coefficient, $0 \le \lambda \le 1$, that determines the convergence speed of the weight coefficients: the larger $\lambda$, the faster the convergence and the more agile the tracking; the smaller $\lambda$, the slower the convergence and the more stable the tracking.
The current-frame weight coefficients $\alpha_{1,cur}$ and $\alpha_{2,cur}$ of step (4), and $\alpha'_{1,cur}$ and $\alpha'_{2,cur}$ of step (6), are computed as:
$\alpha_{1,cur} = \dfrac{\rho_1}{\sqrt{\rho_1^2 + \rho_2^2}},$
$\alpha_{2,cur} = \dfrac{\rho_2}{\sqrt{\rho_1^2 + \rho_2^2}},$
$\alpha'_{1,cur} = \dfrac{\rho'_1}{\sqrt{(\rho'_1)^2 + (\rho'_2)^2}},$
$\alpha'_{2,cur} = \dfrac{\rho'_2}{\sqrt{(\rho'_1)^2 + (\rho'_2)^2}},$
where $\rho_1$ and $\rho_2$ are the Bhattacharyya coefficients of the gray and LBP texture features in step (4), and $\rho'_1$ and $\rho'_2$ are those of step (6).
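A short Python sketch of the two-stage weight computation above: current-frame weights from the Bhattacharyya coefficients, then the iterative blend with the previous frame. Here `lam` stands for the proportionality coefficient $\lambda$; the function names are illustrative.

```python
import math

def current_weights(rho1, rho2):
    # alpha_{1,cur} = rho1 / sqrt(rho1^2 + rho2^2), likewise alpha_{2,cur};
    # this normalizes so that alpha1^2 + alpha2^2 = 1.
    norm = math.sqrt(rho1 * rho1 + rho2 * rho2)
    return rho1 / norm, rho2 / norm

def update_weights(a1_old, a2_old, rho1, rho2, lam):
    # alpha = (1 - lambda)*alpha_old + lambda*alpha_cur; a larger lambda
    # converges faster (more agile tracking), a smaller lambda is more stable.
    a1_cur, a2_cur = current_weights(rho1, rho2)
    return ((1 - lam) * a1_old + lam * a1_cur,
            (1 - lam) * a2_old + lam * a2_cur)
```

With `lam = 1.0` the blend degenerates to the current-frame weights, which corresponds to the single-frame weighting the background section criticizes.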
The gray-feature Bhattacharyya coefficient $\rho'_1$ and LBP texture-feature Bhattacharyya coefficient $\rho'_2$ of step (6), and the weight coefficients $\alpha'_1$ and $\alpha'_2$, are obtained from:
$\rho'_1 = \sum_{u=1}^{m_1} \sqrt{p'_u \cdot q_u},$
$\rho'_2 = \sum_{v=1}^{m_2} \sqrt{p'_v \cdot q_v},$
where $p'_u$ are the bin probabilities of the target gray feature at position $y_1$ and $p'_v$ are the bin probabilities of the LBP texture feature at position $y_1$;
$\alpha'_1 = (1-\lambda)\cdot\alpha'_{1,old} + \lambda\cdot\dfrac{\rho'_1}{\sqrt{(\rho'_1)^2 + (\rho'_2)^2}},$
$\alpha'_2 = (1-\lambda)\cdot\alpha'_{2,old} + \lambda\cdot\dfrac{\rho'_2}{\sqrt{(\rho'_1)^2 + (\rho'_2)^2}},$
where $\alpha'_{1,old}$ is the previous-frame gray-feature weight coefficient, $\alpha'_{2,old}$ is the previous-frame LBP texture-feature weight coefficient, and $\lambda$ is a proportionality coefficient, $0 \le \lambda \le 1$, which determines the convergence speed of the weight coefficients: the larger $\lambda$, the faster the convergence and the more agile the tracking; the smaller $\lambda$, the slower the convergence and the more stable the tracking.
In the real-time multi-feature-fusion tracking method for infrared image targets of the present invention, the Epanechnikov kernel function is used to compute the gray-feature and LBP texture-feature probability histograms.
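As an illustration of such kernel-weighted histograms, the sketch below builds a normalized histogram with the Epanechnikov profile $k(x) = 1 - x$ on $[0, 1]$; the bin count, value range, and pixel layout are illustrative assumptions, not values fixed by the patent.

```python
def epanechnikov_profile(x):
    # Profile k(x) of the Epanechnikov kernel (up to a constant factor):
    # k(x) = 1 - x for 0 <= x <= 1, and 0 outside the support.
    return 1.0 - x if 0.0 <= x <= 1.0 else 0.0

def kernel_histogram(pixels, center, h, nbins=16, vmax=256):
    # q_u = C * sum_i k(||(y0 - x_i)/h||^2) * delta[b(x_i) - u]
    # pixels: list of ((row, col), value); center: (row, col); h: bandwidth.
    hist = [0.0] * nbins
    for (r, c), val in pixels:
        d2 = ((r - center[0]) ** 2 + (c - center[1]) ** 2) / (h * h)
        w = epanechnikov_profile(d2)
        if w > 0.0:
            hist[val * nbins // vmax] += w  # b(x_i): uniform quantization
    total = sum(hist)                       # normalization constant C
    return [v / total for v in hist] if total else hist
```

Pixels near the kernel center contribute with weight close to 1; pixels farther than one bandwidth contribute nothing, which makes the model less sensitive to background pixels at the window edge.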
Compared with the prior art, the present invention has the following notable advantages: (1) the weight coefficients among multiple features are computed adaptively according to feature saliency and the similarity between target and background, which strengthens tracking robustness; (2) the inter-feature weight coefficients are updated iteratively, which ensures tracking stability; (3) tracking infrared image targets with multi-feature fusion solves the tracking-point drift caused by the instability of a single feature and effectively improves tracking accuracy; (4) the proposed multi-feature-fusion tracking method involves no high-order exponentiation or complex structures, has a small computational load, and is easy to implement in real-time hardware.
Description of drawings
The present invention is further described below in conjunction with the drawings and specific embodiments; the above and other advantages of the invention will become more apparent.
Fig. 1 is the flow chart of the present invention.
Figs. 2a-2d show infrared image tracking results with a traditional single feature (gray level).
Figs. 3a-3d show infrared image tracking results with the multi-feature fusion of the present invention.
Embodiment
In the real-time multi-feature-fusion tracking method for infrared image targets of the present invention, the gray feature and the LBP texture feature are used together to describe the infrared image target.
The eight-neighborhood LBP texture feature $LBP_{8,1}$ is expressed as:
$LBP_{8,1} = \sum_{n=0}^{7} s(g_n - g_c)\cdot 2^n,$
$s(x) = \begin{cases} 1, & x \ge 0 \\ 0, & x < 0 \end{cases}$
where $g_c$ is the current pixel and $g_n$, $n = 0\ldots7$, are its surrounding neighbor pixels.
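A minimal Python sketch of $LBP_{8,1}$ for one pixel follows. The clockwise neighbor ordering, and hence which neighbor maps to which bit, is an assumption of this sketch; the patent does not fix the bit order.

```python
def lbp_8_1(img, r, c):
    # Eight-neighborhood LBP_{8,1}: threshold each neighbor g_n against the
    # center pixel g_c and pack s(g_n - g_c) into an 8-bit code.
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]  # assumed clockwise order
    gc = img[r][c]
    code = 0
    for n, (dr, dc) in enumerate(offsets):
        if img[r + dr][c + dc] >= gc:  # s(x) = 1 when x >= 0
            code |= 1 << n
    return code
```

On a flat patch every neighbor satisfies $g_n \ge g_c$, so the code is 255; a center pixel strictly brighter than all neighbors yields 0.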
In the real-time multi-feature-fusion tracking method for infrared image targets of the present invention, the Bhattacharyya coefficient describes the similarity between the target model and the target candidate model. The Bhattacharyya coefficient $\rho_{Bha}$ is expressed as:
$\rho_{Bha} = \sum_{u=1}^{m} \sqrt{p\cdot q},$
where $p$ is the target candidate model and $q$ is the target model.
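The coefficient is straightforward to compute from two normalized histograms; a minimal sketch:

```python
import math

def bhattacharyya(p, q):
    # rho_Bha = sum_u sqrt(p_u * q_u): 1.0 for identical normalized
    # distributions, 0.0 for distributions with disjoint support.
    return sum(math.sqrt(pu * qu) for pu, qu in zip(p, q))
```

Because both histograms sum to 1, the Cauchy-Schwarz inequality bounds the coefficient in [0, 1], which is why it can be compared directly across features and frames.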
In the real-time multi-feature-fusion tracking method for infrared image targets of the present invention, the gray-feature weight coefficient $\alpha_1$ and the LBP texture-feature weight coefficient $\alpha_2$ are updated iteratively:
$\alpha_1 = (1-\lambda)\cdot\alpha_{1,old} + \lambda\cdot\alpha_{1,cur},$
$\alpha_2 = (1-\lambda)\cdot\alpha_{2,old} + \lambda\cdot\alpha_{2,cur},$
where $\alpha_{1,old}$ and $\alpha_{2,old}$ are the previous-frame weight coefficients of the gray and LBP texture features, $\alpha_{1,cur}$ and $\alpha_{2,cur}$ are the current-frame weight coefficients, and $\lambda$ is a proportionality coefficient.
The current-frame weight coefficients $\alpha_{1,cur}$ and $\alpha_{2,cur}$ are given by:
$\alpha_{1,cur} = \dfrac{\rho_1}{\sqrt{\rho_1^2 + \rho_2^2}},$
$\alpha_{2,cur} = \dfrac{\rho_2}{\sqrt{\rho_1^2 + \rho_2^2}},$
where $\rho_1$ is the Bhattacharyya coefficient of the gray feature and $\rho_2$ is the Bhattacharyya coefficient of the LBP texture feature.
Embodiment 1
As shown in Fig. 1, the real-time multi-feature-fusion tracking method for infrared image targets of the present invention is illustrated by the following example. The infrared image is 320 x 240 pixels at a frame rate of 25 Hz. The thermal imager output is transmitted over optical fiber to a dedicated DSP+FPGA image-processing board; the multi-feature-fusion tracking runs on the DSP processor and meets the real-time processing requirement. The implementation steps are as follows:
(1) Initialize the tracking point position $y_0$; the initial tracking point is specified manually.
The initial target tracking point position $(i, j)$ is specified manually as $i = 80$, $j = 100$ (see Fig. 2), and the Epanechnikov kernel bandwidth is set to $h = 10$.
(2) Initialize the target model: build the target gray-level model $q_1$ from the gray feature, and compute the LBP texture feature of the target to build the target texture model $q_2$.
Compute the LBP texture image $I_{LBP}$ over the range centered on the initial tracking point position (80, 100) with bandwidth $h = 10$:
$I_{LBP} = \sum_{k_1=75}^{85} \sum_{k_2=95}^{105} LBP_{8,1}(k_1, k_2),$
where the eight-neighborhood LBP texture feature $LBP_{8,1}$ is:
$LBP_{8,1}(k_1, k_2) = \sum_{n=0}^{7} s(g_n - g_c)\cdot 2^n,$
$s(x) = \begin{cases} 1, & x \ge 0 \\ 0, & x < 0 \end{cases}$
where $g_c$ is the current target pixel, $c = k_2\cdot 320 + k_1$, and $g_n$, $n = 0\ldots7$, are the neighbor pixels around $g_c$.
The target gray-level model $q_1$ is:
$q_1 = \{q_u\}_{u=1\ldots m_1}, \quad \sum_{u=1}^{m_1} q_u = 1,$
$q_u = C \sum_{i=1}^{n} k\!\left[\left\|\tfrac{y_0 - x_i}{h}\right\|^2\right]\cdot\delta[b_1(x_i) - u],$
and the target LBP texture model $q_2$ is:
$q_2 = \{q_v\}_{v=1\ldots m_2}, \quad \sum_{v=1}^{m_2} q_v = 1,$
$q_v = C \sum_{i=1}^{n'} k\!\left[\left\|\tfrac{y_0 - x_i}{h}\right\|^2\right]\cdot\delta[b_2(x_i) - v],$
where $q_u$ and $q_v$ are the bin probabilities of the gray and LBP texture features of the target model, $m_1 = 255$ and $m_2 = 255$ are the quantization levels of the gray and LBP texture features, the function $b_1(\cdot)$ maps the pixel at $x_i$ to its gray-feature bin index, $b_2(\cdot)$ maps the pixel at $x_i$ to its LBP texture-feature bin index, $\delta$ is the Kronecker delta function, $C$ is a normalization coefficient, $u = 1\ldots255$, and $v = 1\ldots255$.
(3) Compute the target candidate model. From the tracking point position $y_0$, compute the candidate gray-level model $p_1(y_0)$ and the candidate texture model $p_2(y_0)$.
The candidate gray-level model $p_1$ is:
$p_1 = \{p_u\}_{u=1\ldots m_1}, \quad \sum_{u=1}^{m_1} p_u = 1,$
$p_u = C \sum_{i=1}^{n} k\!\left[\left\|\tfrac{y_0 - x_i}{h}\right\|^2\right]\cdot\delta[b_1(x_i) - u],$
and the candidate LBP texture model $p_2$ is:
$p_2 = \{p_v\}_{v=1\ldots m_2}, \quad \sum_{v=1}^{m_2} p_v = 1,$
$p_v = C \sum_{i=1}^{n'} k\!\left[\left\|\tfrac{y_0 - x_i}{h}\right\|^2\right]\cdot\delta[b_2(x_i) - v],$
where $p_u$ and $p_v$ are the bin probabilities of the gray and LBP texture features of the candidate model, and $m_1 = 255$, $m_2 = 255$, $b_1(\cdot)$, $b_2(\cdot)$, $\delta$, $C$, $u$, and $v$ are as defined in step (2).
(4) Compute the Bhattacharyya coefficients $\rho_1$, $\rho_2$ and the weight coefficients $\alpha_1$, $\alpha_2$ of the gray and LBP texture features, and use $\rho_1$, $\alpha_1$, $\rho_2$, $\alpha_2$ to compute the joint-feature Bhattacharyya coefficient $\rho$ at position $y_0$:
$\rho = \alpha_1\cdot\rho_1 + \alpha_2\cdot\rho_2,$
$\rho_1 = \sum_{u=1}^{m_1} \sqrt{p_u\cdot q_u},$
$\rho_2 = \sum_{v=1}^{m_2} \sqrt{p_v\cdot q_v}.$
The update expressions for the gray-feature weight coefficient $\alpha_1$ and the LBP texture-feature weight coefficient $\alpha_2$ are:
$\alpha_1 = (1-\lambda)\cdot\alpha_{1,old} + \lambda\cdot\alpha_{1,cur} = (1-\lambda)\cdot\alpha_{1,old} + \lambda\cdot\dfrac{\rho_1}{\sqrt{\rho_1^2+\rho_2^2}},$
$\alpha_2 = (1-\lambda)\cdot\alpha_{2,old} + \lambda\cdot\alpha_{2,cur} = (1-\lambda)\cdot\alpha_{2,old} + \lambda\cdot\dfrac{\rho_2}{\sqrt{\rho_1^2+\rho_2^2}},$
where $\alpha_{1,old}$ and $\alpha_{2,old}$ are the previous-frame weight coefficients, $\alpha_{1,cur}$ and $\alpha_{2,cur}$ are the current-frame weight coefficients, and $\lambda$ is a proportionality coefficient.
(5) Compute the new tracking position $y_1$ in the current frame:
$y_1 = \alpha_1\cdot\dfrac{\sum_{i=1}^{n'} x_i\cdot w_{i,1}}{\sum_{i=1}^{n'} w_{i,1}} + \alpha_2\cdot\dfrac{\sum_{i=1}^{n'} x_i\cdot w_{i,2}}{\sum_{i=1}^{n'} w_{i,2}},$
$w_{i,1} = \sum_{u=1}^{m_1} \sqrt{\dfrac{q_u}{p_u(y_0)}}\cdot\delta[b_1(x_i) - u],$
$w_{i,2} = \sum_{v=1}^{m_2} \sqrt{\dfrac{q_v}{p_v(y_0)}}\cdot\delta[b_2(x_i) - v],$
where $n'$ is the number of candidate target pixels, $h$ is the kernel bandwidth, and $q_u$, $q_v$, $p_u$, $p_v$, $\alpha_1$, $\alpha_2$, $x_i$, $b_1(\cdot)$, $b_2(\cdot)$ have the same meanings as in steps (2), (3), and (4).
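A Python sketch of the fused position update in step (5). Because of the $\delta$-sum, the weight of pixel $x_i$ reduces to $\sqrt{q_u/p_u(y_0)}$ at that pixel's own bin $u = b(x_i)$, which is what the code exploits; the pixel coordinates and bin assignments below are illustrative.

```python
import math

def new_position(pixels, bins1, bins2, q1, p1, q2, p2, a1, a2):
    # y1 = a1*sum(x_i*w_{i,1})/sum(w_{i,1}) + a2*sum(x_i*w_{i,2})/sum(w_{i,2})
    # pixels: list of (row, col); bins1/bins2: per-pixel gray and LBP bins.
    def shift(bins, q, p):
        # w_i = sqrt(q_u / p_u(y0)) at the pixel's own bin u = b(x_i).
        ws = [math.sqrt(q[b] / p[b]) if p[b] > 0 else 0.0 for b in bins]
        tw = sum(ws)
        r = sum(w * px[0] for w, px in zip(ws, pixels)) / tw
        c = sum(w * px[1] for w, px in zip(ws, pixels)) / tw
        return r, c
    r1, c1 = shift(bins1, q1, p1)  # gray-feature mean-shift centroid
    r2, c2 = shift(bins2, q2, p2)  # LBP texture-feature centroid
    return (a1 * r1 + a2 * r2, a1 * c1 + a2 * c2)
```

Each feature votes for a centroid weighted by how under-represented its bins are in the candidate model, and the two centroids are fused with the adaptive weights $\alpha_1$, $\alpha_2$.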
(6) Using the Bhattacharyya coefficients $\rho'_1$, $\rho'_2$ and weight coefficients $\alpha'_1$, $\alpha'_2$ of the gray and LBP texture features, compute the joint-feature Bhattacharyya coefficient $\rho'$ at position $y_1$:
$\rho' = \alpha'_1\cdot\rho'_1 + \alpha'_2\cdot\rho'_2,$
$\rho'_1 = \sum_{u=1}^{m_1} \sqrt{p'_u\cdot q_u},$
$\rho'_2 = \sum_{v=1}^{m_2} \sqrt{p'_v\cdot q_v}.$
The update expressions for the gray-feature weight coefficient $\alpha'_1$ and the LBP texture-feature weight coefficient $\alpha'_2$ are:
$\alpha'_1 = (1-\lambda)\cdot\alpha'_{1,old} + \lambda\cdot\dfrac{\rho'_1}{\sqrt{(\rho'_1)^2 + (\rho'_2)^2}},$
$\alpha'_2 = (1-\lambda)\cdot\alpha'_{2,old} + \lambda\cdot\dfrac{\rho'_2}{\sqrt{(\rho'_1)^2 + (\rho'_2)^2}},$
where $\alpha'_{1,old}$ and $\alpha'_{2,old}$ are the previous-frame weight coefficients and $\lambda$ is a proportionality coefficient.
(7) When $\rho' < \rho$, update $y_1 \leftarrow (y_0 + y_1)/2$; otherwise $y_1$ remains unchanged.
(8) If $|y_0 - y_1| < 0.01$, stop; otherwise set $y_0 \leftarrow y_1$ and go to step (3).
Fig. 2 shows the result of the conventional technique and Fig. 3 the result obtained with this embodiment: tracking the infrared image target with only a single feature (the gray feature) versus with multi-feature fusion; because the images are infrared, they are necessarily grayscale. Figs. 2a, 2b, 2c, and 2d show frames 20, 80, 140, and 200, respectively, as do Figs. 3a, 3b, 3c, and 3d. Comparing Fig. 2 and Fig. 3 shows that tracking with a single feature makes the tracking process unstable and the accuracy poor; in Fig. 2 the tracking gate swings at random. Tracking with multi-feature fusion effectively improves accuracy; in Fig. 3 the tracking gate stays near the target centroid throughout.
The invention provides a real-time multi-feature-fusion tracking method for infrared image targets. There are many specific ways and approaches to implement this technical solution; the above is only a preferred embodiment of the invention. It should be noted that those skilled in the art can make improvements and refinements without departing from the principle of the invention, and such improvements and refinements shall also be regarded as within the protection scope of the invention. Any component not explicitly specified in this embodiment can be realized with the prior art.

Claims (6)

1. A real-time multi-feature-fusion tracking method for infrared image targets, characterized by comprising the following steps:
(1) setting the initial tracking point position $y_0$ of the target;
(2) initializing the target model: centered on the initial tracking point $y_0$, building the target gray-level model $q_1$ and the target LBP texture model $q_2$;
(3) computing the candidate target model: from the target tracking point position $y_0$, computing the candidate gray-level model $p_1(y_0)$ and the candidate LBP texture model $p_2(y_0)$;
(4) using the gray-feature Bhattacharyya coefficient $\rho_1$ and the LBP texture-feature Bhattacharyya coefficient $\rho_2$, together with the gray-feature weight coefficient $\alpha_1$ and the LBP texture-feature weight coefficient $\alpha_2$, computing the joint-feature Bhattacharyya coefficient $\rho$ at position $y_0$:
$\rho = \alpha_1\cdot\rho_1 + \alpha_2\cdot\rho_2$;
(5) computing the new target position $y_1$ in the current frame;
(6) using the gray-feature Bhattacharyya coefficient $\rho'_1$ and the LBP texture-feature Bhattacharyya coefficient $\rho'_2$, together with the weight coefficients $\alpha'_1$ and $\alpha'_2$, computing the joint-feature Bhattacharyya coefficient $\rho'$ at position $y_1$:
$\rho' = \alpha'_1\cdot\rho'_1 + \alpha'_2\cdot\rho'_2$;
(7) when $\rho' < \rho$, updating $y_1 \leftarrow (y_0 + y_1)/2$; otherwise $y_1$ remains unchanged;
(8) if $|y_0 - y_1| < \varepsilon$, stopping the computation; otherwise assigning $y_1$ to $y_0$ and executing step (3), where $\varepsilon$ is an error constant.
2. The real-time multi-feature-fusion tracking method for infrared image targets according to claim 1, characterized in that in step (2) the target gray-level model $q_1$ is:
$q_1 = \{q_u\}_{u=1\ldots m_1}, \quad \sum_{u=1}^{m_1} q_u = 1,$
and the target LBP texture model $q_2$ is:
$q_2 = \{q_v\}_{v=1\ldots m_2}, \quad \sum_{v=1}^{m_2} q_v = 1,$
where $q_u$ are the bin probabilities of the gray feature of the target gray-level model, $q_v$ are the bin probabilities of the LBP texture feature, $m_1$ is the number of quantization levels of the gray feature, $m_2$ is the number of quantization levels of the LBP texture feature, $u$ indexes the gray quantization levels, and $v$ indexes the texture quantization levels.
3. The real-time multi-feature-fusion tracking method for infrared image targets according to claim 1, characterized in that in step (3) the candidate gray-level model $p_1$ is:
$p_1 = \{p_u\}_{u=1\ldots m_1}, \quad \sum_{u=1}^{m_1} p_u = 1,$
and the candidate LBP texture model $p_2$ is:
$p_2 = \{p_v\}_{v=1\ldots m_2}, \quad \sum_{v=1}^{m_2} p_v = 1,$
where $p_u$ are the bin probabilities of the candidate gray feature, $p_v$ are the bin probabilities of the candidate LBP texture feature, $m_1$ and $m_2$ are the numbers of quantization levels of the gray and LBP texture features, $u$ indexes the gray quantization levels, and $v$ indexes the texture quantization levels.
4. The real-time multi-feature-fusion tracking method for infrared image targets according to claim 1, characterized in that the gray-feature weight coefficient $\alpha_1$ and LBP texture-feature weight coefficient $\alpha_2$ of step (4), and the gray-feature weight coefficient $\alpha'_1$ and LBP texture-feature weight coefficient $\alpha'_2$ of step (6), are updated iteratively as:
$\alpha_1 = (1-\lambda)\cdot\alpha_{1,old} + \lambda\cdot\alpha_{1,cur},$
$\alpha_2 = (1-\lambda)\cdot\alpha_{2,old} + \lambda\cdot\alpha_{2,cur},$
$\alpha'_1 = (1-\lambda)\cdot\alpha'_{1,old} + \lambda\cdot\alpha'_{1,cur},$
$\alpha'_2 = (1-\lambda)\cdot\alpha'_{2,old} + \lambda\cdot\alpha'_{2,cur},$
where $\alpha_{1,old}$ and $\alpha_{2,old}$ are the previous-frame weight coefficients of the gray and LBP texture features in step (4), $\alpha_{1,cur}$ and $\alpha_{2,cur}$ are the current-frame weight coefficients in step (4), $\alpha'_{1,old}$ and $\alpha'_{2,old}$ are the previous-frame weight coefficients in step (6), $\alpha'_{1,cur}$ and $\alpha'_{2,cur}$ are the current-frame weight coefficients in step (6), and $\lambda$ is a proportionality coefficient.
5. The real-time multi-feature-fusion tracking method for infrared image targets according to claim 4, characterized in that the current-frame weight coefficients $\alpha_{1,cur}$ and $\alpha_{2,cur}$ of the gray and LBP texture features in step (4), and $\alpha'_{1,cur}$ and $\alpha'_{2,cur}$ in step (6), are computed as:
$\alpha_{1,cur} = \dfrac{\rho_1}{\sqrt{\rho_1^2 + \rho_2^2}},$
$\alpha_{2,cur} = \dfrac{\rho_2}{\sqrt{\rho_1^2 + \rho_2^2}},$
$\alpha'_{1,cur} = \dfrac{\rho'_1}{\sqrt{(\rho'_1)^2 + (\rho'_2)^2}},$
$\alpha'_{2,cur} = \dfrac{\rho'_2}{\sqrt{(\rho'_1)^2 + (\rho'_2)^2}},$
where $\rho_1$ and $\rho_2$ are the Bhattacharyya coefficients of the gray and LBP texture features in step (4), and $\rho'_1$ and $\rho'_2$ are those of step (6).
6. The real-time multi-feature-fusion tracking method for infrared image targets according to claim 5, characterized in that the gray-feature Bhattacharyya coefficient $\rho'_1$ and LBP texture-feature Bhattacharyya coefficient $\rho'_2$ of step (6), and the weight coefficients $\alpha'_1$ and $\alpha'_2$, are obtained from:
$\rho'_1 = \sum_{u=1}^{m_1} \sqrt{p'_u\cdot q_u},$
$\rho'_2 = \sum_{v=1}^{m_2} \sqrt{p'_v\cdot q_v},$
where $p'_u$ are the bin probabilities of the target gray feature at position $y_1$ and $p'_v$ are the bin probabilities of the LBP texture feature at position $y_1$;
$\alpha'_1 = (1-\lambda)\cdot\alpha'_{1,old} + \lambda\cdot\dfrac{\rho'_1}{\sqrt{(\rho'_1)^2 + (\rho'_2)^2}},$
$\alpha'_2 = (1-\lambda)\cdot\alpha'_{2,old} + \lambda\cdot\dfrac{\rho'_2}{\sqrt{(\rho'_1)^2 + (\rho'_2)^2}},$
where $\alpha'_{1,old}$ is the previous-frame gray-feature weight coefficient, $\alpha'_{2,old}$ is the previous-frame LBP texture-feature weight coefficient, and $\lambda$ is a proportionality coefficient.
CN201210397686.3A 2012-10-18 2012-10-18 Real-time tracking method for infrared image target with multi-feature fusion Active CN102930558B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201210397686.3A CN102930558B (en) 2012-10-18 2012-10-18 Real-time tracking method for infrared image target with multi-feature fusion

Publications (2)

Publication Number Publication Date
CN102930558A true CN102930558A (en) 2013-02-13
CN102930558B CN102930558B (en) 2015-04-01

Family

ID=47645348

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201210397686.3A Active CN102930558B (en) 2012-10-18 2012-10-18 Real-time tracking method for infrared image target with multi-feature fusion

Country Status (1)

Country Link
CN (1) CN102930558B (en)


Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI628624B (en) * 2017-11-30 2018-07-01 國家中山科學研究院 Improved thermal image feature extraction method

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6590999B1 (en) * 2000-02-14 2003-07-08 Siemens Corporate Research, Inc. Real-time tracking of non-rigid objects using mean shift

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
WANG Shoukun, GUO Junjie, WANG Junzheng: "Mean-shift target tracking based on adaptive feature fusion", Transactions of Beijing Institute of Technology *

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109215062A (en) * 2017-06-29 2019-01-15 沈阳新松机器人自动化股份有限公司 Motion capture method, binocular positioning device and system based on image vision
CN109215062B (en) * 2017-06-29 2022-02-08 沈阳新松机器人自动化股份有限公司 Motion capture method based on image vision, binocular positioning device and system
CN110785775A (en) * 2017-07-07 2020-02-11 三星电子株式会社 System and method for optical tracking
CN110785775B (en) * 2017-07-07 2023-12-01 三星电子株式会社 System and method for optical tracking
CN109902578A (en) * 2019-01-25 2019-06-18 南京理工大学 An infrared target detection and tracking method
CN113379789A (en) * 2021-06-11 2021-09-10 天津大学 Moving target tracking method in complex environment

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant