Summary of the invention
Object of the invention: the technical problem to be solved by the present invention is to overcome the deficiencies of the prior art by providing a real-time infrared image target tracking method based on multi-feature fusion.
To solve the above technical problem, the present invention discloses a multi-feature-fusion real-time tracking method for infrared image targets, comprising the following steps:
(1) Initialize the tracking point position y0; the initial tracking point is specified manually;
(2) Initialize the target models: centered on the initial tracking point y0, build the target gray-level model q1 and the target LBP (local binary pattern) texture model q2;
(3) Compute the candidate target models: according to the target tracking point position y0, compute the candidate gray-level model p1(y0) and the candidate LBP texture model p2(y0);
(4) Using the Bhattacharyya coefficient ρ1 of the gray feature (on the Bhattacharyya coefficient see Visual C++ Digital Image Processing, p. 466, Xie Fengying, first edition 2008, Electronic Industry Press) and the Bhattacharyya coefficient ρ2 of the LBP texture feature, together with the weight coefficient α1 of the gray feature and the weight coefficient α2 of the LBP texture feature, compute the joint-feature Bhattacharyya coefficient ρ at position y0 according to:
ρ = α1·ρ1 + α2·ρ2;
(5) Compute the new target position y1 in the current frame;
(6) Using the gray-feature Bhattacharyya coefficient ρ'1 and the LBP texture-feature Bhattacharyya coefficient ρ'2, together with the weight coefficient α'1 of the gray feature and the weight coefficient α'2 of the LBP texture feature, compute the joint-feature Bhattacharyya coefficient ρ' at position y1 according to:
ρ' = α'1·ρ'1 + α'2·ρ'2;
(7) When ρ' < ρ, update y1; otherwise y1 remains unchanged;
(8) If |y0 - y1| < ε, stop the computation; otherwise assign y1 to y0 and return to step (3), where ε is an error constant.
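The iteration of steps (3) to (8) can be illustrated with the following minimal Python sketch. The callables mean_shift(y) and joint_rho(y) are hypothetical stand-ins for step (5) and for steps (4)/(6) respectively, and the halving of y1 toward y0 in step (7) is an assumption borrowed from common mean-shift tracking practice, not a quotation of the claimed update:

import numpy as np

def track_one_frame(y0, mean_shift, joint_rho, eps=0.01, max_iter=20):
    # Iteration skeleton for steps (3)-(8); mean_shift and joint_rho are
    # hypothetical callables supplied by the caller.
    y0 = np.asarray(y0, dtype=float)
    y1 = y0
    for _ in range(max_iter):
        rho = joint_rho(y0)                  # steps (3)-(4): joint coefficient at y0
        y1 = mean_shift(y0)                  # step (5): new position in the current frame
        # step (7), assumed form: back off toward y0 while the match worsens
        while joint_rho(y1) < rho and np.linalg.norm(y1 - y0) >= eps:
            y1 = 0.5 * (y0 + y1)
        if np.linalg.norm(y1 - y0) < eps:    # step (8): converged
            break
        y0 = y1                              # step (8): y0 <- y1, repeat from step (3)
    return y1

# Toy usage with stand-in callables: the "mean shift" here simply pulls the
# position halfway toward a fixed point, and joint_rho rises as it gets closer.
target = np.array([85.0, 104.0])
y = track_one_frame([80.0, 100.0],
                    mean_shift=lambda y: y + 0.5 * (target - y),
                    joint_rho=lambda y: 1.0 - 0.01 * np.linalg.norm(y - target))
print(y)  # converges near (85, 104)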
In step (2), the target gray-level model q1 and the target LBP texture model q2 are built, where qu is the per-level probability density of the gray feature of the target gray-level model, qv is the per-level probability density of the LBP texture feature, m1 is the maximum quantization level of the gray feature of the target gray-level model, m2 is the maximum quantization level of the LBP texture feature, u denotes the gray-level quantization index, and v denotes the texture quantization index.
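A kernel-weighted histogram of the standard mean-shift form is consistent with the symbols defined above; as an illustrative sketch (the standard form rather than the exact claimed expression), with x_i the n pixel positions of the target region, k(·) the kernel profile, h the kernel bandwidth, C a normalization constant, δ the Delta function, and b_1(·), b_2(·) the mappings from a pixel to its gray-level bin and LBP bin:

q_u = C \sum_{i=1}^{n} k\!\left(\left\|\frac{y_0 - x_i}{h}\right\|^{2}\right) \delta\!\left[b_1(x_i) - u\right], \quad u = 1, \dots, m_1,

q_v = C \sum_{i=1}^{n} k\!\left(\left\|\frac{y_0 - x_i}{h}\right\|^{2}\right) \delta\!\left[b_2(x_i) - v\right], \quad v = 1, \dots, m_2.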
In step (3), the candidate target gray-level model p1 and the candidate target LBP texture model p2 are computed, where pu is the per-level probability density of the gray feature of the candidate model, pv is the per-level probability density of the LBP texture feature of the candidate model, m1 is the maximum quantization level of the gray feature, m2 is the maximum quantization level of the LBP texture feature, u denotes the gray-level quantization index, and v denotes the texture quantization index.
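In the same hedged notation, the candidate models centered at the tracking point y_0 take the form

p_u(y_0) = C_h \sum_{i=1}^{n'} k\!\left(\left\|\frac{y_0 - x_i}{h}\right\|^{2}\right) \delta\!\left[b_1(x_i) - u\right], \qquad
p_v(y_0) = C_h \sum_{i=1}^{n'} k\!\left(\left\|\frac{y_0 - x_i}{h}\right\|^{2}\right) \delta\!\left[b_2(x_i) - v\right],

with the sums running over the n' pixels of the candidate region and C_h a normalization constant.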
The weight coefficient α1 of the gray feature and the weight coefficient α2 of the LBP texture feature in step (4), and the weight coefficient α'1 of the gray feature and the weight coefficient α'2 of the LBP texture feature in step (6), are updated iteratively according to:
α1 = (1-λ)·α1,old + λ·α1,cur,
α2 = (1-λ)·α2,old + λ·α2,cur,
α'1 = (1-λ)·α'1,old + λ·α'1,cur,
α'2 = (1-λ)·α'2,old + λ·α'2,cur,
where α1,old and α2,old are the previous-frame weight coefficients of the gray feature and the LBP texture feature in step (4), α1,cur and α2,cur are the current-frame weight coefficients of the gray feature and the LBP texture feature in step (4), α'1,old and α'2,old are the previous-frame weight coefficients of the gray feature and the LBP texture feature in step (6), α'1,cur and α'2,cur are the current-frame weight coefficients of the gray feature and the LBP texture feature in step (6), and λ is a scale factor with 0 ≤ λ ≤ 1 that determines the convergence speed of the weight coefficients: the larger λ is, the faster the convergence and the stronger the tracking maneuverability; the smaller λ is, the slower the convergence and the better the tracking stability.
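As a minimal Python sketch of this iterative update, assuming (as in the hedged formula given below for the current-frame weights) that α1,cur and α2,cur are obtained by normalizing the two Bhattacharyya coefficients:

def update_weights(a_old, rho, lam=0.3):
    # alpha = (1 - lambda) * alpha_old + lambda * alpha_cur, with alpha_cur taken
    # proportional to each feature's Bhattacharyya coefficient (an assumption).
    rho1, rho2 = rho
    a_cur = (rho1 / (rho1 + rho2), rho2 / (rho1 + rho2))
    return tuple((1 - lam) * ao + lam * ac for ao, ac in zip(a_old, a_cur))

# Example: previous weights (0.5, 0.5), current Bhattacharyya coefficients (0.9, 0.6)
print(update_weights((0.5, 0.5), (0.9, 0.6)))   # -> (0.53, 0.47)

A larger lam makes the weights follow the current frame more quickly (maneuverability), while a smaller lam keeps them closer to the previous frame (stability).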
The current-frame weight coefficient α1,cur of the gray feature and the current-frame weight coefficient α2,cur of the LBP texture feature in step (4), and the current-frame weight coefficients α'1,cur and α'2,cur of the gray feature and the LBP texture feature in step (6), are computed from the corresponding Bhattacharyya coefficients, where ρ1 is the Bhattacharyya coefficient of the gray feature in step (4), ρ2 is the Bhattacharyya coefficient of the LBP texture feature in step (4), ρ'1 is the Bhattacharyya coefficient of the gray feature in step (6), and ρ'2 is the Bhattacharyya coefficient of the LBP texture feature in step (6).
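One form consistent with this description, given here as an assumption rather than the exact claimed expression, normalizes each current-frame weight by the sum of the two Bhattacharyya coefficients:
α1,cur = ρ1 / (ρ1 + ρ2), α2,cur = ρ2 / (ρ1 + ρ2), α'1,cur = ρ'1 / (ρ'1 + ρ'2), α'2,cur = ρ'2 / (ρ'1 + ρ'2),
so that the feature whose candidate model currently matches the target model better receives the larger weight.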
The gray-feature Bhattacharyya coefficient ρ'1 and the LBP texture-feature Bhattacharyya coefficient ρ'2 in step (6), and the weight coefficient α'1 of the gray feature and the weight coefficient α'2 of the LBP texture feature, are obtained from the models evaluated at the new position, where p'u is the per-level probability density of the gray feature at position y1, p'v is the per-level probability density of the LBP texture feature at position y1, α'1,old is the previous-frame weight coefficient of the gray feature, α'2,old is the previous-frame weight coefficient of the LBP texture feature, and λ is a scale factor with 0 ≤ λ ≤ 1 that determines the convergence speed of the weight coefficients: the larger λ is, the faster the convergence and the stronger the tracking maneuverability; the smaller λ is, the slower the convergence and the better the tracking stability.
In the multi-feature-fusion real-time infrared image target tracking method of the present invention, the Epanechnikov kernel function is used to compute the gray-feature probability histogram and the LBP texture-feature probability histogram.
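For reference, the Epanechnikov kernel profile commonly used in kernel-histogram tracking is k(x) = c·(1 - x) for 0 ≤ x ≤ 1 and k(x) = 0 for x > 1, where c is a normalization constant; this is given as background, since the text names the kernel without writing it out. Its derivative is constant on its support, which is what lets the mean-shift position update reduce to a simple weighted average of pixel coordinates.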
Compared with the prior art, the present invention has the following significant advantages: (1) the weight coefficients among the multiple features are computed adaptively according to feature saliency and the similarity between target and background, which strengthens the robustness of target tracking; (2) the weight coefficients among the multiple features are updated iteratively, which guarantees the stability of target tracking; (3) tracking infrared image targets with the multi-feature-fusion method solves the tracking-point drift caused by the instability of a single feature and effectively improves the tracking accuracy; (4) the multi-feature-fusion tracking method proposed by the present invention involves no high-order exponential operations or complex structures, has a small computational load, and is easy to implement in real time in hardware.
Embodiment
In the multi-feature-fusion real-time infrared image target tracking method of the present invention, the gray feature and the LBP texture feature are used to describe the characteristics of the infrared image target.
The eight-neighborhood LBP texture feature is denoted LBP8,1, where gc is the current pixel and gn (n = 0, ..., 7) are the surrounding neighborhood pixels.
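The standard form of the eight-neighborhood operator, given here as the usual definition consistent with the variables gc and gn above, is

LBP_{8,1} = \sum_{n=0}^{7} s(g_n - g_c)\,2^n,

where s(x) = 1 for x ≥ 0 and s(x) = 0 for x < 0, so that each pixel is encoded by the signs of the differences between its eight neighbors and itself.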
In the multi-feature-fusion real-time infrared image target tracking method of the present invention, the Bhattacharyya coefficient is used to describe the similarity between the target model and the candidate target model.
The Bhattacharyya coefficient is denoted ρBha, where p is the candidate target model and q is the target model.
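For two discrete normalized histograms p and q with m bins, the standard Bhattacharyya coefficient is

ρ_{Bha}(p, q) = \sum_{u=1}^{m} \sqrt{p_u\,q_u},

which equals 1 for identical distributions and decreases as the candidate model and the target model diverge; this standard definition is given here for completeness.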
In the multi-feature-fusion real-time infrared image target tracking method of the present invention, the weight coefficient α1 of the gray feature and the weight coefficient α2 of the LBP texture feature are updated iteratively as follows:
α1 = (1-λ)·α1,old + λ·α1,cur,
α2 = (1-λ)·α2,old + λ·α2,cur,
where α1,old and α2,old are the previous-frame weight coefficients of the gray feature and the LBP texture feature respectively, α1,cur and α2,cur are the current-frame weight coefficients of the gray feature and the LBP texture feature, and λ is a scale factor.
In the multi-feature-fusion real-time infrared image target tracking method of the present invention, the current-frame weight coefficient α1,cur of the gray feature and the current-frame weight coefficient α2,cur of the LBP texture feature are computed from the Bhattacharyya coefficients, where ρ1 is the Bhattacharyya coefficient of the gray feature and ρ2 is the Bhattacharyya coefficient of the LBP texture feature.
Embodiment 1
As shown in Figure 1, the multi-feature-fusion real-time infrared image target tracking method of the present invention is illustrated below with an example. The infrared image is 320 × 240 pixels with a frame rate of 25 Hz. The thermal infrared imager output is transmitted over optical fiber to a dedicated image processing board with a DSP+FPGA architecture; the multi-feature-fusion infrared image target tracking is implemented on the DSP processor and meets the real-time processing requirement. The implementation steps are as follows:
(1) Initialize the tracking point position y0; the initial tracking point is specified manually.
The initial target tracking point position (i, j) is specified manually as i = 80, j = 100 (as shown in Figure 2), and the Epanechnikov kernel bandwidth is set to h = 10.
(2) Initialize the target models: build the target gray-level model q1 from the gray feature, compute the LBP texture feature of the target, and build the target texture model q2 from the LBP texture feature.
The LBP texture feature image I_LBP is computed over the region centered at the initial tracking point position (80, 100) with bandwidth h = 10.
The eight-neighborhood LBP texture feature is denoted LBP8,1, where gc is the current pixel with linear index c = k2*320 + k1 in the 320 × 240 image, and gn (n = 0, ..., 7) are the neighborhood pixels around gc.
The target gray-level model q1 and the target LBP texture model q2 are then built, where qu and qv are respectively the per-level probability densities of the gray feature and the LBP texture feature of the target model, m1 = 255 and m2 = 255 are respectively the quantization levels of the gray feature and the LBP texture feature of the target model, the function b1(·) maps the pixel located at xi to its gray-feature index, the function b2(·) maps the pixel located at xi to its LBP texture-feature index, δ is the Delta function, C is the normalization coefficient, u = 1...255, and v = 1...255.
(3) Compute the candidate target models. According to the tracking point position y0, compute the candidate gray-level model p1(y0) and the candidate texture model p2(y0).
The candidate gray-level model p1 and the candidate LBP texture model p2 are built in the same way, where pu and pv are respectively the per-level probability densities of the gray feature and the LBP texture feature of the candidate model, m1 = 255 and m2 = 255 are respectively the quantization levels of the gray feature and the LBP texture feature, the function b1(·) maps the pixel located at xi to its gray-feature index, the function b2(·) maps the pixel located at xi to its LBP texture-feature index, δ is the Delta function, C is the normalization coefficient, u = 1...255, and v = 1...255.
(4) Compute respectively the Bhattacharyya coefficients ρ1 and ρ2 of the gray feature and the LBP texture feature and the weight coefficients α1 and α2, and use ρ1, α1, ρ2, α2 to compute the joint-feature Bhattacharyya coefficient ρ at position y0.
The joint-feature Bhattacharyya coefficient ρ is given by:
ρ = α1·ρ1 + α2·ρ2,
The gray-feature weight coefficient α1 and the LBP texture-feature weight coefficient α2 are updated according to α1 = (1-λ)·α1,old + λ·α1,cur and α2 = (1-λ)·α2,old + λ·α2,cur, where α1,old and α2,old are the previous-frame weight coefficients, α1,cur and α2,cur are the current-frame weight coefficients, and λ is the scale factor.
(5) Compute the new target tracking position y1 in the current frame, where n' is the number of candidate target pixels, h is the kernel bandwidth, and qu, qv, pu, pv, α1, α2, xi, b1(·), b2(·) have the same meanings as defined in steps (2), (3) and (4).
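A sketch of the usual mean-shift position update, adapted to the two fused features with the symbols listed above (the way the two back-projection weights are combined through α_1 and α_2 is an assumption, not the exact claimed expression):

y_1 = \frac{\sum_{i=1}^{n'} x_i\, w_i}{\sum_{i=1}^{n'} w_i}, \qquad
w_i = α_1 \sum_{u=1}^{m_1} \sqrt{\frac{q_u}{p_u(y_0)}}\, \delta\!\left[b_1(x_i) - u\right]
    + α_2 \sum_{v=1}^{m_2} \sqrt{\frac{q_v}{p_v(y_0)}}\, \delta\!\left[b_2(x_i) - v\right],

where the kernel-derivative factor has been dropped because it is constant for the Epanechnikov kernel.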
(6) Using the Bhattacharyya coefficients ρ'1 and ρ'2 of the gray feature and the LBP texture feature and the weight coefficients α'1 and α'2, compute the joint-feature Bhattacharyya coefficient ρ' at position y1 according to:
ρ' = α'1·ρ'1 + α'2·ρ'2,
The gray-feature weight coefficient α'1 and the LBP texture-feature weight coefficient α'2 are updated according to α'1 = (1-λ)·α'1,old + λ·α'1,cur and α'2 = (1-λ)·α'2,old + λ·α'2,cur, where α'1,old and α'2,old are the previous-frame weight coefficients and λ is the scale factor.
(7) When ρ' < ρ, update y1; otherwise y1 remains unchanged.
(8) If abs(y0 - y1) < 0.01, stop; otherwise set y0 ← y1 and return to step (3).
Figure 2 shows the conventional technique, i.e. the infrared image target tracking result obtained with only a single feature (the gray feature), and Figure 3 shows the result obtained with multi-feature fusion according to the present embodiment; since these are infrared images, they inevitably appear in grayscale. Figures 2a, 2b, 2c and 2d show the 20th, 80th, 140th and 200th frames respectively, and Figures 3a, 3b, 3c and 3d show the 20th, 80th, 140th and 200th frames respectively. Comparing Figure 2 with Figure 3 shows that tracking with only a single feature makes the tracking process unstable and the tracking accuracy poor, as in Figure 2 where the tracking gate swings at random, whereas tracking with multi-feature fusion effectively improves the tracking accuracy, as in Figure 3 where the tracking gate stays close to the target centroid throughout.
The present invention provides a multi-feature-fusion real-time tracking method for infrared image targets; there are many specific methods and approaches for implementing this technical solution, and the above is only a preferred embodiment of the present invention. It should be noted that those skilled in the art can make several improvements and modifications without departing from the principle of the present invention, and these improvements and modifications should also be regarded as falling within the protection scope of the present invention. All components not explicitly specified in the present embodiment can be implemented with the prior art.