CN103424114B - A full combination method of visual navigation/inertial navigation
- Publication number: CN103424114B (application CN201210161621.9A)
- Authority: CN (China)
- Legal status: Expired - Fee Related
Abstract
The invention relates to a full combination method of visual navigation/inertial navigation, comprising the following steps: 1) visual navigation solution: observation equations are set up based on the collinearity equations, the carrier position and attitude parameters are obtained by least-squares adjustment, and the variance-covariance matrix of the parameters is computed; 2) inertial navigation computation: the navigation computation is performed in the local-level frame to obtain the position, velocity and attitude of the carrier at each epoch, together with the variance-covariance matrix of the parameters; 3) the visual system corrects the inertial navigation system: by means of Kalman filtering, the navigation-parameter errors and sensor errors of the INS are estimated, compensated and fed back, yielding optimal estimates of all INS parameters; 4) the inertial navigation system corrects the visual system: sequential adjustment is used to correct all parameters of the visual system. Compared with the prior art, the invention has the advantages of rigorous theory, stable performance and high efficiency.
Description
Technical Field
The invention relates to a way of combining a visual navigation system and an inertial navigation system, and in particular to a full combination method of visual navigation/inertial navigation.
Background Art
Visual navigation, also known as visual odometry (VO), works as follows: as the carrier moves forward, a stereo camera captures images of the surrounding scene; feature points are matched between the left and right images and tracked between successive frames; observation equations are set up based on the collinearity equations; and the spatial position and attitude of the camera are obtained by least-squares adjustment. Visual navigation is a high-precision autonomous navigation method, but as a form of dead reckoning its error accumulates over time, and the stability of the algorithm has always been a problem: sometimes the background in the field of view is uniform or too self-similar, or the carrier dynamics are too large (for example, a sharp turn leaves too little overlap between adjacent images), causing the visual navigation solution to fail. Other means are therefore needed to enhance the stability of the algorithm.
An inertial navigation system (INS) works on Newton's laws of mechanics: an inertial measurement unit (IMU) mounted on the carrier senses the carrier's angular velocity and acceleration, and navigation parameters such as position, velocity and attitude are obtained by integration. Inertial navigation is fully autonomous and immune to interference, and is highly accurate over short periods, but because of the integration principle its error accumulates over time, so its long-term accuracy is poor. To limit error accumulation, external observations must be introduced to correct the INS.
Visual navigation and inertial navigation are therefore two autonomous navigation methods with strong complementarity. Their combination can solve navigation in GPS-denied environments, which matters for indoor navigation, robot path planning, autonomous UAV landing, self-driving vehicle navigation, underground traffic positioning, mine-operation safety and similar applications.
At present there are three general ways of combining visual navigation with an INS: (1) the coupled type, in which the visual system typically provides the carrier's position and the INS provides its attitude; (2) the correction type, in which one navigation system is used to correct the other; (3) the fusion type, which generally uses a Kalman filter to fuse the data of the two. Of these, the coupled type is a low-level combination used where accuracy requirements are modest, whereas the correction or fusion types are usually adopted in high-precision applications.
In the correction type, the INS is generally used to correct the visual system, roughly as follows: (1) the camera captures the images of the current epoch; (2) feature points are extracted from and matched between the current left and right images, and matched against the images of the previous epoch, after which the camera's exterior orientation elements (position and attitude) are computed by resection; (3) the IMU output is used to compute the carrier's pitch and roll angles and the heading-angle increment; (4) these are used to correct the camera position and attitude through a Kalman filter; (5) the corrected camera position and attitude are used to compute the object-space coordinates of the feature points by forward intersection; (6) the above steps are repeated.
The fusion type is generally implemented as follows: (1) the camera captures the images of the current epoch; (2) feature points are extracted from and matched between the current left and right images, and matched against the images of the previous epoch, after which the camera's exterior orientation elements (position and attitude) are computed by resection; (3) the camera position and attitude are taken as measurements, and a Kalman filter estimates the IMU position and attitude; (4) the filtered IMU position and attitude are passed to the camera, and the object-space coordinates of the feature points are obtained by forward intersection; (5) the above steps are repeated.
It can be seen from the above that the correction type uses the IMU output only for attitude, not position, and so fails to exploit the navigation information provided by the INS to the fullest. The fusion type makes better use of the information from both the visual system and the INS, so that both can be corrected, but it ignores the prior information on the object-space coordinates of the image feature points, so the combination cannot reach its best effect. In fact, the transfer of feature points between images is the key to visual navigation: most of the feature points extracted in the current image already obtained their object-space coordinates through least-squares adjustment in the previous frame, with known accuracy, so this accuracy information must be taken into account.
Summary of the Invention
The object of the present invention is to overcome the above defects of the prior art by providing a theoretically rigorous, stable and efficient full combination method of visual navigation/inertial navigation.
The object of the present invention can be achieved through the following technical solution:
A full combination method of visual navigation/inertial navigation, comprising the following steps:
1) Visual navigation solution: the camera acquires images as the carrier moves; observation equations are set up based on the collinearity equations; the carrier position and attitude parameters X_vo are obtained by least-squares adjustment; and the variance-covariance matrix P(X_vo) of the parameters is computed;
2) Inertial navigation computation: the navigation computation is performed in the local-level frame to obtain the position, velocity and attitude parameters X_ins of the carrier at each epoch; at the same time, the variance-covariance matrix P(X_ins) of the parameters is computed from the INS error equations by the law of covariance propagation;
3) The visual system corrects the INS: with a Kalman filter, the INS error equations are taken as the state equations; the differences between the carrier position and attitude from the visual navigation solution and those from the INS computation are taken as the filter measurements; the navigation-parameter errors and sensor errors of the INS are estimated, and compensation and feedback correction are performed, yielding optimal estimates of all INS parameters, denoted X̂_ins;
4) The INS corrects the visual system: the IMU position and attitude are separated from the INS parameters and, taking the geometric relation between camera and IMU into account, converted into camera position and attitude information; observation equations are established and processed by sequential adjustment to obtain optimal estimates of all parameters of the visual system, denoted X̂_vo.
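The four steps above form one cycle per image epoch. A minimal data-flow sketch follows; all function names, array shapes and stubbed return values are illustrative assumptions, not the patent's actual implementation:

```python
import numpy as np

# Illustrative skeleton of the four-step full-combination cycle.
# The numerical work of each step is stubbed out; only the data flow
# (who corrects whom, and what is passed along) is shown.

def vo_solve():
    """Step 1: visual solution -> X_vo (pose, 6) and its covariance P(X_vo)."""
    return np.zeros(6), 0.01 * np.eye(6)

def ins_solve():
    """Step 2: INS mechanization -> X_ins (pos/vel/att, 9) and P(X_ins)."""
    return np.zeros(9), 0.02 * np.eye(9)

def kf_correct_ins(X_ins, P_ins, X_vo, P_vo):
    """Step 3: Kalman filter on VO-minus-INS differences corrects the INS."""
    return X_ins, P_ins  # corrected estimate X_ins_hat (stub)

def seq_adjust_vo(X_vo, P_vo, X_ins, P_ins):
    """Step 4: sequential adjustment feeds the INS pose back into VO."""
    return X_vo, P_vo    # corrected estimate X_vo_hat (stub)

def one_epoch():
    X_vo, P_vo = vo_solve()
    X_ins, P_ins = ins_solve()
    X_ins, P_ins = kf_correct_ins(X_ins, P_ins, X_vo, P_vo)
    X_vo, P_vo = seq_adjust_vo(X_vo, P_vo, X_ins, P_ins)
    return X_vo, X_ins
```

Note that the two corrections run in both directions within one epoch, which is what distinguishes this full combination from the one-way correction type.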
The visual navigation solution specifically comprises the following steps:
11) Initialization of the camera exterior orientation elements: control points are laid out around the starting position of the visual navigation, and the camera's exterior orientation elements are obtained by resection;
12) As the carrier moves, the camera acquires the images of the current epoch, and feature points are extracted and matched;
13) Based on the collinearity equations, observation equations are established for all feature points successfully matched between the left and right images:

  v = A₁x̂₁ + A₂x̂₂ + B₁ŷ₁ + B₂ŷ₂ − l
  v_y = ŷ₁ − l_y
  Cx̂ + W_x = 0

where x̂₁ denotes the corrections to the left camera's exterior orientation elements, x̂₂ the corrections to the right camera's exterior orientation elements, ŷ₁ the corrections to the object-space coordinates of the known points, ŷ₂ the corrections to the object-space coordinates of the points to be determined, y₁ the object-space coordinates of the known points (from which the constant vector l_y of the virtual observation equation is formed), v the observation residuals, and l a constant vector; the third equation expresses the constraint relation between the left and right cameras, with W_x the constant vector of the constraint equation;
14) The above equations are solved according to the least-squares principle, yielding the exterior orientation elements of both cameras and the object-space coordinates of all feature points.
Feature-point matching comprises two aspects: matching between the current left and right images, and matching between the current left image and the left image of the previous epoch. The feature points comprise known points and points to be determined: known points are feature points that already appeared in the previous frame, while points to be determined are feature points newly added in the current image.
The inertial navigation computation specifically comprises:
(1) Attitude update: using the angular-rate observations output by the gyros, the attitude is updated by the quaternion method, and the updated quaternion is converted into the attitude matrix, denoted C_b^L(k+1), where k denotes the previous epoch, k+1 the current epoch, L the navigation frame and b the body (carrier) frame;
(2) Attitude extraction: the carrier attitude, comprising the heading, pitch and roll angles, is computed from the elements of the attitude matrix C_b^L;
(3) Specific-force resolution: using the attitude matrix, the specific-force observation f^b output by the accelerometers is transformed from the body frame to the navigation frame:

  f^L = C_b^L f^b
(4) Compensation of unwanted accelerations: Coriolis correction and normal-gravity compensation are applied to f^L to obtain the acceleration a of the carrier on the Earth's surface:

  a = f^L − (2Ω_ie^L + Ω_eL^L) v^L + g^L

where Ω_ie^L is the antisymmetric matrix formed from the three components of the Earth-rotation angular velocity, Ω_eL^L is the antisymmetric matrix formed from the three components of the angular velocity of the L frame relative to the Earth-fixed frame, v^L is the velocity vector, and g^L is the Earth gravity vector;
(5) Integration for velocity: the acceleration is integrated to give the velocity increment and hence the velocity at the current epoch:

  v^L(k+1) = v^L(k) + a·Δt
(6) Integration for position: the velocity is integrated to give the position increment and hence the position at the current epoch:

  r(k+1) = r(k) + v^L(k+1)·Δt
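Steps (1) to (6) can be sketched as a single mechanization step. This is a minimal sketch assuming an east-north-up L frame and first-order integration; the helper names and quaternion conventions are illustrative assumptions:

```python
import numpy as np

def skew(w):
    """Antisymmetric (cross-product) matrix of a 3-vector."""
    return np.array([[0.0, -w[2], w[1]],
                     [w[2], 0.0, -w[0]],
                     [-w[1], w[0], 0.0]])

def quat_mul(p, q):
    """Hamilton product of two quaternions (w, x, y, z)."""
    pw, px, py, pz = p
    qw, qx, qy, qz = q
    return np.array([pw*qw - px*qx - py*qy - pz*qz,
                     pw*qx + px*qw + py*qz - pz*qy,
                     pw*qy - px*qz + py*qw + pz*qx,
                     pw*qz + px*qy - py*qx + pz*qw])

def quat_update(q, w_b, dt):
    """(1) Attitude update by the quaternion method from gyro rates (rad/s)."""
    ang = np.linalg.norm(w_b) * dt
    if ang < 1e-12:
        return q
    axis = w_b * dt / ang
    dq = np.concatenate(([np.cos(ang / 2.0)], np.sin(ang / 2.0) * axis))
    return quat_mul(q, dq)

def quat_to_dcm(q):
    """Convert the attitude quaternion to the attitude matrix C_b^L."""
    w, x, y, z = q
    return np.array([[1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
                     [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
                     [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)]])

def ins_step(q, v_L, r, w_b, f_b, omega_ie_L, omega_eL_L, g_L, dt):
    q = quat_update(q, w_b, dt)          # (1) attitude update
    C_bL = quat_to_dcm(q)
    f_L = C_bL @ f_b                     # (3) f^L = C_b^L f^b
    # (4) a = f^L - (2*Omega_ie^L + Omega_eL^L) v^L + g^L
    a = f_L - (2.0 * skew(omega_ie_L) + skew(omega_eL_L)) @ v_L + g_L
    v_new = v_L + a * dt                 # (5) velocity integration
    r_new = r + v_new * dt               # (6) position integration
    return q, v_new, r_new
```

For a stationary carrier the accelerometer senses only the gravity reaction, so the compensated acceleration vanishes and velocity and position stay unchanged, which is a handy sanity check on the sign conventions.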
The step in which the visual system corrects the INS specifically comprises:
31) The INS error equations are taken as the state equation of the Kalman filter, of the form

  Ẋ = FX + GW

where X is the error-state vector, comprising the position error δr^L, velocity error δv^L, platform misalignment angle δε^L, gyro drift d and accelerometer bias b,
  X = (δr^L, δv^L, δε^L, d, b)
32) The difference between the INS-computed and the visually solved position, and the difference between the INS-computed and the visually solved attitude, are taken as the measurements, with measurement equation

  Z = [Z_r; Z_a] = [H_r; H_a] X + [V_r; V_a]

where Z_r and Z_a denote the position-error and attitude-error measurements, H_r and H_a the position-error and attitude-error measurement matrices, and V_r and V_a the corresponding measurement residuals;
The position error Z_r is computed as

  Z_r = r_ins − r_vo

where r_ins denotes the coordinates from the inertial navigation computation and r_vo the coordinates from the visual navigation solution; the position-error measurement matrix H_r has the form

  H_r = (I 0 0 0 0)
where I is the identity matrix and 0 the zero matrix. The attitude error Z_a is computed as
  Z_a = a_ins − a_vo
where a_ins denotes the attitude from the inertial navigation computation and a_vo the attitude from the visual navigation solution; taking the rotation matrix between camera and IMU into account, the camera attitude from the visual solution is reduced to the carrier attitude:
  a_vo = (p′, r′, y′)^T
where p′, r′, y′ denote the carrier pitch, roll and heading angles reduced from the camera attitude. The attitude-error measurement matrix H_a has the form

  H_a = (0 0 A 0 0)

where A is a 3×3 matrix relating the platform misalignment angles to the attitude-angle errors, whose elements are functions of p, r and y, the carrier pitch, roll and heading angles computed by the inertial navigation;
33) The navigation-parameter errors and sensor errors of the INS are estimated by the filter, and compensation and feedback correction are performed.
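Steps 31) to 33) can be sketched as one measurement update of the 15-state error filter. This is a minimal sketch with the state order (δr, δv, δε, d, b) as in the text; since the exact entries of A depend on the attitude convention, identity is assumed here for illustration:

```python
import numpy as np

def vo_corrects_ins(P, r_ins, a_ins, r_vo, a_vo, R, A=None):
    """One Kalman measurement update of the 15-state INS error filter.

    P : 15x15 error-state covariance;  R : 6x6 measurement noise covariance.
    A maps platform misalignment angles to attitude-angle errors; identity is
    an assumption standing in for the convention-dependent 3x3 matrix.
    """
    if A is None:
        A = np.eye(3)
    Z = np.concatenate([r_ins - r_vo, a_ins - a_vo])  # position/attitude differences
    H = np.zeros((6, 15))
    H[0:3, 0:3] = np.eye(3)   # H_r = (I 0 0 0 0)
    H[3:6, 6:9] = A           # H_a = (0 0 A 0 0)
    S = H @ P @ H.T + R                    # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)         # Kalman gain
    dX = K @ Z                             # estimated error state
    P_new = (np.eye(15) - K @ H) @ P
    return dX, P_new                       # dX drives compensation and feedback
```

The first nine components of dX compensate the navigation parameters directly; the last six (gyro drift, accelerometer bias) are fed back into the next mechanization cycle.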
The step in which the INS corrects the visual system specifically comprises:
41) The IMU position and attitude are separated from the navigation parameters and, using the geometric relation between camera and IMU, converted into observations of the camera position and attitude;
42) Sequential adjustment is applied to complete the correction of the visual system by the INS.
Step 42) specifically comprises:
The functional model by which the INS corrects the visual system is written in the following form:

  v = A x̂ − l
  v_ins = B x̂ − l_ins

where l and l_ins are the constant vectors of the two equations, the first being the visual observation equations and the second the pseudo-observations formed from the INS-derived camera position and attitude.
The above is arranged into the form of an observation equation, in which the matrix B has the form

  B = (I 0 0 0)
The stochastic model of the observations, i.e. their variance-covariance matrix, is P(r_ins, a_ins), the sub-matrix related to position and attitude separated out of P(X_ins). From the above functional and stochastic models, the correction formula of the visual system can be derived:

  x̂_vo = x_vo + P(x_vo) Bᵀ [B P(x_vo) Bᵀ + P(r_ins, a_ins)]⁻¹ (l_ins − B x_vo)
  P̂(x_vo) = P(x_vo) − P(x_vo) Bᵀ [B P(x_vo) Bᵀ + P(r_ins, a_ins)]⁻¹ B P(x_vo)

where x_vo denotes the visual parameters obtained from the visual navigation solution, x̂_vo the corrected visual parameters, and P̂(x_vo) the corrected variance-covariance matrix.
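The sequential-adjustment step can be sketched numerically as follows. The layout, with the camera pose as the first six elements of x_vo so that B = (I 0 0 0) selects it, is an assumption for illustration:

```python
import numpy as np

def ins_corrects_vo(x_vo, P_vo, pose_ins, P_ins):
    """Sequential-adjustment correction of the vision parameters.

    x_vo     : visual parameter vector, camera pose (6) first, then the
               object-space coordinates of known and new feature points.
    pose_ins : camera pose observation derived from the INS (l_ins).
    P_ins    : its covariance, i.e. P(r_ins, a_ins).
    """
    n = x_vo.size
    B = np.zeros((6, n))
    B[:, :6] = np.eye(6)                      # B = (I 0 0 0)
    S = B @ P_vo @ B.T + P_ins                # combined covariance
    G = P_vo @ B.T @ np.linalg.inv(S)         # gain of the sequential step
    x_hat = x_vo + G @ (pose_ins - B @ x_vo)  # corrected visual parameters
    P_hat = P_vo - G @ B @ P_vo               # corrected covariance
    return x_hat, P_hat
```

Because the gain G involves the full P_vo, feature-point coordinates correlated with the pose are corrected along with the pose itself, which is the point of treating the update sequentially rather than overwriting the pose.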
Compared with the prior art, the invention has the following advantages:
(1) The visual system corrects the INS. The Kalman filter estimates not only the INS navigation-parameter errors (position, velocity and attitude errors) but also the sensor errors (gyro drift, accelerometer bias); the navigation-parameter errors are compensated and the sensor errors are fed back into the navigation computation of the next cycle, so the INS maintains high accuracy over long periods. If the visual navigation solution fails, the combined system relies entirely on inertial navigation for the carrier's navigation parameters; because the various errors were estimated and corrected beforehand, a short visual outage does not cause a marked drop in the accuracy of the whole system.
(2) The INS corrects the visual system. Using the position and attitude from the inertial navigation computation, sequential adjustment improves not only the camera's exterior orientation elements but also the object-space coordinates of the image feature points. The corrected object-space coordinates and their accuracy information (the variance-covariance matrix of the coordinates) are passed into the navigation solution of the next frame and processed by least-squares collocation; this method respects the prior information on the feature points' object-space coordinates and keeps the visual system highly accurate over long periods.
Brief Description of the Drawings
Figure 1 is a flow chart of the steps of the present invention;
Figure 2 is a schematic diagram of the visual system correcting the INS;
Figure 3 is a schematic diagram of the INS correcting the visual system;
Figure 4 shows the carrier trajectories computed by different navigation methods;
Figure 5 shows the position innovations during the Kalman filtering;
Figure 6 shows the attitude innovations during the Kalman filtering.
Detailed Description
The present invention is described in detail below with reference to the drawings and a specific embodiment.
Embodiment
As shown in Figure 1, a full combination method of visual navigation/inertial navigation comprises the following steps:
1) Visual navigation solution. The steps are as follows:
(1) Initialization of the camera exterior orientation elements. Some control points are generally laid out around the starting position of the visual navigation, and the camera's exterior orientation elements are obtained by resection.
(2) As the carrier moves, the stereo camera acquires the images of the current epoch for feature-point extraction and matching. Matching comprises two aspects: matching between the current left and right images, and matching between the current left image and the left image of the previous epoch.
(3) Based on the collinearity equations, observation equations are established for all successfully matched feature points. The feature points fall into two classes: known points and points to be determined. Known points are feature points that already appeared in the previous frame, so their object-space coordinates are known, with known accuracies; points to be determined are feature points newly added in the current image, whose object-space coordinates have no prior information. In the present invention the prior information of the known points is treated by the principle of least-squares collocation, so the observation equations can be written as

  v = A₁x̂₁ + A₂x̂₂ + B₁ŷ₁ + B₂ŷ₂ − l
  v_y = ŷ₁ − l_y
  Cx̂ + W_x = 0     (1)

The above comprises three equations. In the first, x̂₁ denotes the corrections to the left camera's exterior orientation elements, x̂₂ the corrections to the right camera's exterior orientation elements, ŷ₁ the corrections to the object-space coordinates of the known points, ŷ₂ the corrections to the object-space coordinates of the points to be determined, y₁ the object-space coordinates of the known points, v the observation residuals, and l a constant vector. The second equation is a virtual observation equation established from the prior information of the known points. The third equation is the constraint between the left and right cameras, established by calibrating the geometric relation between camera and IMU before the experiment.
(4) Equation (1) is solved by the least-squares principle. In fact, equation (1) is an indirect adjustment model with constraints, so the exterior orientation elements of both cameras and the object-space coordinates of all feature points can be obtained by adjustment.
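A small numerical sketch of solving an adjustment model of the shape of equation (1): ordinary observations, virtual (prior) observations on the known parameters, and a hard constraint, solved via bordered normal equations (Lagrange multipliers). The matrices below are illustrative stand-ins, not the actual photogrammetric design matrices:

```python
import numpy as np

def constrained_wls(A, l, P, Av, lv, Pv, C, w):
    """min (Ax-l)'P(Ax-l) + (Av x - lv)'Pv(Av x - lv)  s.t.  Cx + w = 0.

    A, l, P   : ordinary observation equations and their weight matrix
    Av, lv, Pv: virtual observation equations from prior information
    C, w      : hard constraint Cx + w = 0 (e.g. the stereo-rig relation)
    """
    N = A.T @ P @ A + Av.T @ Pv @ Av       # normal matrix incl. prior term
    u = A.T @ P @ l + Av.T @ Pv @ lv
    m = C.shape[0]
    K = np.block([[N, C.T],
                  [C, np.zeros((m, m))]])  # bordered normal equations
    sol = np.linalg.solve(K, np.concatenate([u, -w]))
    return sol[:N.shape[0]]                # parameters (Lagrange multipliers dropped)
```

In the two-parameter toy problem below, the constraint pins the second parameter while the prior pulls the first toward its virtual observation, so the solution sits between the ordinary observation and the prior.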
2) Inertial navigation computation. The local-level frame (L frame) is chosen as the navigation frame, with its three axes pointing east, north and up; the three axes of the body frame (b frame) point right, forward and up. The steps of the navigation computation are as follows:
(1) Attitude update. Using the angular-rate observations output by the gyros, the attitude is updated by the quaternion method, and the updated quaternion is converted into the attitude matrix C_b^L.
(2) Attitude extraction. The carrier attitude, comprising the heading, pitch and roll angles, is computed from the elements of the attitude matrix C_b^L.
(3) Specific-force resolution. Using the attitude matrix, the specific-force observation f^b output by the accelerometers is transformed from the body frame to the navigation frame:

  f^L = C_b^L f^b     (2)
(4) Compensation of unwanted accelerations. Coriolis correction and normal-gravity compensation are applied to f^L to obtain the acceleration of the carrier on the Earth's surface:

  a = f^L − (2Ω_ie^L + Ω_eL^L) v^L + g^L     (3)
(5) Integration for velocity. The acceleration is integrated to give the velocity increment and hence the velocity at the current epoch:

  v^L(k+1) = v^L(k) + a·Δt     (4)
(6) Integration for position. The velocity is integrated to give the position increment and hence the position at the current epoch:

  r(k+1) = r(k) + v^L(k+1)·Δt     (5)
3) The visual system corrects the INS. This correction is implemented by Kalman filtering, taking the INS error equations as the state equation. Figure 2 shows the principle and process: the system measurements are the difference between the INS-computed position and the visually solved position, and the difference between the INS-computed attitude and the visually solved attitude.
A 15-dimensional error state vector is adopted, comprising the position errors, velocity errors, platform misalignment angles, gyro drifts, and accelerometer biases:
X = (δrL, δvL, δεL, d, b)    (6)
The state equation is taken as the error equation of the inertial navigation system.
The external measurement information of the system comes from the camera position and attitude obtained by visual navigation; the measurement equation is therefore written with a position part and an attitude part.
Here Zr and Za denote the position-error measurement and the attitude-error measurement, respectively. The position error Zr is computed as follows:
Zr = rins - rvo    (9)
In this equation, rins is the coordinate obtained from the INS navigation computation and rvo is the coordinate obtained from the visual navigation solution. Since the coordinate determined by visual navigation is that of the camera projection centre, whereas the INS computes the coordinate of the IMU geometric centre, rvo must be converted by subtracting the three-dimensional offset between the camera and the IMU, so that the coordinate of the camera projection centre is reduced to the IMU geometric centre.
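The lever-arm reduction described here can be sketched as follows; the offset value, the frames, and the coordinates are illustrative assumptions.

```python
import numpy as np

def position_innovation(r_ins, r_vo_cam, C_bL, lever_b):
    """Z_r = r_ins - r_vo after reducing the camera projection centre to the
    IMU geometric centre: r_vo_imu = r_vo_cam - C_bL @ lever_b."""
    r_vo_imu = r_vo_cam - C_bL @ lever_b     # subtract the camera-IMU offset
    return r_ins - r_vo_imu

C_bL = np.eye(3)                             # level, aligned carrier for the example
lever_b = np.array([0.10, 0.00, -0.05])      # assumed camera-IMU offset in metres
r_ins = np.array([100.0, 200.0, 50.0])       # INS position of the IMU centre
r_vo_cam = r_ins + C_bL @ lever_b            # a camera-centre position consistent with it
Z_r = position_innovation(r_ins, r_vo_cam, C_bL, lever_b)
```

With a consistent camera position the innovation Z_r vanishes, confirming that the reduction removes the geometric offset rather than folding it into the filter as an apparent error.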
The position-error measurement matrix Hr has the block form Hr = (I 0 0 0 0),
where I is the identity matrix and 0 the zero matrix. The attitude error Za is computed as follows:
Za = ains - avo    (11)
Here ains is the attitude computed by INS navigation and avo is the attitude obtained from the visual navigation solution. Note that the attitude computed by the INS is the attitude of the carrier, whereas visual navigation yields the attitude of the camera, i.e., the three rotation Euler angles from the image-space coordinate system to the world coordinate system; the rotation matrix between the camera and the IMU must therefore be taken into account to reduce the camera attitude to the carrier attitude.
avo = (p′, r′, y′)T    (12)
where p′, r′, and y′ are the pitch, roll, and heading angles of the carrier reduced from the camera attitude. The attitude-error measurement matrix Ha has the block form Ha = (0 0 A 0 0),
in which the matrix A couples the platform misalignment angles to the attitude-angle errors; its elements are functions of the attitude angles.
Here p, r, and y are the pitch, roll, and heading angles of the carrier computed by inertial navigation. The functional model of the Kalman filter is thus established; filtering estimates the navigation-parameter errors and the sensor errors of the inertial navigation system, which are then compensated by feedback correction, so that the INS maintains high accuracy over a long period.
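A minimal sketch of this measurement update is given below, under two stated assumptions: the state ordering follows Eq. (6), and the coupling matrix A is approximated by the identity (the real A depends on the attitude angles). All numeric values are illustrative.

```python
import numpy as np

n = 15                                  # delta-r, delta-v, misalignment, d, b (Eq. 6)
H = np.zeros((6, n))
H[0:3, 0:3] = np.eye(3)                 # H_r block: picks the position-error states
H[3:6, 6:9] = np.eye(3)                 # H_a block with A ~ I: picks the misalignment states

def kf_update(x, P, z, H, R):
    """Standard Kalman measurement update: state and covariance correction."""
    S = H @ P @ H.T + R                 # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)      # Kalman gain
    x = x + K @ (z - H @ x)             # corrected error state
    P = (np.eye(len(x)) - K @ H) @ P    # corrected covariance
    return x, P

x = np.zeros(n)                         # predicted error state
P = np.eye(n)                           # predicted covariance (illustrative)
Z_r = np.array([0.10, -0.05, 0.02])     # r_ins - r_vo (lever-arm reduced)
Z_a = np.array([0.001, 0.0, -0.002])    # a_ins - a_vo (camera attitude reduced)
x, P = kf_update(x, P, np.concatenate([Z_r, Z_a]), H, np.eye(6) * 1e-4)
```

The estimated errors would then be fed back to correct the INS navigation parameters and sensor biases, as described in the text.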
4) The inertial navigation system corrects the visual system
The correction of the visual system by the inertial navigation system is implemented with sequential adjustment; Figure 3 shows the principle and procedure. Through the inertial navigation computation, the navigation parameters at each epoch are obtained, and from the error equation of the INS the variance-covariance matrix of the parameters is computed according to the law of covariance propagation. The position and attitude are then separated from the navigation parameters and, using the geometric relationship between the camera and the IMU, converted into observations of the camera position and attitude; sequential adjustment is then applied to optimize the navigation parameters of the visual system.
Sequential adjustment is generally used when observations arrive in stages: the system has been observed at different epochs, and a previous adjustment of the earlier observations has yielded parameter estimates together with their precision information (the variance-covariance matrix). The problem sequential adjustment solves is how to combine these previously obtained estimates and their precision with newly added observations to obtain improved estimates. Let the overall observation equation be L = AX + e.
Dividing the observation data into two groups, the observation equation can be written as L1 = A1X + e1 and L2 = A2X + e2.
Assuming the number of observations in the first group is sufficient, solving the first equation by least squares gives X1 = (A1ᵀ P1 A1)⁻¹ A1ᵀ P1 L1,
where P1 is the weight matrix of the observations in the first group. Solving the two equations jointly gives X = (A1ᵀ P1 A1 + A2ᵀ P2 A2)⁻¹ (A1ᵀ P1 L1 + A2ᵀ P2 L2),
where P2 is the weight matrix of the observations in the second group. Applying the matrix-inversion formula, the computational formula of sequential adjustment follows: the previous estimate X1 and its covariance are updated with the new observation group alone, without reprocessing the first group.
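Numerically, the sequential-adjustment update has the same recursive shape as a Kalman measurement update. A sketch under illustrative values:

```python
import numpy as np

def sequential_adjust(x1, Q1, A2, L2, P2):
    """Update a prior estimate x1 (with covariance Q1) using a new observation
    group L2 = A2 @ x + e2 that carries weight matrix P2."""
    S = A2 @ Q1 @ A2.T + np.linalg.inv(P2)   # combined uncertainty of the innovation
    K = Q1 @ A2.T @ np.linalg.inv(S)         # gain matrix
    x2 = x1 + K @ (L2 - A2 @ x1)             # improved estimate
    Q2 = (np.eye(len(x1)) - K @ A2) @ Q1     # improved covariance
    return x2, Q2

# Scalar example: a prior estimate 1.0 with unit variance and a new direct
# observation 3.0 with unit weight pull the estimate to their mean.
x2, Q2 = sequential_adjust(np.array([1.0]), np.eye(1),
                           np.eye(1), np.array([3.0]), np.eye(1))
```

The first group never needs to be revisited: all of its information survives in (x1, Q1), which is exactly why the method suits per-epoch fusion with the INS.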
We now describe how sequential adjustment realizes the correction of the visual system by the inertial navigation system. After the visual navigation solution is completed, the estimate Xvo of the camera parameters is available, and the precision assessment yields the variance-covariance matrix P(Xvo) of the parameters. The inertial navigation computation provides the navigation parameters Xins at the corresponding epoch, and the variance-covariance matrix of these parameters, denoted P(Xins), is computed from the INS error equation by the law of covariance propagation. The position rins and attitude ains are separated from Xins, and the sub-matrix P(rins, ains) relating to position and attitude is separated from P(Xins); taking the geometric relationship between the camera and the IMU into account, these are converted into observations of the visual parameters, which yields the functional model for the correction of the visual system by the inertial navigation system.
This model is rearranged into the form of an observation equation.
In this equation, the matrix B has the form
B = (I 0 0 0)    (22)
The stochastic model (variance-covariance matrix) of the observations is formed from P(Xvo) for the visual parameters and from P(rins, ains) for the INS-derived observations.
From the functional model and the stochastic model above, the correction formula of the visual system is derived.
Here xvo denotes the visual parameters obtained from the visual navigation solution; the adjustment yields the corrected visual parameters and the corrected variance-covariance matrix. With this computation, all parameters of the visual system are corrected, including the positions and attitudes of the left and right cameras and the object-space coordinates of all feature points.
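The correction step can be exercised on an illustrative 12-parameter vision state (camera pose block first, feature coordinates after), with B = (I 0 ...) selecting the pose block; every dimension and number below is an assumption made for the sketch.

```python
import numpy as np

n_vo = 12                                   # e.g. camera pose (6) + two feature points (6)
B = np.hstack([np.eye(6), np.zeros((6, n_vo - 6))])   # B = (I 0 0 0): selects the pose

x_vo = np.zeros(n_vo)                       # vision-derived parameter estimate
Q_vo = np.eye(n_vo)                         # P(Xvo); diagonal here, so only the pose
                                            # block is corrected (real cross-covariances
                                            # would propagate the update to the features)
L_ins = np.full(6, 0.5)                     # INS pose reduced to the camera
P_ins = np.eye(6) * 4.0                     # weight matrix from P(r_ins, a_ins)

S = B @ Q_vo @ B.T + np.linalg.inv(P_ins)   # combined uncertainty
K = Q_vo @ B.T @ np.linalg.inv(S)           # gain
x_corr = x_vo + K @ (L_ins - B @ x_vo)      # corrected visual parameters
Q_corr = (np.eye(n_vo) - K @ B) @ Q_vo      # corrected variance-covariance matrix
```

With a diagonal Q_vo the feature block is untouched; in the full system the cross-covariances in P(Xvo) carry the INS correction through to the object-space coordinates of the feature points as well.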
From a purely algorithmic point of view, steps 1 and 2 actually run in parallel, as do steps 3 and 4. That is, the visual system and the inertial navigation system first perform their navigation solutions separately to obtain the corresponding parameters; the two subsystems then assist each other, so that each subsystem is corrected and an optimal estimate is obtained. The optimal IMU estimate obtained in step 3 and the optimal camera estimate obtained in step 4 are both optimal estimates of the whole system, i.e., of the motion state of the carrier. Theoretically, neglecting computation errors, the two are equivalent; in other words, once the geometric relationship between the camera and the IMU is taken into account, they should agree.
To verify the reliability and stability of the method of the present invention, several experiments and data-processing runs were carried out; one of them is described below. The experiment took place in Beijing, with the carrier vehicle equipped with data-acquisition devices including a stereo camera and an IMU. A GPS receiver was also installed and differenced against a base-station GPS; the differential result reaches centimetre-level accuracy and therefore serves as the reference for assessing the accuracy of the method. Before the experiment, the calibration of the interior orientation elements and the relative exterior orientation elements of the stereo camera, as well as of the relative position and attitude between the camera and the IMU, had been completed. During the experiment, the exterior orientation elements of the camera were first initialized using landmark points with known absolute coordinates. After initialization, the carrier vehicle collected IMU data, image data, and GPS data in the experimental area. The IMU output rate is 100 Hz, its gyro drift is 0.005°/s, and its accelerometer bias is 10 mg; timed by the built-in clock of the IMU, the stereo camera acquires stereo images by synchronized exposure at 2 Hz, with horizontal and vertical fields of view of 42° and 31°, respectively.
The output rate of the GPS receiver is 1 Hz. In the experiment, the mobile platform travelled a total of about 1150 m over roughly 45 min and acquired about 4300 stereo image frames.
The experimental data were processed with two navigation methods. Method 1 is a pure visual navigation solution: when the visual solution fails, a constant-velocity motion model predicts the motion of the carrier to fill the gap. Method 2 combines visual navigation with inertial navigation. A pure inertial navigation computation was also attempted, but over the 45 min of data its navigation error reached the order of 10000 m and above, so pure inertial navigation is meaningless here and is not discussed further. The results of the two methods, together with the GPS reference trajectory, are shown in Figure 4.
In the legend of Figure 4, GPS denotes the differential GPS result, VO/INS the visual/inertial integrated navigation result, and VO the pure visual navigation result. The trajectory of the visual/inertial integrated navigation is very close to the GPS trajectory, and its accuracy is higher than that of pure visual navigation. Table 1 lists the errors of the two methods: over the 1150 m travelled, the visual/inertial integrated navigation accumulates a position error of 8.6 m, a relative accuracy of 0.75%.
Table 1 Errors of the different navigation methods
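The relative accuracy reported above is simply the accumulated position error divided by the distance travelled; a one-line check:

```python
# Relative accuracy = accumulated position error / distance travelled.
accumulated_error_m = 8.6      # accumulated position error from the text
distance_m = 1150.0            # total distance travelled
relative_accuracy = accumulated_error_m / distance_m
print(f"{relative_accuracy:.2%}")   # prints "0.75%"
```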
During the experiment, the carrier vehicle sometimes crossed bare road surfaces or turned through large angles, causing six short visual-navigation solution failures; during these, the integrated system relied entirely on inertial navigation to provide the navigation parameters. The VO/INS trajectory in Figure 4 shows no obvious loss of accuracy during these failures, demonstrating the reliability and high accuracy of the method.
Figures 5 and 6 show the innovations of the Kalman filter during the correction of the INS by the visual system. In the Kalman filter used in the method of the present invention, the innovation is the difference between the navigation parameters computed by inertial navigation and those solved by visual navigation.
Figure 5 shows that the maximum position difference is within 0.15 m and, in the vast majority of cases, within 0.05 m; Figure 6 shows that the maximum attitude difference is within 0.1° and, in the vast majority of cases, within 0.01°. This indicates that the filtering process is quite stable and that the visual and inertial navigation systems combine well. Together with the accuracy assessment against the absolute GPS reference above, this fully demonstrates the reliability and superiority of the proposed integration algorithm.
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN201210161621.9A CN103424114B (en) | 2012-05-22 | 2012-05-22 | A kind of full combined method of vision guided navigation/inertial navigation |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| CN103424114A CN103424114A (en) | 2013-12-04 |
| CN103424114B true CN103424114B (en) | 2016-01-20 |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| C06 | Publication | ||
| PB01 | Publication | ||
| C10 | Entry into substantive examination | ||
| SE01 | Entry into force of request for substantive examination | ||
| C14 | Grant of patent or utility model | ||
| GR01 | Patent grant | ||
| CF01 | Termination of patent right due to non-payment of annual fee |
Granted publication date: 20160120 Termination date: 20180522 |
|
| CF01 | Termination of patent right due to non-payment of annual fee |