CN103065323B - Subsection space aligning method based on homography transformational matrix - Google Patents
- Publication number: CN103065323B
- Authority: CN (China)
- Prior art keywords: millimeter-wave radar, distance
- Prior art date: 2013-01-14
- Legal status: Expired - Fee Related (the legal status is an assumption and is not a legal conclusion)
- Landscapes: Radar Systems Or Details Thereof (AREA)
Abstract
The invention discloses a segmented spatial alignment method based on homography transformation matrices. A large calibration distance is divided into segments, and a homography transformation matrix between the camera coordinate system and the millimeter-wave radar coordinate system is solved separately for each segment. This avoids the error introduced in the prior art by expressing the coordinate relationship between the two sensors with a single homography transformation matrix, so that spatial alignment can be achieved for target detection over a large calibration distance. The relationships between the different coordinate systems of the camera and the millimeter-wave radar are derived and finally represented by a homography transformation matrix N, which is then solved from target data acquired separately by the two sensors. This avoids estimating the camera intrinsic parameter matrix (composed of the scale factor, focal length, etc.) and the camera extrinsic parameter matrix (composed of the rotation matrix and translation vector), which greatly simplifies the computation and saves computation time.
Description
Technical Field
The invention relates to the technical field of multi-sensor information fusion for unmanned vehicles, and in particular to a segmented spatial alignment method based on a homography transformation matrix.
Background Art
An unmanned vehicle, also called an outdoor intelligent mobile robot, is a highly intelligent device that integrates environmental perception, dynamic decision-making and planning, and behavior control and execution. The speed and accuracy of its environmental perception are inseparable from multi-sensor information fusion technology. In multi-sensor information fusion, a computer makes full use of all sensor resources and, by reasonably managing and using the various measurements, combines complementary and redundant information in space and time according to some optimization criterion, producing a consistent interpretation or description of the observed environment together with new, fused results. In the environmental perception module, vision sensors and millimeter-wave radar are two commonly used sensors. A vision sensor has a wide detection range and can obtain the size and contour of targets in the environment, but it is easily affected by external factors and can miss targets. Millimeter-wave radar has high resolution and strong anti-interference capability and can accurately obtain the distance, relative speed, and azimuth of a target in all weather conditions, but it cannot recognize target shape or size. Fusing the information from these two complementary sensors therefore yields more comprehensive and reliable environmental information, and spatial alignment is the prerequisite for such fusion. The essence of spatial alignment is estimating the transformation matrix between the camera and radar coordinate systems. In the traditional spatial alignment method, points on a target are detected at different distance points within a 20-meter calibration range, yielding the target's coordinates in both the camera coordinate system and the radar coordinate system; from the data acquired by the two sensors, the camera intrinsic parameter matrix (scale factor, focal length, etc.) and the extrinsic parameter matrix (rotation matrix and translation vector) are estimated. This computation is cumbersome and prone to error. Moreover, when the transformation matrix is solved with this algorithm for targets beyond a 20-meter calibration distance, the large range produces huge errors and the spatial alignment fails.
Summary of the Invention
In view of this, the present invention provides a segmented spatial alignment method based on homography transformation matrices, which can achieve spatial alignment of the camera and millimeter-wave radar mounted on an unmanned vehicle over a large calibration distance range, while also simplifying the computation required to solve the homography transformation matrix.
The segmented spatial alignment method based on a homography transformation matrix of the present invention comprises the following steps:
Step 1: Establish the homography-transformation-matrix-based relationship between the camera coordinate system and the millimeter-wave radar coordinate system:
Define the camera image coordinate system O′uv, where O′ is at the upper-left corner of the camera imaging plane, the u-axis is parallel to the camera scan-line direction, and the v-axis is perpendicular to the scan-line direction.
Define O″″ρθ as the millimeter-wave radar polar coordinate system, where O″″ is the center of the radar surface, ρ is the straight-line distance between the target and the radar, and θ is the angle by which the target deviates from the centerline of the radar scanning plane. The relationship between the camera image coordinate system O′uv and the radar polar coordinate system O″″ρθ is then expressed as:

$$\beta\begin{bmatrix}u\\ v\\ 1\end{bmatrix}=N\begin{bmatrix}\rho\sin\theta\\ \rho\cos\theta\\ 1\end{bmatrix}$$

where N is the 3×3 homography transformation matrix between the two coordinate systems and β is a scale factor.
Step 2: Determine a suitable calibration distance between the unmanned vehicle and the calibration target:
Define O″″$X_rY_rZ_r$ as the millimeter-wave radar Cartesian coordinate system, where O″″ is the center of the radar surface; the $Y_r$ axis is the centerline of the radar scanning plane, perpendicular to the radar surface and pointing straight ahead; the $X_r$ axis is perpendicular to $Y_r$ and points to the right; and the $Z_r$ axis is perpendicular to the plane determined by $X_r$ and $Y_r$ and points upward.
The relationship between the millimeter-wave radar Cartesian coordinate system and the millimeter-wave radar polar coordinate system is then:

$$X_r=\rho\sin\theta,\qquad Y_r=\rho\cos\theta,\qquad Z_r=0$$
The projection of the distance between the calibration target and the unmanned vehicle onto the vertical axis $Y_r$ of the radar Cartesian coordinate system is called the calibration distance. Within the detection range of the millimeter-wave radar, a suitable calibration distance L is determined from the maximum speed of the unmanned vehicle. The calibration distance L is divided, from near to far, into a near range L1 and a far range L2; L1 is divided evenly into m1 segments and L2 evenly into m2 segments, ensuring that L1/m1 is smaller than L2/m2.
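This segmentation rule is easy to express in code. The following is a minimal Python sketch, assuming only the quantities defined above (the function names are illustrative, not part of the disclosure):

```python
def build_segments(l1, m1, l2, m2):
    """Split the calibration distance into m1 equal near-range segments
    and m2 equal far-range segments; returns (start, end) boundaries."""
    near = [(i * l1 / m1, (i + 1) * l1 / m1) for i in range(m1)]
    far = [(l1 + i * l2 / m2, l1 + (i + 1) * l2 / m2) for i in range(m2)]
    return near + far

def segment_index(y_r, segments):
    """Index of the segment containing calibration distance y_r, i.e. the
    projection of the target range onto the radar Y_r axis."""
    for idx, (lo, hi) in enumerate(segments):
        if lo <= y_r < hi:
            return idx
    raise ValueError("distance outside the calibrated range")
```

With the 50-meter embodiment described below, `build_segments(20.0, 4, 30.0, 3)` yields four 5 m near-range segments and three 10 m far-range segments, and L1/m1 = 5 < 10 = L2/m2 as required.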
Step 3: Collect images and data of the calibration target with the camera and the millimeter-wave radar mounted on the unmanned vehicle:
Place the calibration target at each of the segments into which the calibration distance L was divided in step 2; the millimeter-wave radar and the camera then detect the target at each of the m1+m2 distances. During detection, the target at each distance is divided evenly into m rows along the $Y_r$ direction, and each row is divided evenly into h subsegments along the $X_r$ direction. The millimeter-wave radar is controlled to acquire the coordinate data $(X_{rk}^M, Y_{rk}^M)$ of each subsegment, and the camera is controlled to capture the image data $f_k^M$ of each subsegment, where M = 1, ..., (m1+m2) and k = 1, 2, ..., mh.
Step 4: For the image data $f_k^M$ of each subsegment within each segment obtained by the camera in step 3, compute the centroid coordinates $(u_k^M, v_k^M)$ of the image.
Step 5: Solve the homography spatial transformation matrix representing the relationship between the millimeter-wave radar coordinate system and the camera coordinate system:
For each of the segments into which the calibration distance L was divided, the radar coordinate data $(X_{rk}^M, Y_{rk}^M)$ and the image centroids $(u_k^M, v_k^M)$ of all its subsegments form one data set. Substituting each data set into the homography relation of step 1 gives, for every correspondence, two linear equations in the nine entries of the segment's matrix; stacking the equations of a whole data set yields an overdetermined linear system, from which the homography transformation matrix $N_M$ of segment M is solved, for example by least squares.
Step 6: Achieve the spatial alignment of the vision sensor and the millimeter-wave radar:
According to the actual distance of the calibration target scanned by the millimeter-wave radar, determine which segment of step 2 that distance falls in, look up the homography spatial transformation matrix corresponding to that distance among the m1+m2 results computed in step 5, and thereby achieve spatial alignment.
In step 2, when the calibration distance is 50 meters, 0-20 m is the near range, divided evenly into 4 segments, and 20-50 m is the far range, divided evenly into 3 segments.
The method of computing the centroid coordinates of each subsegment image of the calibration target in step 4 is as follows:
S40. Manually select a candidate region containing the calibration target.
S41. Apply median filtering to the candidate-region image to remove noise.
S42. Apply Sobel edge detection to the candidate-region image to obtain a binarized edge image of the calibration target.
S43. In the pixel image coordinate system, find along the u-axis the pixel coordinates $u_{min}$, $u_{max}$ of the minimum and maximum u-coordinates in the calibration-target edge image, and along the v-axis the pixel coordinates $v_{min}$, $v_{max}$ of the minimum and maximum v-coordinates. Connect these four points with straight lines, clockwise or counterclockwise, to form a quadrilateral region Ω, and within Ω compute the centroid

$$u_k^M=\frac{\sum_{(i,j)\in\Omega} i\, f(i,j)}{\sum_{(i,j)\in\Omega} f(i,j)},\qquad v_k^M=\frac{\sum_{(i,j)\in\Omega} j\, f(i,j)}{\sum_{(i,j)\in\Omega} f(i,j)}$$

where f(i,j) is the (binarized) grey value at pixel (i,j).
The present invention has the following beneficial effects:
1) By segmenting a large calibration distance and solving a homography transformation matrix between the camera and millimeter-wave radar coordinate systems separately for each segment, the method avoids the error caused in the prior art by expressing the coordinate relationship between the two sensors with a single homography transformation matrix, and thus achieves spatial alignment for target detection over a large calibration distance.
2) By deriving the relationships among the different coordinate systems of the camera and the millimeter-wave radar and finally representing them with a single homography transformation matrix N, which is solved from the target data acquired by the two sensors, the method avoids estimating the camera intrinsic parameter matrix (scale factor, focal length, etc.) and the extrinsic parameter matrix (rotation matrix and translation vector), greatly simplifying the computation and saving computation time.
3) Because the sensors of an unmanned vehicle pay different degrees of attention to near and far targets, the near and far ranges are segmented with different fineness, which reduces the amount of computation while still guaranteeing spatial alignment.
4) In the image coordinate system, after the edge image of the target is obtained, the pixels with the extreme coordinates along the two axes are found, the four pixels are connected into a quadrilateral, and the centroid of the quadrilateral is computed. This quickly determines the centroid of each target edge image, simplifying the computation while making the spatial alignment more accurate.
Brief Description of the Drawings
Figure 1 is a schematic diagram of the camera pinhole model;
Figure 2 is a schematic diagram of the millimeter-wave radar coordinate systems;
Figure 3 is a schematic diagram of the mapping relationship between the image coordinate system and the millimeter-wave radar Cartesian coordinate system.
Detailed Description of the Embodiments
The present invention is described in detail below with reference to the accompanying drawings and embodiments.
The present invention provides a segmented spatial alignment method based on a homography transformation matrix, comprising the following steps:
Step 1: Establish the homography-transformation-matrix-based relationship between the camera coordinate system and the millimeter-wave radar coordinate system:
As shown in Figure 1, $OX_cY_cZ_c$ denotes the camera coordinate system, with origin O at the optical center of the camera; the $X_c$ axis is parallel to the camera scan-line direction and points in the direction of increasing pixel index; the $Y_c$ axis is perpendicular to the scan-line direction and points in the direction of increasing scan line; the $Z_c$ axis is perpendicular to the imaging plane and points along the camera line of sight. O′uv denotes the image coordinate system in pixels, with O′ at the upper-left corner of the imaging plane; the u-axis is parallel to $X_c$ and the v-axis is parallel to $Y_c$. O″xy denotes the image coordinate system in millimeters, with O″ at the intersection of the optical axis with the imaging plane; the x-axis is parallel to $X_c$ and the y-axis is parallel to $Y_c$. f is the focal length of the camera and I denotes the imaging plane. Let the coordinates of a point P in the coordinate systems $OX_cY_cZ_c$, O″xy and O′uv be $(X_c, Y_c, Z_c)$, (x, y) and (u, v), respectively. The geometric proportionality of P between $OX_cY_cZ_c$ and O″xy is $x/f=X_c/Z_c$, $y/f=Y_c/Z_c$, which in homogeneous form is:

$$Z_c\begin{bmatrix}x\\ y\\ 1\end{bmatrix}=\begin{bmatrix}f&0&0&0\\ 0&f&0&0\\ 0&0&1&0\end{bmatrix}\begin{bmatrix}X_c\\ Y_c\\ Z_c\\ 1\end{bmatrix}\tag{1}$$
From the scaling and translation relationship of point P between the coordinate systems O″xy and O′uv, $x=S(u-u_0)$, $y=S(v-v_0)$, expressed in homogeneous form:

$$\begin{bmatrix}x\\ y\\ 1\end{bmatrix}=\begin{bmatrix}S&0&-Su_0\\ 0&S&-Sv_0\\ 0&0&1\end{bmatrix}\begin{bmatrix}u\\ v\\ 1\end{bmatrix}\tag{2}$$
where S is the scale factor and $(u_0, v_0)$ are the coordinates of the origin O″ of coordinate system O″xy in the coordinate system O′uv. Assume a world coordinate system O″′$X_wY_wZ_w$ in which point P has coordinates $(X_w, Y_w, Z_w)$; the relationship between the two coordinate systems is expressed in homogeneous form as:

$$\begin{bmatrix}X_c\\ Y_c\\ Z_c\\ 1\end{bmatrix}=\begin{bmatrix}R&T\\ \mathbf{0}^T&1\end{bmatrix}\begin{bmatrix}X_w\\ Y_w\\ Z_w\\ 1\end{bmatrix}\tag{3}$$
where R and T denote the rotation matrix and translation vector, respectively. Combining equations (1), (2) and (3) gives:

$$\beta\begin{bmatrix}u\\ v\\ 1\end{bmatrix}=\begin{bmatrix}f&0&Su_0&0\\ 0&f&Sv_0&0\\ 0&0&S&0\end{bmatrix}\begin{bmatrix}R&T\\ \mathbf{0}^T&1\end{bmatrix}\begin{bmatrix}X_w\\ Y_w\\ Z_w\\ 1\end{bmatrix}\tag{4}$$

where $\beta=Z_c\cdot S$.
As shown in Figure 2, O″″$X_rY_rZ_r$ denotes the millimeter-wave radar Cartesian coordinate system, where O″″ is the center of the radar surface; the $Y_r$ axis is the centerline of the radar scanning plane, perpendicular to the radar surface and pointing straight ahead; the $X_r$ axis is perpendicular to $Y_r$ and points to the right; the $Z_r$ axis is perpendicular to the plane determined by $X_r$ and $Y_r$ and points upward. O″″ρθ denotes the radar polar coordinate system, whose origin coincides with that of O″″$X_rY_rZ_r$; ρ is the straight-line distance between the target and the radar; θ is the angle by which the target deviates from the centerline of the radar scanning plane. The coordinates of point P in O″″ρθ and O″″$X_rY_rZ_r$ are (ρ, θ) and $(X_r, Y_r, Z_r)$, respectively. Since the scanning plane of the radar is a two-dimensional plane in O″″ρθ, we have $Z_r=0$, and the trigonometric relationship of P between O″″ρθ and O″″$X_rY_rZ_r$ is:

$$X_r=\rho\sin\theta,\qquad Y_r=\rho\cos\theta,\qquad Z_r=0\tag{5}$$
Taking the radar Cartesian coordinate system as the world coordinate system and using it as the intermediate variable, combining equations (4) and (5), the relationship between the image coordinate system O′uv and the radar polar coordinate system O″″ρθ is expressed as:

$$\beta\begin{bmatrix}u\\ v\\ 1\end{bmatrix}=\begin{bmatrix}f&0&Su_0&0\\ 0&f&Sv_0&0\\ 0&0&S&0\end{bmatrix}\begin{bmatrix}R&T\\ \mathbf{0}^T&1\end{bmatrix}\begin{bmatrix}\rho\sin\theta\\ \rho\cos\theta\\ 0\\ 1\end{bmatrix}\tag{6}$$
As shown in Figure 3, when the camera and the millimeter-wave radar observe the same target, equation (6) projects the target point scanned by the radar into the image captured by the camera. However, this requires estimating internal camera parameters such as the scale factor S and focal length f, as well as external parameters such as the rotation matrix R and translation vector T, which makes the computation complicated. To simplify it, an equivalent transformation N is used to express the relationship between O′uv and O″″ρθ:

$$\beta\begin{bmatrix}u\\ v\\ 1\end{bmatrix}=N\begin{bmatrix}\rho\sin\theta\\ \rho\cos\theta\\ 1\end{bmatrix},\qquad N=\begin{bmatrix}n_{11}&n_{12}&n_{13}\\ n_{21}&n_{22}&n_{23}\\ n_{31}&n_{32}&n_{33}\end{bmatrix}\tag{7}$$

where N is a 3×3 homography transformation matrix that absorbs the intrinsic and extrinsic parameters.
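Once N is known, applying equation (7) is a single matrix product. A minimal Python sketch, assuming a given 3×3 matrix N (the function name is illustrative):

```python
import numpy as np

def radar_to_pixel(rho, theta, N):
    """Project a radar detection (rho in meters, theta in radians) into
    pixel coordinates via eq. (7): beta*[u, v, 1]^T = N*[Xr, Yr, 1]^T,
    with Xr = rho*sin(theta), Yr = rho*cos(theta) from eq. (5)."""
    p = np.array([rho * np.sin(theta), rho * np.cos(theta), 1.0])
    q = N @ p
    return q[0] / q[2], q[1] / q[2]  # dividing by q[2] removes the scale beta
```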
Step 2: Determine a suitable calibration distance and segment it reasonably:
The calibration target distance ρ measured in the radar polar coordinate system is called the calibration distance. The calibration distance L is divided, from near to far, into a near range L1 and a far range L2; L1 is divided evenly into m1 segments and L2 evenly into m2 segments. During detection, an unmanned vehicle needs to detect nearby targets precisely, while distant targets only require a rough judgment; the far-range segments may therefore be larger than the near-range segments, i.e. L1/m1 is smaller than L2/m2.
In this embodiment, the maximum specified speed of the unmanned vehicle is 36 km/h. Since the distance region that the vehicle attends to while driving is related to its speed, 50 m is determined to be a suitable calibration distance. Experiments show that within this 50 m calibration distance a single homography spatial transformation matrix causes the alignment to fail. Therefore, according to the vehicle's speed requirement and the distance regions of interest, the calibration distance 0-20 m is divided into 4 segments of 5 m each, and 20-50 m into 3 segments of 10 m each.
Step 3: Collect images and data of the calibration target with the camera and the millimeter-wave radar mounted on the unmanned vehicle:
To guarantee the accuracy of the spatial alignment, five rows are taken vertically within each segment, and seven groups of corresponding target images and data are taken horizontally in each row. A camera with an IEEE1394 interface and a millimeter-wave radar with a CAN-bus interface, both mounted on the unmanned vehicle, collect the images and data of the calibration target in the same scene and transmit them to an industrial PC.
Step 4: For the image data $f_k^M$ of each subsegment within each segment obtained by the camera in step 3, compute the centroid coordinates $(u_k^M, v_k^M)$ of the image.
In general, the clustering method used by the millimeter-wave radar is the nearest-neighbor method; we therefore take the calibration-target data scanned by the radar in the radar coordinate system to correspond to the position of the centroid in the image coordinate system. For a digital image f, f(i,j) denotes the grey value and (i,j) a point in the image region. The centroid coordinates of the calibration target in the image are computed in the following steps:
S40. Manually select a candidate region containing the calibration target, to reduce the interference of the background with the centroid computation.
S41. Apply median filtering to the candidate region to remove noise from the image. The median filter is given by f(i,j) = Median{f(i−k, j−l) | (k,l) ∈ ω}, where ω is the 3×3 neighborhood of the pixel.
S42. Apply Sobel edge detection to the candidate-region image to obtain an edge image of the region containing the calibration target. The Sobel operator uses the standard horizontal and vertical templates

$$S_x=\begin{bmatrix}-1&0&1\\ -2&0&2\\ -1&0&1\end{bmatrix},\qquad S_y=\begin{bmatrix}-1&-2&-1\\ 0&0&0\\ 1&2&1\end{bmatrix}$$
S43. In the pixel image coordinate system, find along the u-axis the pixel coordinates $u_{min}$, $u_{max}$ of the minimum and maximum u-coordinates in the calibration-target edge image, and along the v-axis the pixel coordinates $v_{min}$, $v_{max}$ of the minimum and maximum v-coordinates. Connect these four points with straight lines, clockwise or counterclockwise, to form a quadrilateral region Ω, and within Ω compute the centroid

$$u_k^M=\frac{\sum_{(i,j)\in\Omega} i\, f(i,j)}{\sum_{(i,j)\in\Omega} f(i,j)},\qquad v_k^M=\frac{\sum_{(i,j)\in\Omega} j\, f(i,j)}{\sum_{(i,j)\in\Omega} f(i,j)}$$

where f(i,j) is the (binarized) grey value at pixel (i,j).
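Steps S40-S43 map directly onto a few OpenCV calls. The following Python sketch is one possible implementation under stated assumptions: `roi` is the manually selected grayscale candidate region, and the binarization threshold (half the maximum gradient magnitude) is an illustrative choice not fixed by the text:

```python
import cv2
import numpy as np

def target_centroid(roi):
    """Centroid (u, v) of the calibration target in a grayscale ROI,
    following S40-S43: median filter, Sobel edges, binarization, then
    the centroid of the quadrilateral spanned by the extreme edge pixels."""
    smoothed = cv2.medianBlur(roi, 3)                    # S41: 3x3 median filter
    gx = cv2.Sobel(smoothed, cv2.CV_32F, 1, 0, ksize=3)  # S42: Sobel templates
    gy = cv2.Sobel(smoothed, cv2.CV_32F, 0, 1, ksize=3)
    mag = cv2.magnitude(gx, gy)
    edges = (mag > 0.5 * mag.max()).astype(np.uint8)     # binarized edge image
    vs, us = np.nonzero(edges)                           # v = row, u = column
    # S43: extreme points along u and v, connected into a quadrilateral
    quad = np.array([[us.min(), vs[us.argmin()]],        # leftmost point
                     [us[vs.argmin()], vs.min()],        # topmost point
                     [us.max(), vs[us.argmax()]],        # rightmost point
                     [us[vs.argmax()], vs.max()]],       # bottommost point
                    dtype=np.int32)
    mask = np.zeros_like(edges)
    cv2.fillConvexPoly(mask, quad, 1)                    # region Omega
    m = cv2.moments(mask, binaryImage=True)
    return m["m10"] / m["m00"], m["m01"] / m["m00"]      # centroid (u, v)
```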
Step 5: Solve the homography spatial transformation matrix representing the relationship between the millimeter-wave radar coordinate system and the camera coordinate system:
For each of the segments into which the calibration distance L was divided, the radar coordinate data $(X_{rk}^M, Y_{rk}^M)$ and the image centroids $(u_k^M, v_k^M)$ of all its subsegments form one data set. Substituting each data set into equations (5) and (7) gives, for every correspondence k of segment M:

$$\beta_k\begin{bmatrix}u_k^M\\ v_k^M\\ 1\end{bmatrix}=N_M\begin{bmatrix}X_{rk}^M\\ Y_{rk}^M\\ 1\end{bmatrix}$$

Eliminating the scale factor $\beta_k$ yields two linear equations in the nine entries of $N_M$ per point; stacking the equations of the whole data set gives an overdetermined linear system, from which the homography spatial transformation matrix $N_M$ of segment M (M = 1, 2, ..., m1+m2) is solved, for example by least squares.
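A minimal Python sketch of this per-segment estimation, using the standard direct-linear-transform (DLT) formulation and an SVD to solve the stacked homogeneous system (the function name is illustrative):

```python
import numpy as np

def solve_homography(radar_pts, pixel_pts):
    """Estimate the 3x3 matrix N of one segment from corresponding radar
    points (Xr, Yr) and image centroids (u, v): each pair contributes two
    rows of the homogeneous system A n = 0 obtained from eq. (7) after
    eliminating beta; the SVD gives the least-squares solution."""
    rows = []
    for (xr, yr), (u, v) in zip(radar_pts, pixel_pts):
        rows.append([xr, yr, 1, 0, 0, 0, -u * xr, -u * yr, -u])
        rows.append([0, 0, 0, xr, yr, 1, -v * xr, -v * yr, -v])
    A = np.asarray(rows, dtype=float)
    _, _, vt = np.linalg.svd(A)
    return vt[-1].reshape(3, 3)  # right singular vector of smallest singular value
```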
Step 6: Achieve the spatial alignment of the vision sensor and the millimeter-wave radar:
According to the actual distance of the calibration target scanned by the millimeter-wave radar, first determine which segment of step 2 the distance falls in, then apply the homography spatial transformation matrix $N_M$ (M = 1, 2, ..., 7) of that segment computed in step 5 to achieve spatial alignment.
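Putting the pieces together, an end-to-end sketch that reuses the helper functions above; `N_list` stands for a hypothetical list of the seven per-segment matrices from step 5, and the detection values are assumed for illustration:

```python
# Per-segment lookup and projection, following step 6.
segments = build_segments(20.0, 4, 30.0, 3)  # 50 m embodiment: 4 x 5 m + 3 x 10 m
rho, theta = 23.4, np.deg2rad(-2.0)          # one radar detection (assumed values)
y_r = rho * np.cos(theta)                    # calibration distance along Y_r, eq. (5)
N_M = N_list[segment_index(y_r, segments)]   # homography of the matching segment
u, v = radar_to_pixel(rho, theta, N_M)       # radar target projected into the image
```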
In summary, the above is only a preferred embodiment of the present invention and is not intended to limit its scope of protection. Any modification, equivalent replacement, improvement, etc. made within the spirit and principles of the present invention shall be included within the scope of protection of the present invention.
Claims (3)
Priority Applications (1)

| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN201310013045.8A | 2013-01-14 | 2013-01-14 | Subsection space aligning method based on homography transformational matrix |
Publications (2)

| Publication Number | Publication Date |
|---|---|
| CN103065323A | 2013-04-24 |
| CN103065323B | 2015-07-15 |
Family

ID=48107940