
CN103065323B - Subsection space aligning method based on homography transformational matrix - Google Patents


Info

Publication number
CN103065323B
CN103065323B (application CN201310013045.8A)
Authority
CN
China
Prior art keywords
wave radar
distance
millimeter wave
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN201310013045.8A
Other languages
Chinese (zh)
Other versions
CN103065323A (en)
Inventor
付梦印
靳璐
杨毅
宗民
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Institute of Technology BIT
Original Assignee
Beijing Institute of Technology BIT
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Institute of Technology BIT filed Critical Beijing Institute of Technology BIT
Priority to CN201310013045.8A priority Critical patent/CN103065323B/en
Publication of CN103065323A publication Critical patent/CN103065323A/en
Application granted granted Critical
Publication of CN103065323B publication Critical patent/CN103065323B/en

Landscapes

  • Radar Systems Or Details Thereof (AREA)

Abstract

The invention discloses a segmented spatial alignment method based on a homography transformation matrix. By dividing a large calibration distance into segments and solving, for each segment, the homography transformation matrix between the camera and millimeter-wave radar coordinate systems, the method avoids the error that arises in the prior art when a single homography transformation matrix is used to express the coordinate relationship between the two sensors over the whole range, and thereby achieves spatial alignment for target detection at large calibration distances. By deriving the relationships between the different coordinate systems of the camera and the millimeter-wave radar and finally representing the relationship between the two with a homography transformation matrix N, solved directly from the target data acquired by the two sensors, the method avoids estimating the camera intrinsic parameter matrix (scale factor, focal length, etc.) and the camera extrinsic parameter matrix (rotation matrix and translation vector), which greatly simplifies the computation and saves computing time.

Description

A Segmented Spatial Alignment Method Based on a Homography Transformation Matrix

Technical Field

The invention relates to the technical field of multi-sensor information fusion for unmanned vehicles, and in particular to a segmented spatial alignment method based on a homography transformation matrix.

Background Art

Unmanned vehicles, also known as outdoor intelligent mobile robots, are highly intelligent devices that integrate environmental perception, dynamic decision-making and planning, and behavior control and execution. The speed and accuracy of their environmental perception are inseparable from multi-sensor information fusion technology. In multi-sensor information fusion, a computer makes full use of the available sensor resources and, by reasonably managing and using the various measurements, combines complementary and redundant information in space and time according to some optimization criterion, producing a consistent interpretation or description of the observed environment together with new fusion results. In the environmental perception module, vision sensors and millimeter-wave radar are the two most commonly used sensors. A vision sensor has a wide detection range and can obtain the size and contour of targets in the environment, but it is easily affected by external factors and may miss targets. Millimeter-wave radar has high resolution and strong anti-interference capability and can accurately obtain the distance, relative speed, and azimuth of a target under all weather conditions, but it cannot identify a target's shape and size. Fusing the two sources therefore exploits their complementary characteristics and yields more comprehensive and reliable environmental information, and spatial alignment is the prerequisite for this fusion. The essence of spatial alignment is to estimate the transformation matrix between the camera and radar coordinate systems. In the traditional spatial alignment method, points on a target are detected at different distance points within a 20 m calibration range, giving the target's coordinates in both the camera coordinate system and the radar coordinate system; from the data acquired by the two sensors, the camera intrinsic parameter matrix (scale factor, focal length, etc.) and the camera extrinsic parameter matrix (rotation matrix and translation vector) are then estimated. This calculation is cumbersome and easily introduces errors. Moreover, when the above algorithm is used to solve the transformation matrix for targets beyond a 20 m calibration distance, the larger range produces large errors and spatial alignment fails.

Summary of the Invention

In view of this, the present invention provides a segmented spatial alignment method based on a homography transformation matrix, which can achieve spatial alignment between the camera and the millimeter-wave radar mounted on an unmanned vehicle over a large calibration distance range, while also simplifying the computation of the homography transformation matrix.

The segmented spatial alignment method based on a homography transformation matrix of the present invention comprises the following steps:

Step 1: Establish the relationship, based on a homography transformation matrix, between the camera coordinate system and the millimeter-wave radar coordinate system.

Define the image coordinate system O′uv of the camera, where O′ is at the upper-left corner of the camera imaging plane, the u-axis is parallel to the camera scan-line direction, and the v-axis is perpendicular to the camera scan-line direction.

Define O″″ρθ as the millimeter-wave radar polar coordinate system, where O″″ is the center of the radar surface, ρ is the straight-line distance between the target and the radar, and θ is the angle by which the target deviates from the centerline of the radar scanning plane. The relationship between the camera image coordinate system O′uv and the radar polar coordinate system O″″ρθ is then expressed as:

$$\begin{bmatrix} u \\ v \\ 1 \end{bmatrix} = N \begin{bmatrix} \rho \sin\theta \\ \rho \cos\theta \\ 1 \end{bmatrix} \qquad (7)$$

where $N = \begin{bmatrix} n_{11} & n_{12} & n_{13} \\ n_{21} & n_{22} & n_{23} \\ n_{31} & n_{32} & n_{33} \end{bmatrix}$ is defined as the homography transformation matrix.
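As a minimal sketch of relation (7), assuming the 3×3 matrix N has already been estimated (the function name and the final normalization step are illustrative, not prescribed by the patent):

```python
import numpy as np

def radar_to_pixel(N: np.ndarray, rho: float, theta: float) -> tuple[float, float]:
    """Map a millimeter-wave radar return (rho, theta) to image pixel (u, v)
    via the homography transformation matrix N of equation (7).

    theta is in radians; rho is in the radar's range unit (e.g., meters).
    """
    p = np.array([rho * np.sin(theta), rho * np.cos(theta), 1.0])
    q = N @ p
    # Equation (10) drives the third component toward 1 on the calibration
    # data; dividing by it keeps the mapping consistent away from that data.
    return q[0] / q[2], q[1] / q[2]
```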

Step 2: Determine a suitable calibration distance between the unmanned vehicle and the calibration target.

Define O″″$X_rY_rZ_r$ as the millimeter-wave radar rectangular coordinate system, where O″″ is the center of the radar surface; the $Y_r$ axis is the centerline of the radar scanning plane, perpendicular to the radar surface and pointing straight ahead; the $X_r$ axis is perpendicular to $Y_r$ and points to the right; and the $Z_r$ axis is perpendicular to the plane determined by $X_r$ and $Y_r$ and points upward.

The relationship between the radar rectangular coordinate system and the radar polar coordinate system is then:

$$\begin{bmatrix} X_r \\ Y_r \\ 1 \end{bmatrix} = \begin{bmatrix} \rho \sin\theta \\ \rho \cos\theta \\ 1 \end{bmatrix} \qquad (7')$$

The projection of the distance between the calibration target and the unmanned vehicle onto the longitudinal axis $Y_r$ of the radar rectangular coordinate system is called the calibration distance. Within the detection range of the radar, a suitable calibration distance L is determined according to the maximum speed of the unmanned vehicle. The calibration distance L is divided, from near to far, into a near range L1 and a far range L2; the near range L1 is divided evenly into m1 segments and the far range L2 into m2 segments, ensuring that L1/m1 is smaller than L2/m2.
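A small sketch of this segmentation, under the stated constraint L1/m1 < L2/m2 (helper names are illustrative):

```python
import numpy as np

def segment_boundaries(L1: float, m1: int, L2: float, m2: int) -> np.ndarray:
    """Boundaries of the m1 + m2 calibration segments along the Y_r axis."""
    assert L1 / m1 < L2 / m2, "near segments must be finer than far segments"
    near = np.linspace(0.0, L1, m1 + 1)            # m1 equal near segments
    far = np.linspace(L1, L1 + L2, m2 + 1)[1:]     # m2 equal far segments
    return np.concatenate([near, far])

def segment_index(boundaries: np.ndarray, y_r: float) -> int:
    """0-based index of the segment containing calibration distance y_r."""
    M = int(np.searchsorted(boundaries, y_r, side="right") - 1)
    return min(max(M, 0), len(boundaries) - 2)     # clamp to valid segments
```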

Step 3: Collect images and data of the calibration target with the camera and the millimeter-wave radar mounted on the unmanned vehicle.

Place the calibration target in turn at each of the segments into which the calibration distance L was divided in Step 2. The millimeter-wave radar and the camera detect the target at each of the above m1+m2 segment distances. During detection, for the target at each segment distance, the target is divided evenly into m rows along the $Y_r$ axis, and each row is divided evenly into h small sections along the $X_r$ axis. The radar is controlled to acquire the coordinate data $(X_{rk}^M, Y_{rk}^M)$ of each small section, and the camera is controlled to capture the image data $f_k^M$ of each small section, where $M = 1, \ldots, (m1+m2)$ and $k = 1, 2, \ldots, mh$.

Step 4: For the image data $f_k^M$ of each small section within each segment obtained by the camera in Step 3, compute the centroid coordinates $(u_k^M, v_k^M)$ of the image.

Step 5: Solve the homography spatial transformation matrix expressing the relationship between the millimeter-wave radar coordinate system and the camera coordinate system.

For each segment of the calibration distance L, the radar coordinate data $(X_{rk}^M, Y_{rk}^M)$ and the camera image centroid coordinates $(u_k^M, v_k^M)$ of all small sections form the data set for that segment. Substituting each data set into equations (7) and (7′) gives:

$$\begin{bmatrix} u_1^M \\ \vdots \\ u_k^M \end{bmatrix} = \begin{bmatrix} X_{r1}^M & Y_{r1}^M & 1 \\ \vdots & \vdots & \vdots \\ X_{rk}^M & Y_{rk}^M & 1 \end{bmatrix} \begin{bmatrix} n_{11}^M \\ n_{12}^M \\ n_{13}^M \end{bmatrix} \qquad (8)$$

$$\begin{bmatrix} v_1^M \\ \vdots \\ v_k^M \end{bmatrix} = \begin{bmatrix} X_{r1}^M & Y_{r1}^M & 1 \\ \vdots & \vdots & \vdots \\ X_{rk}^M & Y_{rk}^M & 1 \end{bmatrix} \begin{bmatrix} n_{21}^M \\ n_{22}^M \\ n_{23}^M \end{bmatrix} \qquad (9)$$

and
$$\begin{bmatrix} 1 \\ \vdots \\ 1 \end{bmatrix} = \begin{bmatrix} X_{r1}^M & Y_{r1}^M & 1 \\ \vdots & \vdots & \vdots \\ X_{rk}^M & Y_{rk}^M & 1 \end{bmatrix} \begin{bmatrix} n_{31}^M \\ n_{32}^M \\ n_{33}^M \end{bmatrix} \qquad (10)$$

Define $P^M = \begin{bmatrix} X_{r1}^M & Y_{r1}^M & 1 \\ \vdots & \vdots & \vdots \\ X_{rk}^M & Y_{rk}^M & 1 \end{bmatrix}$, $N^M = \begin{bmatrix} n_{11}^M & n_{12}^M & n_{13}^M & n_{21}^M & n_{22}^M & n_{23}^M & n_{31}^M & n_{32}^M & n_{33}^M \end{bmatrix}^T$, $U^M = \begin{bmatrix} u_1^M & \cdots & u_k^M \end{bmatrix}^T$, $V^M = \begin{bmatrix} v_1^M & \cdots & v_k^M \end{bmatrix}^T$, and $I_{k \times 1} = \begin{bmatrix} 1 & \cdots & 1 \end{bmatrix}^T$. The least-squares solution of the homography spatial transformation matrix $N^M$ can then be expressed as $N^M = \begin{bmatrix} N_1^{M\,T} & N_2^{M\,T} & N_3^{M\,T} \end{bmatrix}^T$, where $N_1^M$, $N_2^M$, and $N_3^M$ are, respectively, $N_1^M = (P^{M\,T} P^M)^{-1} P^{M\,T} U^M$, $N_2^M = (P^{M\,T} P^M)^{-1} P^{M\,T} V^M$, and $N_3^M = (P^{M\,T} P^M)^{-1} P^{M\,T} I_{k \times 1}$.

Step 6: Achieve spatial alignment of the vision sensor and the millimeter-wave radar.

According to the actual distance of the calibration target scanned by the millimeter-wave radar, determine which segment from Step 2 the distance falls in, and look up the homography spatial transformation matrix corresponding to that distance among the m1+m2 results computed in Step 5, thereby achieving spatial alignment.

In Step 2, when the calibration distance is 50 meters, 0–20 m is the near range, divided evenly into 4 segments, and 20–50 m is the far range, divided evenly into 3 segments.

The method for computing the centroid coordinates of each small-section image of the calibration target in Step 4 is as follows:

S40: Manually select a candidate region containing the calibration target.

S41: Apply median filtering to the candidate region image to remove noise in the image.

S42: Apply Sobel edge detection to the candidate region image to obtain a binarized edge image of the calibration target.

S43: In the pixel image coordinate system, find along the u-axis the pixel coordinates $u_{min}$ and $u_{max}$ of the minimum and maximum u-coordinates in the calibration target edge image, and along the v-axis the pixel coordinates $v_{min}$ and $v_{max}$ of the minimum and maximum v-coordinates. Connect these four points with straight lines, clockwise or counterclockwise, to form a quadrilateral region. Within this quadrilateral region, compute the centroid coordinates $(u_k^M, v_k^M)$ of the calibration target using $u_k^M = \frac{\sum_i \sum_j i\, f_k^M(i,j)}{\sum_i \sum_j f_k^M(i,j)}$ and $v_k^M = \frac{\sum_i \sum_j j\, f_k^M(i,j)}{\sum_i \sum_j f_k^M(i,j)}$, where $f_k^M(i,j)$ denotes the gray value of pixel $(i,j)$ in the quadrilateral region corresponding to the k-th small section of the target at the M-th distance segment.
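A minimal numpy sketch of the S43 centroid formula, assuming `f` is the grayscale patch already masked to the quadrilateral region (zero outside it), with array rows running along the v-axis and columns along the u-axis:

```python
import numpy as np

def intensity_centroid(f: np.ndarray) -> tuple[float, float]:
    """Intensity-weighted centroid (u, v) of a masked grayscale patch f."""
    j, i = np.indices(f.shape)   # j: row index (v-axis), i: column index (u-axis)
    total = f.sum()
    return float((i * f).sum() / total), float((j * f).sum() / total)
```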

The present invention has the following beneficial effects:

1) By dividing a large calibration distance into segments and solving the homography transformation matrix between the camera and radar coordinate systems separately for each segment, the method avoids the error caused in the prior art by expressing the coordinate relationship between the two sensors with a single homography transformation matrix, and thus achieves spatial alignment for target detection at large calibration distances.

2) By deriving the relationships between the different coordinate systems of the camera and the millimeter-wave radar, finally representing the relationship between the two coordinate systems with a homography transformation matrix N, and solving N directly from the target data acquired by the two sensors, the method avoids estimating the camera intrinsic parameter matrix (scale factor, focal length, etc.) and the camera extrinsic parameter matrix (rotation matrix and translation vector), greatly simplifying the computation and saving computing time.

3) Because the sensors on an unmanned vehicle pay different degrees of attention to near and far targets, the near and far ranges are segmented with different granularity, which reduces the amount of computation while preserving the spatial alignment.

4) In the image coordinate system, after the edge image of the target is obtained, the pixels with the minimum and maximum coordinates along the two axes are found, the four pixels are joined into a quadrilateral, and the centroid of that quadrilateral region is computed. This method quickly determines the centroid of each target edge image, simplifying the computation while making the spatial alignment more accurate.

Brief Description of the Drawings

Fig. 1 is a schematic diagram of the camera pinhole model;

Fig. 2 is a schematic diagram of the millimeter-wave radar coordinate systems;

Fig. 3 is a schematic diagram of the mapping relationship between the image coordinate system and the millimeter-wave radar rectangular coordinate system.

Detailed Description of the Embodiments

The present invention is described in detail below with reference to the accompanying drawings and an embodiment.

The present invention provides a segmented spatial alignment method based on a homography transformation matrix, comprising the following steps:

Step 1: Establish the relationship, based on a homography transformation matrix, between the camera coordinate system and the millimeter-wave radar coordinate system.

As shown in Fig. 1, $OX_cY_cZ_c$ denotes the camera coordinate system, with origin O at the optical center of the camera; the $X_c$ axis is parallel to the camera scan-line direction and points in the direction of increasing pixel index; the $Y_c$ axis is perpendicular to the scan-line direction and points in the direction of increasing scan line; the $Z_c$ axis is perpendicular to the imaging plane and points along the camera's line of sight. O′uv denotes the image coordinate system in pixels, with O′ at the upper-left corner of the imaging plane; the u-axis is parallel to $X_c$ and the v-axis is parallel to $Y_c$. O″xy denotes the image coordinate system in millimeters, with O″ at the principal point of the imaging plane; the x-axis is parallel to $X_c$ and the y-axis is parallel to $Y_c$. f is the focal length of the camera and I denotes the imaging plane. Let the coordinates of a point P in the coordinate systems $OX_cY_cZ_c$, O″xy, and O′uv be $(X_c, Y_c, Z_c)$, $(x, y)$, and $(u, v)$, respectively. From the geometric proportionality of P between $OX_cY_cZ_c$ and O″xy, $x = fX_c/Z_c$ and $y = fY_c/Z_c$; expressing this relationship in homogeneous form:

$$Z_c \begin{bmatrix} x \\ y \\ 1 \end{bmatrix} = \begin{bmatrix} f & 0 & 0 & 0 \\ 0 & f & 0 & 0 \\ 0 & 0 & 1 & 0 \end{bmatrix} \begin{bmatrix} X_c \\ Y_c \\ Z_c \\ 1 \end{bmatrix} \qquad (1)$$

From the scaling and translation relationship of P between O″xy and O′uv, $x = S(u - u_0)$ and $y = S(v - v_0)$; in homogeneous form:

$$\begin{bmatrix} x \\ y \\ 1 \end{bmatrix} = S \begin{bmatrix} 1 & 0 & -u_0 \\ 0 & 1 & -v_0 \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} u \\ v \\ 1 \end{bmatrix} \qquad (2)$$

where S is the scale factor and $(u_0, v_0)$ are the coordinates of the origin O″ of the coordinate system O″xy in the coordinate system O′uv. Assume a world coordinate system O″′$X_wY_wZ_w$ in which P has coordinates $(X_w, Y_w, Z_w)$; the relationship between the two coordinate systems in homogeneous form is:

$$\begin{bmatrix} X_c \\ Y_c \\ Z_c \\ 1 \end{bmatrix} = \begin{bmatrix} R & T \\ 0 & 1 \end{bmatrix} \begin{bmatrix} X_w \\ Y_w \\ Z_w \\ 1 \end{bmatrix} \qquad (3)$$

where R and T denote the rotation matrix and translation vector, respectively. Combining equations (1), (2), and (3) gives:

$$\begin{bmatrix} u \\ v \\ 1 \end{bmatrix} = \frac{1}{\beta} \begin{bmatrix} 1 & 0 & u_0 \\ 0 & 1 & v_0 \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} f & 0 & 0 & 0 \\ 0 & f & 0 & 0 \\ 0 & 0 & 1 & 0 \end{bmatrix} \begin{bmatrix} R & T \\ 0 & 1 \end{bmatrix} \begin{bmatrix} X_w \\ Y_w \\ Z_w \\ 1 \end{bmatrix} \qquad (4)$$

where $\beta = Z_c \cdot S$.

As shown in Fig. 2, O″″$X_rY_rZ_r$ denotes the millimeter-wave radar rectangular coordinate system, with O″″ at the center of the radar surface; the $Y_r$ axis is the centerline of the radar scanning plane, perpendicular to the radar surface and pointing straight ahead; the $X_r$ axis is perpendicular to $Y_r$ and points to the right; the $Z_r$ axis is perpendicular to the plane determined by $X_r$ and $Y_r$ and points upward. O″″ρθ denotes the radar polar coordinate system, whose origin coincides with that of O″″$X_rY_rZ_r$; ρ is the straight-line distance between the target and the radar; θ is the angle by which the target deviates from the centerline of the radar scanning plane. The coordinates of a point P in O″″ρθ and O″″$X_rY_rZ_r$ are (ρ, θ) and $(X_r, Y_r, Z_r)$, respectively. Since the radar scanning plane is a two-dimensional plane in O″″ρθ, $Z_r = 0$, and the trigonometric relationship of P between O″″ρθ and O″″$X_rY_rZ_r$ is expressed as:

$$\begin{bmatrix} X_r \\ Y_r \\ Z_r \end{bmatrix} = \begin{bmatrix} \rho \sin\theta \\ \rho \cos\theta \\ 0 \end{bmatrix} \qquad (5)$$

Taking the radar rectangular coordinate system as the world coordinate system and using it as the intermediate variable, combining equations (4) and (5), the relationship between the image coordinate system O′uv and the radar polar coordinate system O″″ρθ is expressed as:

$$\begin{bmatrix} u \\ v \\ 1 \end{bmatrix} = \frac{1}{\beta} \begin{bmatrix} 1 & 0 & u_0 \\ 0 & 1 & v_0 \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} f & 0 & 0 & 0 \\ 0 & f & 0 & 0 \\ 0 & 0 & 1 & 0 \end{bmatrix} \begin{bmatrix} R & T \\ 0 & 1 \end{bmatrix} \begin{bmatrix} \rho \sin\theta \\ \rho \cos\theta \\ 0 \\ 1 \end{bmatrix} \qquad (6)$$

As shown in Fig. 3, when the camera and the millimeter-wave radar observe the same target, the target points scanned by the radar can be projected into the image captured by the camera through equation (6). However, this requires estimating internal camera parameters such as the scale factor S and the focal length f as well as external parameters such as the rotation matrix R and the translation vector T, and the calculation is relatively complex. To simplify the calculation, an equivalent transformation N is used to represent the relationship between O′uv and O″″ρθ:

$$\begin{bmatrix} u \\ v \\ 1 \end{bmatrix} = N \begin{bmatrix} \rho \sin\theta \\ \rho \cos\theta \\ 1 \end{bmatrix} \qquad (7)$$

where $N = \begin{bmatrix} n_{11} & n_{12} & n_{13} \\ n_{21} & n_{22} & n_{23} \\ n_{31} & n_{32} & n_{33} \end{bmatrix}$ is called the homography transformation matrix.

Step 2: Determine a suitable calibration distance and divide it into reasonable segments.

The calibration target distance ρ measured in the radar polar coordinate system is called the calibration distance. The calibration distance L is divided, from near to far, into a near range L1 and a far range L2; the near range L1 is divided evenly into m1 segments and the far range L2 into m2 segments. Because an unmanned vehicle must detect nearby targets precisely during operation while only a rough judgment is needed for distant targets, the far-range segments may be larger than the near-range segments, i.e., L1/m1 is smaller than L2/m2.

In this embodiment, the maximum specified speed of the unmanned vehicle is 36 km/h. Since the distance region an unmanned vehicle attends to while driving is related to its speed, 50 m is determined to be a suitable calibration distance. Within this 50 m calibration distance, experiments show that using a single homography spatial transformation matrix causes alignment to fail. Therefore, according to the vehicle's speed requirements and the distance regions of interest, the calibration distance of 0–20 m is divided into 4 segments at 5 m intervals, and 20–50 m into 3 segments at 10 m intervals.
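For the embodiment's numbers, a quick sketch of the resulting segment boundaries (variable names are illustrative):

```python
import numpy as np

# Near range 0-20 m in 4 segments of 5 m; far range 20-50 m in 3 segments of 10 m.
near = np.linspace(0.0, 20.0, 5)           # [0, 5, 10, 15, 20]
far = np.linspace(20.0, 50.0, 4)[1:]       # [30, 40, 50]
boundaries = np.concatenate([near, far])   # 7 segments, M = 1, ..., 7
print(boundaries)                          # [ 0.  5. 10. 15. 20. 30. 40. 50.]
```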

Step 3: Collect images and data of the calibration target with the camera and the millimeter-wave radar mounted on the unmanned vehicle.

To ensure the accuracy of the spatial alignment, within each segment five rows of the target are taken in the longitudinal direction, and seven groups of calibration-target images and data are taken per row in the lateral direction. A camera with an IEEE 1394 interface and a millimeter-wave radar with a CAN bus interface mounted on the unmanned vehicle collect the images and data of the calibration target in the same scene and transmit them to the industrial computer.

Step 4: For the image data $f_k^M$ of each small section within each segment obtained by the camera in Step 3, compute the centroid coordinates $(u_k^M, v_k^M)$ of the image.

In general, the clustering method used by the millimeter-wave radar is the nearest-neighbor method; therefore, the calibration target data scanned by the radar in the radar coordinate system is taken to correspond to the coordinates of the target centroid in the image coordinate system. For a digital image f, f(i, j) denotes the gray value and (i, j) a point in the image region. The centroid coordinates of the calibration target in the image are computed by the following steps:

S40: Manually select a candidate region containing the calibration target, to reduce the interference of the background with the centroid computation.

S41: Apply median filtering to the candidate region to remove noise in the image. The median filter is implemented by $f(i,j) = \mathrm{Median}\{f(i-k, j-l)\}$, where $(k,l) \in \omega$ and ω is the 3×3 neighborhood of the pixel.

S42: Apply Sobel edge detection to the candidate region image to obtain an edge image of the candidate region containing the calibration target. The Sobel operator uses the template $\mathrm{mask} = \begin{bmatrix} -1 & -2 & -1 \\ 0 & 0 & 0 \\ 1 & 2 & 1 \end{bmatrix}$; convolving the candidate region image with the mask yields the binary image of the candidate region.

S43: In the pixel image coordinate system, find along the u-axis the pixel coordinates $u_{min}$ and $u_{max}$ of the minimum and maximum u-coordinates in the calibration target edge image, and along the v-axis the pixel coordinates $v_{min}$ and $v_{max}$ of the minimum and maximum v-coordinates. Connect these four points with straight lines, clockwise or counterclockwise, to form a quadrilateral region. Within this quadrilateral region, compute the centroid coordinates $(u_k^M, v_k^M)$ of the calibration target using $u_k^M = \frac{\sum_i \sum_j i\, f_k^M(i,j)}{\sum_i \sum_j f_k^M(i,j)}$ and $v_k^M = \frac{\sum_i \sum_j j\, f_k^M(i,j)}{\sum_i \sum_j f_k^M(i,j)}$, where $f_k^M(i,j)$ denotes the gray value of pixel $(i,j)$ in the quadrilateral region corresponding to the k-th small section of the target at the M-th distance segment.
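A compact sketch of steps S41–S43, assuming OpenCV and numpy are available; the 3×3 mask is the one given in S42, while the binarization threshold and the function name are illustrative assumptions:

```python
import cv2
import numpy as np

SOBEL_MASK = np.array([[-1, -2, -1],
                       [ 0,  0,  0],
                       [ 1,  2,  1]], dtype=np.float32)

def target_centroid(roi: np.ndarray, thresh: float = 50.0) -> tuple[float, float]:
    """Centroid of the calibration target inside a manually selected
    8-bit grayscale ROI (step S40)."""
    filtered = cv2.medianBlur(roi, 3)                           # S41: 3x3 median
    edges = cv2.filter2D(filtered.astype(np.float32), -1, SOBEL_MASK)
    binary = (np.abs(edges) > thresh).astype(np.uint8)          # S42: binarize

    vs, us = np.nonzero(binary)                                 # edge pixel coords
    pts = np.array([[us[np.argmin(us)], vs[np.argmin(us)]],     # u_min point
                    [us[np.argmax(us)], vs[np.argmax(us)]],     # u_max point
                    [us[np.argmin(vs)], vs[np.argmin(vs)]],     # v_min point
                    [us[np.argmax(vs)], vs[np.argmax(vs)]]], dtype=np.int32)
    mask = np.zeros_like(binary)
    cv2.fillConvexPoly(mask, cv2.convexHull(pts), 1)            # S43: quadrilateral

    f = filtered.astype(np.float64) * mask                      # zero outside region
    j, i = np.indices(f.shape)                                  # j: v-axis, i: u-axis
    total = f.sum()
    return float((i * f).sum() / total), float((j * f).sum() / total)
```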

Step 5: Solve the homography spatial transformation matrix expressing the relationship between the millimeter-wave radar coordinate system and the camera coordinate system.

For each segment of the calibration distance L, the radar coordinate data $(X_{rk}^M, Y_{rk}^M)$ and the camera image centroid coordinates $(u_k^M, v_k^M)$ of all small sections form the data set for that segment. Substituting each data set into equations (5) and (7) gives:

$$\begin{bmatrix} u_1^M \\ \vdots \\ u_k^M \end{bmatrix} = \begin{bmatrix} X_{r1}^M & Y_{r1}^M & 1 \\ \vdots & \vdots & \vdots \\ X_{rk}^M & Y_{rk}^M & 1 \end{bmatrix} \begin{bmatrix} n_{11}^M \\ n_{12}^M \\ n_{13}^M \end{bmatrix} \qquad (8)$$

$$\begin{bmatrix} v_1^M \\ \vdots \\ v_k^M \end{bmatrix} = \begin{bmatrix} X_{r1}^M & Y_{r1}^M & 1 \\ \vdots & \vdots & \vdots \\ X_{rk}^M & Y_{rk}^M & 1 \end{bmatrix} \begin{bmatrix} n_{21}^M \\ n_{22}^M \\ n_{23}^M \end{bmatrix} \qquad (9)$$

and
$$\begin{bmatrix} 1 \\ \vdots \\ 1 \end{bmatrix} = \begin{bmatrix} X_{r1}^M & Y_{r1}^M & 1 \\ \vdots & \vdots & \vdots \\ X_{rk}^M & Y_{rk}^M & 1 \end{bmatrix} \begin{bmatrix} n_{31}^M \\ n_{32}^M \\ n_{33}^M \end{bmatrix} \qquad (10)$$

Define $P^M = \begin{bmatrix} X_{r1}^M & Y_{r1}^M & 1 \\ \vdots & \vdots & \vdots \\ X_{rk}^M & Y_{rk}^M & 1 \end{bmatrix}$, $N^M = \begin{bmatrix} n_{11}^M & n_{12}^M & n_{13}^M & n_{21}^M & n_{22}^M & n_{23}^M & n_{31}^M & n_{32}^M & n_{33}^M \end{bmatrix}^T$, $U^M = \begin{bmatrix} u_1^M & \cdots & u_k^M \end{bmatrix}^T$, $V^M = \begin{bmatrix} v_1^M & \cdots & v_k^M \end{bmatrix}^T$, and $I_{k \times 1} = \begin{bmatrix} 1 & \cdots & 1 \end{bmatrix}^T$. The least-squares solution of the homography spatial transformation matrix $N^M$ can then be expressed as $N^M = \begin{bmatrix} N_1^{M\,T} & N_2^{M\,T} & N_3^{M\,T} \end{bmatrix}^T$, where $N_1^M$, $N_2^M$, and $N_3^M$ are, respectively, $N_1^M = (P^{M\,T} P^M)^{-1} P^{M\,T} U^M$, $N_2^M = (P^{M\,T} P^M)^{-1} P^{M\,T} V^M$, and $N_3^M = (P^{M\,T} P^M)^{-1} P^{M\,T} I_{k \times 1}$.
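A minimal sketch of this per-segment least-squares solve, assuming `Xr`, `Yr` hold the radar coordinates and `u`, `v` the image centroids collected for one segment (names are illustrative); `np.linalg.lstsq` stands in for the explicit normal-equation inverse, which is numerically safer but equivalent in exact arithmetic:

```python
import numpy as np

def solve_homography_segment(Xr, Yr, u, v) -> np.ndarray:
    """Least-squares solution of N^M for one calibration segment (eqs. 8-10)."""
    Xr, Yr, u, v = map(np.asarray, (Xr, Yr, u, v))
    P = np.column_stack([Xr, Yr, np.ones_like(Xr)])           # the k x 3 matrix P^M
    N1, *_ = np.linalg.lstsq(P, u, rcond=None)                # (n11, n12, n13), eq. (8)
    N2, *_ = np.linalg.lstsq(P, v, rcond=None)                # (n21, n22, n23), eq. (9)
    N3, *_ = np.linalg.lstsq(P, np.ones_like(u), rcond=None)  # (n31, n32, n33), eq. (10)
    return np.vstack([N1, N2, N3])                            # the 3 x 3 matrix N^M
```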

Step 6: Achieve spatial alignment of the vision sensor and the millimeter-wave radar.

According to the actual distance of the calibration target scanned by the millimeter-wave radar, first determine which segment from Step 2 the distance falls in, and then apply the homography spatial transformation matrix $N^M$ (M = 1, 2, ..., 7) computed for that segment in Step 5, thereby achieving spatial alignment.
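Putting the pieces together, a sketch of the runtime alignment for this embodiment, assuming the seven per-segment matrices have been calibrated as above (all names are illustrative):

```python
import numpy as np

BOUNDARIES = np.array([0., 5., 10., 15., 20., 30., 40., 50.])  # this embodiment

def align(N_per_segment: list, rho: float, theta: float) -> tuple[float, float]:
    """Project a radar return (rho, theta) into the image using the homography
    of the segment that contains its calibration distance."""
    M = int(np.searchsorted(BOUNDARIES, rho, side="right") - 1)
    M = min(max(M, 0), len(N_per_segment) - 1)           # clamp to segments 0..6
    q = N_per_segment[M] @ np.array([rho * np.sin(theta),
                                     rho * np.cos(theta), 1.0])
    return q[0] / q[2], q[1] / q[2]                      # pixel (u, v)
```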

In summary, the above is only a preferred embodiment of the present invention and is not intended to limit its scope of protection. Any modification, equivalent replacement, improvement, etc. made within the spirit and principles of the present invention shall be included within the scope of protection of the present invention.

Claims (3)

1. A segmented spatial alignment method based on a homography transformation matrix, characterized by comprising the following steps:
step 1: establishing a homography transformation matrix-based relation between a camera coordinate system and a millimeter wave radar coordinate system:
defining an image coordinate system O′uv of the camera, wherein O′ is located at the upper left corner of the imaging plane of the camera; the u-axis is parallel to the scan-line direction of the camera; the v-axis is perpendicular to the camera scan-line direction;
defining O″″ρθ as the millimeter-wave radar polar coordinate system, and O″″ as the center of the millimeter-wave radar surface; ρ is the linear distance between the target and the millimeter-wave radar; θ is the angle by which the target deviates from the centerline of the scanning plane of the millimeter-wave radar; the relation between the image coordinate system O′uv of the camera and the millimeter-wave radar polar coordinate system O″″ρθ is expressed as follows:
$$\begin{bmatrix} u \\ v \\ 1 \end{bmatrix} = N \begin{bmatrix} \rho \sin\theta \\ \rho \cos\theta \\ 1 \end{bmatrix} \qquad (7)$$
wherein $N = \begin{bmatrix} n_{11} & n_{12} & n_{13} \\ n_{21} & n_{22} & n_{23} \\ n_{31} & n_{32} & n_{33} \end{bmatrix}$ is defined as the homography transformation matrix;
step 2: determining a proper calibration distance between the unmanned vehicle and a calibration target:
definition O "" XrYrZrThe rectangular coordinate system of the millimeter wave radar is represented, and O' is the center of the surface of the millimeter wave radar; y isrThe axis is the central line of a scanning plane of the millimeter wave radar, is vertical to the surface of the millimeter wave radar and points to the right front; xrAxis and YrVertical, pointing to the right; zrAxis perpendicular to Xr、YrA determined plane pointing upwards;
the relationship between the millimeter wave radar rectangular coordinate system and the millimeter wave radar polar coordinate system is as follows:
$$\begin{bmatrix} X_r \\ Y_r \\ 1 \end{bmatrix} = \begin{bmatrix} \rho \sin\theta \\ \rho \cos\theta \\ 1 \end{bmatrix} \qquad (7')$$
the projection, on the longitudinal axis $Y_r$ of the millimeter-wave radar rectangular coordinate system, of the distance between the calibration target and the unmanned vehicle is called the calibration distance; within the detection range of the millimeter-wave radar, a suitable calibration distance L is determined according to the maximum movement speed of the unmanned vehicle; the calibration distance L is divided from near to far into a short-distance range L1 and a long-distance range L2; L1 is divided evenly into m1 segments and L2 evenly into m2 segments, ensuring that L1/m1 is smaller than L2/m2;
step 3: respectively acquiring images and data information of the calibration target through the camera and the millimeter-wave radar mounted on the unmanned vehicle:
placing the calibration target respectively at the different segments into which the calibration distance L is divided in step 2; detecting the target at each of the m1+m2 segment distances with the millimeter-wave radar and the camera respectively; during detection, for the target at each segment distance, dividing the target evenly into m rows along the $Y_r$ axis and dividing each row evenly into h small sections along the $X_r$ axis; controlling the millimeter-wave radar to acquire the coordinate data $(X_{rk}^M, Y_{rk}^M)$ of each small section and controlling the camera to capture the image data $f_k^M$ of each small section, wherein $M = 1, \ldots, (m1+m2)$ and $k = 1, 2, \ldots, mh$;
step 4: for the image data $f_k^M$ of each small section within each segment obtained by the camera in step 3, respectively calculating the centroid coordinates $(u_k^M, v_k^M)$ of the image;
step 5: solving the homography spatial transformation matrix representing the relation between the millimeter-wave radar coordinate system and the camera coordinate system:
for each segment of the calibration distance L, forming a data set from the millimeter-wave radar coordinate data $(X_{rk}^M, Y_{rk}^M)$ and the image centroid coordinates $(u_k^M, v_k^M)$ of all small sections in that segment, and substituting each data set into equations (7) and (7′) respectively to obtain:
$$\begin{bmatrix} u_1^M \\ \vdots \\ u_k^M \end{bmatrix} = \begin{bmatrix} X_{r1}^M & Y_{r1}^M & 1 \\ \vdots & \vdots & \vdots \\ X_{rk}^M & Y_{rk}^M & 1 \end{bmatrix} \begin{bmatrix} n_{11}^M \\ n_{12}^M \\ n_{13}^M \end{bmatrix} \qquad (8)$$

$$\begin{bmatrix} v_1^M \\ \vdots \\ v_k^M \end{bmatrix} = \begin{bmatrix} X_{r1}^M & Y_{r1}^M & 1 \\ \vdots & \vdots & \vdots \\ X_{rk}^M & Y_{rk}^M & 1 \end{bmatrix} \begin{bmatrix} n_{21}^M \\ n_{22}^M \\ n_{23}^M \end{bmatrix} \qquad (9)$$

and
$$\begin{bmatrix} 1 \\ \vdots \\ 1 \end{bmatrix} = \begin{bmatrix} X_{r1}^M & Y_{r1}^M & 1 \\ \vdots & \vdots & \vdots \\ X_{rk}^M & Y_{rk}^M & 1 \end{bmatrix} \begin{bmatrix} n_{31}^M \\ n_{32}^M \\ n_{33}^M \end{bmatrix} \qquad (10)$$
defining $P^M = \begin{bmatrix} X_{r1}^M & Y_{r1}^M & 1 \\ \vdots & \vdots & \vdots \\ X_{rk}^M & Y_{rk}^M & 1 \end{bmatrix}$, $N^M = \begin{bmatrix} n_{11}^M & n_{12}^M & n_{13}^M & n_{21}^M & n_{22}^M & n_{23}^M & n_{31}^M & n_{32}^M & n_{33}^M \end{bmatrix}^T$, $U^M = \begin{bmatrix} u_1^M & \cdots & u_k^M \end{bmatrix}^T$, $V^M = \begin{bmatrix} v_1^M & \cdots & v_k^M \end{bmatrix}^T$, and $I_{k \times 1} = \begin{bmatrix} 1 & \cdots & 1 \end{bmatrix}^T$; the least-squares solution of the homography spatial transformation matrix $N^M$ is then expressed as $N^M = \begin{bmatrix} N_1^{M\,T} & N_2^{M\,T} & N_3^{M\,T} \end{bmatrix}^T$, wherein $N_1^M$, $N_2^M$ and $N_3^M$ are respectively $N_1^M = (P^{M\,T} P^M)^{-1} P^{M\,T} U^M$, $N_2^M = (P^{M\,T} P^M)^{-1} P^{M\,T} V^M$, and $N_3^M = (P^{M\,T} P^M)^{-1} P^{M\,T} I_{k \times 1}$;
step 6: realizing the spatial alignment of the vision sensor and the millimeter-wave radar:
judging, according to the actual distance of the calibration target scanned by the millimeter-wave radar, which segment of step 2 the distance falls in, and searching the homography spatial transformation matrix corresponding to that distance among the m1+m2 results calculated in step 5, thereby realizing spatial alignment.
2. The method according to claim 1, wherein, when the calibration distance in step 2 is 50 m, the short-distance range is 0–20 m, divided evenly into 4 segments, and the long-distance range is 20–50 m, divided evenly into 3 segments.
3. The method as claimed in claim 1, wherein the method of calculating the centroid coordinates of each small-section image of the calibration target in step 4 is as follows:
S40, manually selecting a candidate area containing the calibration target;
S41, carrying out median filtering on the candidate area image to eliminate the noise in the image;
S42, performing Sobel operator edge detection on the candidate area image to obtain a binarized calibration target edge image;
S43, in the image coordinate system in units of pixels, finding along the u-axis the pixel coordinates $u_{min}$, $u_{max}$ of the minimum and maximum u-coordinates in the calibration target edge image, and along the v-axis the pixel coordinates $v_{min}$, $v_{max}$ of the minimum and maximum v-coordinates; connecting the above 4 points with straight lines in clockwise or counterclockwise order to form a quadrilateral region; and within the quadrilateral region, calculating the centroid coordinates $(u_k^M, v_k^M)$ of the calibration target using the formulas $u_k^M = \frac{\sum_i \sum_j i\, f_k^M(i,j)}{\sum_i \sum_j f_k^M(i,j)}$ and $v_k^M = \frac{\sum_i \sum_j j\, f_k^M(i,j)}{\sum_i \sum_j f_k^M(i,j)}$, wherein $f_k^M(i,j)$ denotes the gray value of pixel $(i,j)$ in the quadrilateral region corresponding to the k-th small section of the target at the M-th distance segment.
CN201310013045.8A 2013-01-14 2013-01-14 Subsection space aligning method based on homography transformational matrix Expired - Fee Related CN103065323B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201310013045.8A CN103065323B (en) 2013-01-14 2013-01-14 Subsection space aligning method based on homography transformational matrix

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201310013045.8A CN103065323B (en) 2013-01-14 2013-01-14 Subsection space aligning method based on homography transformational matrix

Publications (2)

Publication Number Publication Date
CN103065323A CN103065323A (en) 2013-04-24
CN103065323B true CN103065323B (en) 2015-07-15

Family

ID=48107940

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201310013045.8A Expired - Fee Related CN103065323B (en) 2013-01-14 2013-01-14 Subsection space aligning method based on homography transformational matrix

Country Status (1)

Country Link
CN (1) CN103065323B (en)

Families Citing this family (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104280019B (en) * 2013-07-10 2017-10-27 德尔福电子(苏州)有限公司 A kind of viewing system caliberating device based on flexible calibration plate
CN104200483B (en) * 2014-06-16 2018-05-18 南京邮电大学 Object detection method based on human body center line in multi-cam environment
CN104464173A (en) * 2014-12-03 2015-03-25 国网吉林省电力有限公司白城供电公司 Power transmission line external damage protection system based on space image three-dimensional measurement
CN104965202B (en) * 2015-06-18 2017-10-27 奇瑞汽车股份有限公司 Obstacle detection method and device
CN105818763B (en) * 2016-03-09 2018-06-22 睿驰智能汽车(广州)有限公司 A kind of method, apparatus and system of determining vehicle periphery object distance
CN106730106B (en) * 2016-11-25 2019-10-08 哈尔滨工业大学 The coordinate scaling method of the micro-injection system of robot assisted
CN108109173B (en) * 2016-11-25 2022-06-28 宁波舜宇光电信息有限公司 Visual positioning method, camera system and automation equipment
CN108226906B (en) * 2017-11-29 2019-11-26 深圳市易成自动驾驶技术有限公司 A kind of scaling method, device and computer readable storage medium
US11061132B2 (en) * 2018-05-21 2021-07-13 Johnson Controls Technology Company Building radar-camera surveillance system
CN110658518B (en) * 2018-06-29 2022-01-21 杭州海康威视数字技术股份有限公司 Target intrusion detection method and device
CN110660186B (en) * 2018-06-29 2022-03-01 杭州海康威视数字技术股份有限公司 Method and device for identifying target object in video image based on radar signal
CN109471096B (en) * 2018-10-31 2023-06-27 奇瑞汽车股份有限公司 Multi-sensor target matching method and device and automobile
CN111538008B (en) * 2019-01-18 2022-12-23 杭州海康威视数字技术股份有限公司 Transformation matrix determining method, system and device
CN110879598A (en) * 2019-12-11 2020-03-13 北京踏歌智行科技有限公司 Information fusion method and device of multiple sensors for vehicle
CN111429530B (en) * 2020-04-10 2023-06-02 浙江大华技术股份有限公司 Coordinate calibration method and related device
CN112162252B (en) * 2020-09-25 2023-07-18 南昌航空大学 A Data Calibration Method for Millimeter Wave Radar and Visible Light Sensor
CN112348863B (en) * 2020-11-09 2022-06-21 Oppo广东移动通信有限公司 Image alignment method, image alignment device and terminal equipment

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101033972A (en) * 2007-02-06 2007-09-12 华中科技大学 Method for obtaining three-dimensional information of space non-cooperative object
CN101299270A (en) * 2008-05-27 2008-11-05 东南大学 Multiple video cameras synchronous quick calibration method in three-dimensional scanning system
CN102062576A (en) * 2010-11-12 2011-05-18 浙江大学 Device for automatically marking additional external axis robot based on laser tracking measurement and method thereof

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8487993B2 (en) * 2009-07-29 2013-07-16 Ut-Battelle, Llc Estimating vehicle height using homographic projections

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101033972A (en) * 2007-02-06 2007-09-12 华中科技大学 Method for obtaining three-dimensional information of space non-cooperative object
CN101299270A (en) * 2008-05-27 2008-11-05 东南大学 Multiple video cameras synchronous quick calibration method in three-dimensional scanning system
CN102062576A (en) * 2010-11-12 2011-05-18 浙江大学 Device for automatically marking additional external axis robot based on laser tracking measurement and method thereof

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Xianru Liu et al., "Advanced Obstacles Detection and Tracking by Fusing Millimeter Wave Radar and Image Sensor Data," International Conference on Control, Automation and Systems 2010, Oct. 30, 2010, pp. 1115-1120. *

Also Published As

Publication number Publication date
CN103065323A (en) 2013-04-24

Similar Documents

Publication Publication Date Title
CN103065323B (en) Subsection space aligning method based on homography transformational matrix
CN109283538B (en) Marine target size detection method based on vision and laser sensor data fusion
CN109270534B (en) An online calibration method for smart car laser sensor and camera
CN104200086B (en) Wide-baseline visible light camera pose estimation method
CN105445721B (en) Based on laser radar and video camera combined calibrating method with feature protrusion V-type calibration object
CN103487034B (en) Method for measuring distance and height by vehicle-mounted monocular camera based on vertical type target
CN110363158A (en) A Neural Network-Based Collaborative Target Detection and Recognition Method of Millimeter-Wave Radar and Vision
CN108596058A (en) Running disorder object distance measuring method based on computer vision
CN111815717B (en) A Multi-sensor Fusion External Reference Joint Semi-autonomous Calibration Method
CN113985405B (en) Obstacle detection method and obstacle detection device for vehicle
CN114413958A (en) Monocular visual ranging and speed measurement method for unmanned logistics vehicles
CN107389026A (en) A kind of monocular vision distance-finding method based on fixing point projective transformation
CN107133985A (en) A kind of vehicle-mounted vidicon automatic calibration method for the point that disappeared based on lane line
CN110031829A (en) A kind of targeting accuracy distance measuring method based on monocular vision
CN113223075A (en) Ship height measuring system and method based on binocular camera
CN107796373B (en) Distance measurement method based on monocular vision of front vehicle driven by lane plane geometric model
CN115267756A (en) Monocular real-time distance measurement method based on deep learning target detection
CN111694011A (en) Road edge detection method based on data fusion of camera and three-dimensional laser radar
CN113989766A (en) Road edge detection method and road edge detection equipment applied to vehicle
CN113095324A (en) Classification and distance measurement method and system for cone barrel
CN116403186B (en) FPN Swin Transformer and Pointnet ++ based automatic driving three-dimensional target detection method
CN115792912A (en) A method and system for unmanned surface vehicle environment perception based on fusion of vision and millimeter-wave radar under weak observation conditions
CN103204104A (en) Vehicle full-view driving monitoring system and method
CN111788573A (en) Sky Determination in Environmental Detection of Mobile Platforms and Related Systems and Methods
CN113627569A (en) Data fusion method for radar video all-in-one machine used for traffic large scene

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20150715