YAN Hongjie, ZHU Zhifeng, CAI Bohua, YAO Yong. 2D Lidar and Camera Data Fusion Calibration[J]. Journal of Anhui University of Technology (Natural Science), xxxx, x(x): x-xx. DOI: 10.12415/j.issn.1671-7872.24039

2D Lidar and Camera Data Fusion Calibration

Abstract: To address the slow speed and poor real-time performance of extrinsic calibration between a single-line lidar and a camera, a data fusion calibration method based on feature points was proposed. The pixel feature points required for camera extrinsic calibration were extracted on a specially designed target, and the required point cloud feature points were obtained by lidar angle calibration. The pixel feature points and the point cloud feature points were jointly calibrated by the feature point method to obtain paired coordinate sequences relating the lidar data to the camera image pixels, and the two coordinate sequences were fused to compute a fusion matrix. The feasibility and effectiveness of the proposed method were verified by calibration experiments. The results show that calibration with the specially designed target is about 31.4% faster than with a traditional target. Combined with the data fusion processing, the calibration parameters of the single-line lidar and camera can be obtained, with a mean reprojection error of 1.86 pixels and a standard deviation of 0.73 pixels; the lidar point cloud projects well onto the target, so the point cloud is successfully matched with the target image. Because the fusion matrix requires only a small number of corresponding lidar points and pixels, the individual calibration parameters need not be computed separately, making the calibration process simpler and faster.
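The abstract does not give the paper's exact formulation of the fusion matrix, but for a single-line lidar all scan points lie in one plane, so a common way to realize this idea is to estimate a 3×3 planar projective matrix mapping lidar plane coordinates directly to image pixels from a set of corresponding feature points, then score it by mean reprojection error. The sketch below (an assumption, not the authors' code; function names `estimate_fusion_matrix` and `reprojection_error` are illustrative) uses the standard direct linear transform (DLT):

```python
import numpy as np

def estimate_fusion_matrix(lidar_pts, pixel_pts):
    """Estimate a 3x3 matrix H mapping 2D lidar-plane points (x, y) to
    image pixels (u, v), up to scale, via the direct linear transform.
    lidar_pts, pixel_pts: (N, 2) arrays of corresponding points, N >= 4."""
    A = []
    for (x, y), (u, v) in zip(lidar_pts, pixel_pts):
        # Each correspondence contributes two linear constraints on the
        # nine entries of H (from u = h1.p / h3.p and v = h2.p / h3.p).
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    # Least-squares null-space solution: the right singular vector
    # associated with the smallest singular value.
    _, _, vt = np.linalg.svd(np.asarray(A, dtype=float))
    return vt[-1].reshape(3, 3)

def reprojection_error(H, lidar_pts, pixel_pts):
    """Mean Euclidean reprojection error, in pixels."""
    pts_h = np.hstack([lidar_pts, np.ones((len(lidar_pts), 1))])
    proj = (H @ pts_h.T).T
    proj = proj[:, :2] / proj[:, 2:3]  # perspective divide
    return float(np.mean(np.linalg.norm(proj - pixel_pts, axis=1)))
```

Once H is known, any lidar point can be projected onto the image with a single matrix multiply, which matches the abstract's claim that the fusion matrix avoids computing the individual calibration parameters; the 1.86-pixel mean error reported above is exactly this reprojection-error statistic evaluated on the experimental correspondences.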

     

