Extrinsic Calibration and Fused Odometry for Monocular Camera and 3D LiDAR
XIAO Junhao1,2, SHI Chenghao1, HUANG Kaihong1, YU Qinghua1
1. College of Intelligence Science and Technology, National University of Defense Technology, Changsha 410073, China; 2. School of Computer Science, University of Lincoln, Lincoln LN6 7TS, UK
Abstract: A two-stage extrinsic calibration method and a hybrid-residual-based odometry approach for monocular camera and 3D LiDAR (light detection and ranging) fusion are presented. The proposed two-stage camera-LiDAR extrinsic calibration method combines a motion-based approach with a mutual-information-based approach. In the first stage, a motion-based extrinsic calibration approach provides a coarse result without requiring an initial guess. In the second stage, taking the first-stage result as the initial value, the reflectivity of the LiDAR points and the grey values of the camera images are registered under the mutual information principle to refine the calibration. To further improve the calibration accuracy, an occlusion detection algorithm for sparse LiDAR point clouds is employed in the second stage. The proposed two-stage method thus guarantees high calibration accuracy with no requirement for an initial guess. Based on this calibration, a hybrid-residual-based LiDAR-camera fused odometry is proposed. The approach exploits both indirect and direct image features to compute reprojection residuals and photometric residuals, respectively, which are then unified in a non-linear optimization framework for odometry estimation. To deal with the missing depth information caused by sparse LiDAR data, a colour-based depth interpolation method is proposed, which effectively increases the number of usable feature points. Finally, experiments on both public and self-collected real-world datasets evaluate the robustness and accuracy of the proposed extrinsic calibration and fused odometry algorithms. The results suggest that the calibration method provides accurate extrinsic parameter estimates without initial values, and that the fused odometry achieves competitive estimation accuracy and robustness.
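The first calibration stage is motion-based. The paper's exact formulation is not reproduced here, but motion-based camera-LiDAR calibration is conventionally posed as a hand-eye problem of the form AX = XB; the minimal sketch below uses assumed notation for illustration. Given the camera motion $A_i$ and the LiDAR motion $B_i$ over the same time interval, the unknown extrinsic transform $X$ satisfies

\[ A_i X = X B_i, \qquad X = \begin{bmatrix} R & \mathbf{t} \\ \mathbf{0}^{\top} & 1 \end{bmatrix}, \]

which splits into a rotation constraint and a translation constraint:

\[ R_{A_i} R = R\,R_{B_i}, \qquad (R_{A_i} - I)\,\mathbf{t} = R\,\mathbf{t}_{B_i} - \mathbf{t}_{A_i}. \]

Because the constraints come only from the two sensors' relative motions, no initial guess of the extrinsics is needed, which matches the abstract's claim for the first stage.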
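The second stage refines this coarse estimate by registering LiDAR reflectivity against image grey values under the mutual information principle. In the standard form of this criterion (a sketch, not necessarily the paper's exact estimator), the extrinsic parameters $\Theta$ are chosen to maximize the mutual information between the two modalities:

\[ \hat{\Theta} = \operatorname*{arg\,max}_{\Theta}\; H(X) + H(Y) - H(X, Y), \]

where $X$ collects the reflectivity values of the LiDAR points, $Y$ the grey values sampled at their image projections under $\Theta$, and the entropies are estimated from marginal and joint histograms. An occluded point pairs a foreground reflectivity with a background grey value and so corrupts the joint histogram; this is why the occlusion detection step for sparse point clouds improves the second-stage accuracy.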
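For the fused odometry, the two residual types can be unified in a single objective. A hedged sketch with assumed symbols:

\[ \hat{T} = \operatorname*{arg\,min}_{T}\; \sum_{i \in \mathcal{F}} \rho\!\left( \big\| \pi(T\mathbf{p}_i) - \mathbf{u}_i \big\|^2 \right) \;+\; \lambda \sum_{j \in \mathcal{D}} \rho\!\left( \big( I_k(\pi(T\mathbf{p}_j)) - I_{k-1}(\mathbf{u}_j) \big)^2 \right), \]

where $T$ is the inter-frame motion, $\pi$ the camera projection, $\mathbf{p}$ the LiDAR-derived 3D points, $\mathcal{F}$ the indirect (keypoint) features contributing reprojection residuals, $\mathcal{D}$ the direct features contributing photometric residuals, $\rho$ a robust kernel, and $\lambda$ a weight balancing the two terms. The exact feature selection, weighting, and kernel choices are assumptions made here for illustration.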
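Finally, the colour-based depth interpolation assigns depth to image features that lack a direct LiDAR return. The abstract does not specify the scheme; the Python sketch below shows one plausible reading, weighting neighbouring projected LiDAR depths by colour similarity as well as pixel distance. All names and parameters here (interpolate_depth, radius, sigma_c) are hypothetical, not the paper's.

import numpy as np

def interpolate_depth(feature_uv, feature_rgb, lidar_uv, lidar_depth,
                      image_rgb, radius=8.0, sigma_c=20.0):
    """Hypothetical colour-guided depth interpolation for one feature pixel.

    feature_uv  : (2,) pixel position of the image feature
    feature_rgb : (3,) colour at the feature pixel
    lidar_uv    : (N, 2) projected pixel positions of the LiDAR points
    lidar_depth : (N,) depths of those points in the camera frame
    image_rgb   : (H, W, 3) image, used to look up colours at LiDAR pixels
    Returns an interpolated depth, or None if no LiDAR point is nearby.
    """
    # Spatial gate: keep only LiDAR points projecting near the feature.
    d_px = np.linalg.norm(lidar_uv - feature_uv, axis=1)
    near = d_px < radius
    if not np.any(near):
        return None
    uv = lidar_uv[near].astype(int)
    colours = image_rgb[uv[:, 1], uv[:, 0]].astype(float)
    # Colour-similarity weights (Gaussian in RGB distance) damp neighbours
    # that likely lie on a different surface than the feature.
    w_c = np.exp(-np.sum((colours - feature_rgb) ** 2, axis=1) / (2 * sigma_c ** 2))
    # Spatial weights favour the closest projections.
    w_s = np.exp(-(d_px[near] ** 2) / (2 * (radius / 2) ** 2))
    w = w_c * w_s
    if w.sum() < 1e-9:
        return None
    return float(np.dot(w, lidar_depth[near]) / w.sum())

Features that receive an interpolated depth in this way can contribute reprojection residuals alongside directly measured points, which is how such a step would "effectively increase the number of feature points" as the abstract states.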