Abstract: To meet the high-precision positioning requirements of mobile robots in satellite-denied environments, a positioning method based on the error-state extended Kalman filter (ES-EKF) is proposed, which loosely couples the laser positioning, visual positioning, and global velocity measurement subsystems, and an integrated positioning system with low error drift is designed. Firstly, the error of the system state is represented in minimal form using vector addition and matrix multiplication, and a Kalman filter model in error form is established, in which the optimal estimate of the error state is used to compensate the estimated value of the system state. Secondly, to handle the unknown pose uncertainty of the laser and visual positioning subsystems, their pose outputs are converted into pose increments according to the timestamps, and a pose-increment observation model is established. Thirdly, a global velocity measurement subsystem is constructed from the attitude and heading reference system (AHRS) and the forward kinematics model, and a global velocity observation model is established to compensate for the lack of a global velocity constraint in the integrated positioning system. Finally, tests are carried out in street and field scenes, and the results show that the relative positioning error of the proposed algorithm is less than 0.4%, about 40% lower than that of the EKF and ES-EKF positioning algorithms with only a local velocity constraint. The experimental results demonstrate that the proposed method effectively improves the accuracy of the positioning system.
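The scheme summarized above can be illustrated with a minimal planar sketch. Everything below is a hypothetical simplification, not the paper's implementation: the class name PlanarESEKF, the 2D state layout [px, py, yaw], and all noise values are assumptions, the pose increment is differenced in the world frame with an identity measurement Jacobian for brevity, and the actual method operates on full 3D poses.

```python
# Minimal planar ES-EKF sketch of the loosely coupled scheme (hypothetical).
import numpy as np

def wrap(a):
    """Wrap an angle to (-pi, pi]."""
    return (a + np.pi) % (2.0 * np.pi) - np.pi

class PlanarESEKF:
    """Hypothetical planar ES-EKF: a nominal state [px, py, yaw] plus a
    3-dimensional error state whose covariance P is what the filter tracks."""

    def __init__(self):
        self.x = np.zeros(3)                   # nominal state [px, py, yaw]
        self.P = np.eye(3) * 1e-4              # error-state covariance
        self.Q = np.diag([1e-3, 1e-3, 1e-4])   # process noise (assumed values)

    def predict(self, v_body, omega, dt):
        """Propagate the nominal state with body velocity and yaw rate; the
        error state stays zero-mean, so only its covariance is propagated."""
        th = self.x[2]
        self.x += dt * np.array([v_body * np.cos(th), v_body * np.sin(th), omega])
        self.x[2] = wrap(self.x[2])
        F = np.eye(3)                          # error-state transition Jacobian
        F[0, 2] = -dt * v_body * np.sin(th)
        F[1, 2] = dt * v_body * np.cos(th)
        self.P = F @ self.P @ F.T + self.Q * dt

    def _correct(self, r, H, R):
        """Error-state correction: estimate the error from the residual,
        inject it into the nominal state, then reset the error to zero."""
        S = H @ self.P @ H.T + R
        K = self.P @ H.T @ np.linalg.inv(S)
        dx = K @ r                             # optimal error-state estimate
        self.x[:2] += dx[:2]                   # compensate the nominal state
        self.x[2] = wrap(self.x[2] + dx[2])
        self.P = (np.eye(3) - K @ H) @ self.P  # error mean is reset to zero

    def update_pose_increment(self, dpose_meas, x_prev, R_inc):
        """Pose-increment observation: the increment reported by the lidar or
        visual odometry between two timestamps is compared with the increment
        of the nominal state over the same interval (world frame, H ~ I for
        brevity), so the subsystem's unknown absolute uncertainty drops out."""
        r = dpose_meas - (self.x - x_prev)
        r[2] = wrap(r[2])
        self._correct(r, np.eye(3), R_inc)

    def update_global_velocity(self, v_world_meas, v_body, R_vel):
        """Global-velocity observation: the heading (here the yaw estimate,
        standing in for the AHRS attitude) rotates the forward-kinematics
        body velocity into the world frame."""
        th = self.x[2]
        v_pred = np.array([v_body * np.cos(th), v_body * np.sin(th)])
        H = np.array([[0.0, 0.0, -v_body * np.sin(th)],
                      [0.0, 0.0,  v_body * np.cos(th)]])
        self._correct(v_world_meas - v_pred, H, R_vel)
```

Under these assumptions, a caller would run predict() at the IMU/encoder rate, snapshot the nominal state at each lidar or camera timestamp, and invoke the two update methods as the corresponding pose-increment and global-velocity measurements arrive.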