Abstract: An improved loop closure detection algorithm constrained by spatial position uncertainty is proposed for the loop closure detection problem in visual simultaneous localization and mapping (VSLAM) of mobile robots in perceptual aliasing scenes. First, a new distance function is introduced into the ICP (iterative closest point) algorithm to overcome the shortcomings of the Euclidean and Mahalanobis distances in point cloud registration. Then, a cumulative error model of visual odometry is established based on the spatial position uncertainty of feature points, and the error is reduced with a Kalman filter. Next, a spatial range constraint for loop closure detection is derived from the cumulative error model of visual odometry. Finally, the cumulative error is corrected according to the loop closure detection results, which in turn narrows the search range of subsequent loop closure detection. On the one hand, the proposed algorithm achieves better real-time performance because the search range of loop closure detection is limited; on the other hand, its precision is improved because most perceptual aliasing scenes are excluded by the spatial constraint. Comparative experiments on public datasets and in a real scene show that, in perceptual aliasing scenes, the proposed algorithm attains higher precision at high recall than IAB-MAP, FAB-MAP and RTAB-MAP, while maintaining good real-time performance. In a complicated indoor scene, it also achieves good real-time performance and high accuracy.
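The spatial range constraint described above can be illustrated with a short Python sketch. This is only an assumed, minimal formulation for illustration, not the paper's exact method: previously stored keyframes are accepted as loop-closure candidates only if their positions lie inside the uncertainty region implied by the accumulated odometry covariance. The Keyframe structure, the 3-DoF position model, and the chi-square gate value are all assumptions introduced here.

import numpy as np
from dataclasses import dataclass

@dataclass
class Keyframe:
    frame_id: int
    position: np.ndarray      # estimated 3D position when the keyframe was stored

def spatially_plausible(current_pos, current_cov, keyframe, chi2_gate=7.815):
    # Gate on the squared Mahalanobis distance between the keyframe position
    # and the current pose estimate, using the accumulated visual-odometry
    # position covariance; 7.815 is the 95% chi-square threshold for 3 DoF.
    d = keyframe.position - current_pos
    m2 = d @ np.linalg.inv(current_cov) @ d
    return m2 <= chi2_gate

def candidate_loop_closures(current_pos, current_cov, keyframes):
    # Keep only keyframes satisfying the spatial constraint; appearance-based
    # matching would then run on this reduced candidate set.
    return [kf for kf in keyframes
            if spatially_plausible(current_pos, current_cov, kf)]

Restricting appearance matching to this reduced candidate set is where the real-time and precision gains described in the abstract come from: spatially implausible, perceptually aliased keyframes never reach the matching stage.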
[1] Angeli A, Doncieux S, Meyer J-A, et al. Real-time visual loop-closure detection[C]//IEEE International Conference on Robotics and Automation. Piscataway, USA: IEEE, 2008: 1842-1847.
[2] Audras C, Comport A, Meilland M, et al. Real-time dense RGB-D localisation and mapping[C]//Australian Conference on Robotics and Automation. 2011.
[3] Baeza-Yates R, Ribeiro-Neto B. Modern information retrieval [M]. New York, USA: ACM Press, 1999.
[4] Cummins M, Newman P. Probabilistic appearance based navigation and loop closing[C]//IEEE International Conference on Robotics and Automation. Piscataway, USA: IEEE, 2007: 2042-2048.
[5] Cummins M, Newman P. Highly scalable appearance-only SLAM-FAB-MAP 2.0[C]//Robotics: Science and Systems. 2009. DOI: 10.15607/RSS.2009.V.039.
[6] Angeli A, Filliat D, Doncieux S, et al. Fast and incremental method for loop-closure detection using bags of visual words[J]. IEEE Transactions on Robotics, 2008, 24(5): 1027-1037.
[7] Labbe M, Michaud F. Memory management for real-time appearance-based loop closure detection[C]//IEEE/RSJ International Conference on Intelligent Robots and Systems. Piscataway, USA: IEEE, 2011: 1271-1276.
[8] Labbe M, Michaud F. Appearance-based loop closure detection for online large-scale and long-term operation[J]. IEEE Transactions on Robotics, 2013, 29(3): 734-745.
[9] Liang M J, Min H Q, Luo R H. Graph-based SLAM: A survey[J]. Robot, 2013, 35(4): 500-512.
[10] Dryanovski I, Valenti R G, Xiao J. Fast visual odometry and mapping from RGB-D data[C]//IEEE International Conference on Robotics and Automation. Piscataway, USA: IEEE, 2013: 2305-2310.
[11] Bay H, Ess A, Tuytelaars T, et al. Speeded-up robust features (SURF)[J]. Computer Vision and Image Understanding, 2008, 110(3): 346-359.
[12] Baidu Baike. Mahalanobis distance[DB/OL]. [2015-12-27]. http://wapbaike.baidu.com/view/1236162.htm.
[13] Estrada C, Neira J, Tardos J D. Hierarchical SLAM: Real-time accurate mapping of large environments[J]. IEEE Transactions on Robotics, 2005, 21(4): 588-596.
[14] Grisetti G, Stachniss C, Burgard W. Improved techniques for grid mapping with Rao-Blackwellized particle filters[J]. IEEE Transactions on Robotics, 2007, 23(1): 34-46.
[15] Li B, Yang D, Deng L. Visual vocabulary tree with pyramid TF-IDF scoring match scheme for loop closure detection[J]. Acta Automatica Sinica, 2011, 37(6): 665-673.
[16] Nister D. An efficient solution to the five-point relative pose problem[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2004, 26(6): 756-770.
[17] Sturm J, Engelhard N, Endres F, et al. A benchmark for the evaluation of RGB-D SLAM systems[C]//IEEE/RSJ International Conference on Intelligent Robots and Systems. Piscataway, USA: IEEE, 2012: 573-580.