To account for the spatial distribution of image features, a local feature extraction algorithm is proposed. Local features that are isotropically distributed in an unstructured environment are extracted as natural landmarks, enabling a behavior-based robot to achieve high-precision visual homing. Building on the SIFT (scale-invariant feature transform) algorithm, the UD-SIFT (uniform-distribution SIFT) algorithm improves the uniformity of the feature distribution, and a novel criterion for evaluating this uniformity is proposed. Visual homing experiments are carried out indoors, in a corridor, and outdoors using the ADV (average displacement vector) and ALV (average landmark vector) methods, both of which are based on panoramic vision. Compared with the original SIFT, UD-SIFT reduces the average angular homing error by more than 25.01%. The results show that the algorithm effectively improves the feature distribution and the robot's homing precision.
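To make the two ingredients of the abstract concrete, the sketch below illustrates (a) the standard ALV homing rule, in which the home vector is the difference between the average landmark vector at the current position and at the snapshot position (assuming a common compass frame), and (b) a generic grid-entropy uniformity score for a set of keypoints. The uniformity measure here is an illustrative assumption, not the paper's actual criterion, and all function names are hypothetical.

```python
import numpy as np

def average_landmark_vector(bearings):
    """Mean of the unit vectors pointing toward each landmark bearing (radians)."""
    v = np.stack([np.cos(bearings), np.sin(bearings)], axis=1)
    return v.mean(axis=0)

def alv_home_vector(current_bearings, snapshot_bearings):
    """ALV homing rule: home vector = ALV(current) - ALV(snapshot).
    Both bearing sets are assumed to be expressed in a common compass frame."""
    return (average_landmark_vector(current_bearings)
            - average_landmark_vector(snapshot_bearings))

def grid_uniformity(points, img_w, img_h, n=4):
    """Illustrative uniformity score (NOT the paper's criterion): Shannon
    entropy of keypoint counts over an n x n grid, normalized to [0, 1],
    where 1 means the keypoints are spread evenly over all cells."""
    xs = np.clip((points[:, 0] / img_w * n).astype(int), 0, n - 1)
    ys = np.clip((points[:, 1] / img_h * n).astype(int), 0, n - 1)
    counts = np.bincount(ys * n + xs, minlength=n * n).astype(float)
    p = counts / counts.sum()
    p = p[p > 0]  # empty cells contribute zero entropy
    return float(-(p * np.log(p)).sum() / np.log(n * n))
```

With a single landmark seen at bearing 0 at the current position and at bearing pi in the snapshot, the home vector is (2, 0); keypoints placed one per grid cell score 1.0, while keypoints collapsed into one cell score 0.0.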