ZHU Qidan1, LI Ke1,2, LEI Yanmin1,3, MENG Xiangjie1
1. College of Automation, Harbin Engineering University, Harbin 150001, China; 2. Research Institute 8358 of 3rd Institute of China Aerospace Science & Industry Corp, Tianjin 300000, China; 3. School of Electronic and Information Engineering, Changchun University, Changchun 130022, China
Abstract: A method of robot homing based on panoramic vision is proposed. Omni-directional images of the home position are collected with a panoramic vision device. Feature points in the omni-directional image are then extracted with the SURF (Speeded-Up Robust Features) algorithm and used as natural landmarks. During homing, the panoramic images of the home and current positions are matched with SURF, which determines the correspondences of the natural landmarks between the current and home images. A method for estimating the robot's rotation angle is proposed, with which false landmark matches are eliminated. From the changes in the landmarks' included angles between the home and current positions, the homing direction is computed to steer the robot home. When the included angle between landmarks reaches a threshold value, the robot has returned to the original position and orientation and the homing mission is complete. Experimental results show a position error of less than 0.05 m and an azimuth error of less than 3°. The method uses neither artificial landmarks nor angle sensors, and can be applied to service robots, robot soccer, and similar tasks.
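The geometric core of the abstract — estimating the rotation angle from matched landmark bearings, rejecting false matches that disagree with it, and deriving a homing direction from bearing changes — can be sketched in a few lines. This is a minimal illustration, not the paper's exact algorithm: the function names are invented, the landmark bearings are assumed to have been obtained already from SURF matching on the panoramas, and the homing step uses an average-landmark-vector style combination rather than the paper's specific included-angle formulation.

```python
import math

def wrap(a):
    """Wrap an angle to (-pi, pi]."""
    return (a + math.pi) % (2.0 * math.pi) - math.pi

def estimate_rotation(home_bearings, cur_bearings):
    """Estimate the robot's rotation as the circular mean of the
    bearing differences of matched landmarks (hypothetical estimator)."""
    diffs = [wrap(c - h) for h, c in zip(home_bearings, cur_bearings)]
    s = sum(math.sin(d) for d in diffs)
    c = sum(math.cos(d) for d in diffs)
    return math.atan2(s, c)

def reject_outliers(home_bearings, cur_bearings, rot, tol=0.2):
    """Keep only matches whose bearing difference agrees with the
    estimated rotation within tol radians; the rest are treated
    as false SURF matches and discarded."""
    return [(h, c) for h, c in zip(home_bearings, cur_bearings)
            if abs(wrap(c - h - rot)) <= tol]

def homing_vector(pairs, rot):
    """Homing direction as the summed difference between unit bearing
    vectors at the home pose and at the derotated current pose
    (average-landmark-vector style; an assumption, not the paper's rule)."""
    x = sum(math.cos(h) - math.cos(c - rot) for h, c in pairs)
    y = sum(math.sin(h) - math.sin(c - rot) for h, c in pairs)
    return x, y
```

At the home position the derotated current bearings coincide with the stored ones, so the homing vector shrinks toward zero — the same stopping criterion the abstract expresses as the included angles reaching a threshold.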