ZHANG Wen, YANG Yaoxin, HUANG Tianzhi, SUN Zhenguo. ArUco-assisted Autonomous Localization Method for Wall Climbing Robots[J]. ROBOT, 2024, 46(1): 27-35, 44. DOI: 10.13973/j.cnki.robot.230046

ArUco-assisted Autonomous Localization Method for Wall Climbing Robots


    Abstract: To overcome the shortcomings of existing localization techniques for wall-climbing robots in special environments, such as those with sparse texture, relative enclosure, or strong magnetic interference, a novel localization scheme is proposed in which an onboard fisheye camera observes ArUco markers fixed on the ground, and an ArUco-assisted autonomous localization method (A-IEF) fusing an inertial measurement unit (IMU), encoders, and the fisheye camera is implemented on top of this scheme. The method first recognizes the ArUco markers and selects keyframes according to the marker positions in the fisheye images. Then, the reprojection law of the ground-fixed ArUco corners in the fisheye image is studied, and re-localization is optimized under the robot's attitude constraint. Next, within each keyframe interval, the Jacobian matrix of the corner reprojection error with respect to the increments of the robot's position and attitude is derived. A multi-information fusion method based on an error-state extended Kalman filter (ES-EKF) is then designed, which takes the displacement error estimated by the encoders and the reprojection errors of the ArUco corners as observations to correct the robot's heading angle and position. Finally, tests are conducted on large steel components, and the experimental results show that the proposed method achieves higher localization accuracy: the position estimation errors remain within 0.06 m and the heading-angle errors within 3.7°. Compared with the ArUco-rectified and dead-reckoning methods, the proposed method reduces the position error by about 47% and the heading-angle error by about 68%, and it can localize in weak-light environments.
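The pipeline the abstract describes, namely projecting ground-fixed marker corners through a fisheye model and fusing a drifting dead-reckoning estimate with marker-derived corrections in an error-state EKF, can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes an equidistant fisheye model (r = f·θ; the paper's exact camera model is not given in the abstract), a simplified 3-dimensional error state [δx, δy, δθ], and a direct position observation in place of the paper's corner reprojection-error observation. All function and class names are hypothetical.

```python
import numpy as np

def project_equidistant(p_world, R_cw, t_cw, f, c):
    """Project a ground-fixed 3D point into a fisheye image using the
    equidistant model r = f * theta (a common fisheye approximation)."""
    p_cam = R_cw @ p_world + t_cw                    # world -> camera frame
    theta = np.arctan2(np.hypot(p_cam[0], p_cam[1]), p_cam[2])
    phi = np.arctan2(p_cam[1], p_cam[0])
    r = f * theta                                    # equidistant mapping
    return np.array([c[0] + r * np.cos(phi), c[1] + r * np.sin(phi)])

class ErrorStateEKF2D:
    """Toy error-state EKF over [dx, dy, dtheta]: a much-simplified stand-in
    for the paper's IMU/encoder/fisheye fusion."""
    def __init__(self):
        self.x = np.zeros(3)                         # nominal pose [x, y, theta]
        self.P = np.eye(3) * 1e-4                    # error-state covariance

    def predict(self, d, dtheta, Q):
        """Dead-reckoning prediction from encoder displacement d and
        heading increment dtheta; propagate covariance with Jacobian F."""
        th = self.x[2]
        self.x += np.array([d * np.cos(th), d * np.sin(th), dtheta])
        F = np.array([[1.0, 0.0, -d * np.sin(th)],
                      [0.0, 1.0,  d * np.cos(th)],
                      [0.0, 0.0,  1.0]])
        self.P = F @ self.P @ F.T + Q

    def update(self, z, h, H, Rm):
        """Standard ES-EKF correction: estimate the error state from the
        innovation z - h and inject it into the nominal state."""
        y = z - h                                    # innovation
        S = H @ self.P @ H.T + Rm
        K = self.P @ H.T @ np.linalg.inv(S)          # Kalman gain
        self.x += K @ y                              # error injection
        self.P = (np.eye(3) - K @ H) @ self.P
```

In the paper's formulation the observation would instead be the ArUco corner reprojection error, with H the derived Jacobian of that error with respect to the pose increments; the update step above is otherwise the same shape.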

     
