A Fusion Estimation Method for Mobile Robot Self-Motion Based on a Pseudo-Omnidirectional Vision System

• Abstract: Single-camera motion estimation for mobile robots based on the flat-ground assumption suffers from poor robustness, weak environmental adaptability, and low accuracy. To address these problems, this paper first introduces the composition of a pseudo-omnidirectional vision system and, exploiting the characteristics of that system, presents a calibration method for the ground-parallel pose parameters of each camera based on a two-step motion of the robot. On this basis, a fusion estimation method for autonomous mobile robot self-motion using the pseudo-omnidirectional vision system is proposed. Drawing on the robot's nonholonomic motion constraints, the flat-ground assumption, and a consistency measure among the motion estimation parameters, the method comprehensively evaluates the performance of each camera's visual estimate in the pseudo-omnidirectional vision system; based on this evaluation, the motion estimation parameters with higher reliability and stronger robustness are finally determined by fusion. Experimental results validate the effectiveness of the proposed method in terms of robustness, accuracy, and real-time performance.
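The abstract describes fusing several per-camera motion estimates after weighting each one by how well it satisfies the nonholonomic constraints, the flat-ground assumption, and a cross-camera consistency measure. The sketch below is a minimal illustration of that idea under stated assumptions, not the paper's actual algorithm: each camera of the rig is assumed to return a planar motion increment (dx, dy, dθ) in the robot frame, consistency is modeled as distance to the cross-camera median, and the nonholonomic constraint of a differential-drive platform is modeled as a penalty on lateral motion. The function name fuse_ego_motion, the Gaussian kernel, and all thresholds are hypothetical.

```python
import numpy as np

def fuse_ego_motion(estimates, sigma=0.05, eps=0.02):
    """Fuse per-camera planar ego-motion estimates (dx, dy, dtheta).

    estimates : (N, 3) array, one row per camera of the
                pseudo-omnidirectional rig, expressed in the robot frame.
    Returns the fused (dx, dy, dtheta) and the per-camera weights.
    """
    est = np.asarray(estimates, dtype=float)

    # Consistency measure (assumed form): distance of each estimate to the
    # median over all cameras, mapped to a weight in (0, 1] by a Gaussian kernel.
    median = np.median(est, axis=0)
    dist = np.linalg.norm(est - median, axis=1)
    w_consist = np.exp(-(dist / sigma) ** 2)

    # Nonholonomic constraint (assumed differential-drive robot): lateral slip
    # along the wheel axle should be near zero, so estimates with a large |dy|
    # relative to |dx| are down-weighted.
    lateral_ratio = np.abs(est[:, 1]) / (np.abs(est[:, 0]) + eps)
    w_nonholo = 1.0 / (1.0 + lateral_ratio)

    weights = w_consist * w_nonholo
    weights /= weights.sum()

    fused = weights @ est
    return fused, weights


# Example: four cameras, one of which (index 2) gives an inconsistent estimate,
# e.g. because it tracked a moving obstacle instead of the static ground.
cams = [(0.10, 0.00, 0.05),
        (0.11, 0.01, 0.04),
        (0.30, 0.20, -0.10),
        (0.09, -0.01, 0.05)]
motion, w = fuse_ego_motion(cams)
print("fused motion:", motion, "weights:", w)
```

In this sketch the outlier camera receives a small weight through both factors, so the fused estimate stays close to the three mutually consistent cameras, which mirrors the robustness argument made in the abstract.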

     
