Stereo Visual Odometry Algorithm with Rotation-Translation Decoupling for Dynamic Environments


    Abstract: In practical applications, when an image contains a large number of dynamic features moving in the same direction, those features can severely corrupt the estimates produced by visual odometry. To address this problem, a stereo visual odometry algorithm is proposed that uses the locations of feature points to decouple the estimation of the camera's rotation and translation. The stereo vision system divides the feature points into "far points" and "near points". Within a RANSAC (random sample consensus) framework, the far points are used to estimate the orientation of the visual system; the near points are then used to estimate the camera translation with the orientation held fixed, which realizes the rotation-translation decoupling. In this way, the orientation constraint reduces the impact of nearby moving objects on the visual odometry. Experiments on real road scenes show that the proposed rotation-translation decoupling algorithm eliminates dynamic features more effectively than traditional algorithms that estimate rotation and translation simultaneously. The proposed algorithm is more resistant to interference from dynamic features, and is therefore more robust and more accurate.
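The decoupled pipeline described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: the depth threshold `NEAR_FAR_DEPTH`, the Kabsch alignment of bearing vectors for the rotation step, and the single least-squares steps standing in for the RANSAC loops are all assumptions made here for brevity.

```python
# Sketch of rotation-translation decoupling for stereo visual odometry,
# assuming 3D feature points have already been triangulated from the stereo
# pair and matched across two frames. The RANSAC loops of the paper are
# replaced by single closed-form estimates for clarity.
import numpy as np

NEAR_FAR_DEPTH = 30.0  # metres; hypothetical threshold separating "near"/"far"


def split_far_near(pts):
    """Split an N x 3 point array by depth (z) into far and near subsets."""
    far = pts[pts[:, 2] >= NEAR_FAR_DEPTH]
    near = pts[pts[:, 2] < NEAR_FAR_DEPTH]
    return far, near


def estimate_rotation(far_prev, far_curr):
    """Estimate rotation from far points only.

    For distant points the translation-induced parallax is negligible, so
    their bearing vectors move by (almost) pure rotation; align the unit
    bearing vectors with the Kabsch algorithm.
    """
    a = far_prev / np.linalg.norm(far_prev, axis=1, keepdims=True)
    b = far_curr / np.linalg.norm(far_curr, axis=1, keepdims=True)
    H = a.T @ b                      # cross-covariance of bearing vectors
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    return Vt.T @ np.diag([1.0, 1.0, d]) @ U.T  # R such that b ~= R @ a


def estimate_translation(R, near_prev, near_curr):
    """With the rotation fixed, the translation is the mean residual of
    the rotated previous near points against the current near points."""
    return np.mean(near_curr - (R @ near_prev.T).T, axis=0)
```

Fixing the rotation before the translation step is what lets nearby moving objects be rejected: a near point whose residual disagrees with the rotation-constrained motion model is an outlier to the translation estimate, which in the paper is handled inside RANSAC.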

     
