Abstract:
In practical applications, large numbers of dynamic features moving in the same direction in the image can seriously degrade the estimation results of visual odometry. To address this problem, a stereo visual odometry algorithm is proposed that uses the location of features to decouple the estimation of the camera's rotation and translation. The feature points are divided into "far points" and "near points" by the stereo system. Within a RANSAC (random sample consensus) framework, the far points are first used to estimate the orientation of the visual system, and the near points are then used to estimate the camera's translation given that known orientation; this two-stage estimation is the core of the rotation-translation decoupling algorithm. With this algorithm, the rotation constraint reduces the impact of nearby moving objects on the visual odometry. Experimental results show that the proposed rotation-translation decoupling algorithm eliminates dynamic features on real roads more effectively than traditional algorithms that estimate rotation and translation simultaneously. The proposed algorithm exhibits stronger resistance to interference from dynamic features, making it more robust and accurate.
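
The following is a minimal sketch, not the authors' implementation, of the decoupled rotation-then-translation idea summarized above. It assumes matched, triangulated 3D points from the stereo system (pts_prev, pts_curr), a hypothetical depth threshold FAR_DEPTH separating far from near points, rotation estimated from far-point bearing vectors via a Kabsch/SVD alignment inside a RANSAC loop, and translation estimated from near points with the rotation held fixed; all names, thresholds, and sample sizes are illustrative assumptions.

```python
# Sketch of decoupled rotation/translation estimation (assumed details).
import numpy as np

FAR_DEPTH = 40.0  # metres; assumed depth threshold separating far/near points


def split_far_near(pts_prev, far_depth=FAR_DEPTH):
    """Split matched 3D points by depth (z) in the previous frame."""
    far_mask = pts_prev[:, 2] > far_depth
    return far_mask, ~far_mask


def estimate_rotation_kabsch(dirs_prev, dirs_curr):
    """Rotation aligning unit bearing vectors (Kabsch/SVD): dirs_curr ~ R @ dirs_prev."""
    H = dirs_curr.T @ dirs_prev
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(U @ Vt))])
    return U @ D @ Vt


def estimate_translation(pts_prev, pts_curr, R):
    """Least-squares translation of near points with the rotation fixed."""
    return np.mean(pts_curr - (R @ pts_prev.T).T, axis=0)


def decoupled_pose(pts_prev, pts_curr, iters=200, rot_tol_deg=0.5, trans_tol=0.2):
    """RANSAC-style decoupled estimation: rotation from far points,
    then translation from near points under the known rotation."""
    far, near = split_far_near(pts_prev)
    rng = np.random.default_rng(0)

    # Stage 1: rotation from far points (their bearings are nearly translation-invariant).
    far_prev = pts_prev[far] / np.linalg.norm(pts_prev[far], axis=1, keepdims=True)
    far_curr = pts_curr[far] / np.linalg.norm(pts_curr[far], axis=1, keepdims=True)
    best_R, best_inl = np.eye(3), 0
    for _ in range(iters):
        idx = rng.choice(len(far_prev), size=3, replace=False)
        R = estimate_rotation_kabsch(far_prev[idx], far_curr[idx])
        ang = np.degrees(np.arccos(np.clip(
            np.sum((R @ far_prev.T).T * far_curr, axis=1), -1.0, 1.0)))
        inl = np.count_nonzero(ang < rot_tol_deg)
        if inl > best_inl:
            best_R, best_inl = R, inl

    # Stage 2: translation from near points, rotation held fixed (rotation constraint
    # rejects near features that move inconsistently, e.g. on dynamic objects).
    near_prev, near_curr = pts_prev[near], pts_curr[near]
    best_t, best_inl = np.zeros(3), 0
    for _ in range(iters):
        idx = rng.choice(len(near_prev), size=1)
        t = estimate_translation(near_prev[idx], near_curr[idx], best_R)
        res = np.linalg.norm(near_curr - (best_R @ near_prev.T).T - t, axis=1)
        n_inl = np.count_nonzero(res < trans_tol)
        if n_inl > best_inl:
            best_inl = n_inl
            inliers = res < trans_tol
            best_t = estimate_translation(near_prev[inliers], near_curr[inliers], best_R)
    return best_R, best_t
```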