WANG Zehua, LIANG Dongtai, LIANG Dan, ZHANG Jiacheng, LIU Huajie. A SLAM Method Based on Inertial/Magnetic Sensors and Monocular Vision Fusion[J]. ROBOT, 2018, 40(6): 933-941. DOI: 10.13973/j.cnki.robot.170683

A SLAM Method Based on Inertial/Magnetic Sensors and Monocular Vision Fusion


    Abstract: To solve the problems that the monocular vision SLAM (simultaneous localization and mapping) algorithm lacks scale information and cannot be used when the camera moves too fast, a SLAM method based on IMU (inertial measurement unit)/magnetic sensor and monocular vision fusion is proposed. First, a fuzzy adaptive 9-axis attitude fusion algorithm is proposed to estimate the heading angle of the IMU accurately. Then, the monocular ORB-SLAM2 (oriented FAST and rotated BRIEF SLAM2) algorithm is adopted; its scale factor is estimated from the IMU, and its output pose information is converted to metric scale. Finally, a loosely coupled approach is used: the pose estimated by the IMU and the scale-converted pose from ORB-SLAM2 are fused by a Kalman filter. Experiments on the public EuRoC dataset show that the total position root mean square error (RMSE) of the proposed method is 5.73 cm. For further verification in a real environment, an omnidirectional mobile platform is designed; taking the pose data measured by the onboard lidar as ground truth, the test results show that the rotation angle error of the method is less than 5° and the total position RMSE is 9.76 cm.
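The loosely coupled fusion step described in the abstract can be sketched as a linear Kalman filter in which an IMU-propagated position is corrected by a scale-converted ORB-SLAM2 position measurement. This is a minimal illustrative sketch only: the class name, state dimension, identity measurement model, and noise values are assumptions, not the paper's actual implementation.

```python
import numpy as np

class LooseKalmanFusion:
    """Hypothetical loosely coupled position fusion (not the paper's code).

    Predict with an IMU-integrated position increment; update with a
    monocular SLAM position converted to metric scale.
    """

    def __init__(self, q=1e-3, r=1e-2):
        self.x = np.zeros(3)      # fused position estimate (m)
        self.P = np.eye(3)        # estimate covariance
        self.Q = q * np.eye(3)    # IMU process noise (assumed value)
        self.R = r * np.eye(3)    # visual measurement noise (assumed value)

    def predict(self, imu_delta):
        """Propagate the state with the IMU-integrated position increment."""
        self.x = self.x + imu_delta
        self.P = self.P + self.Q

    def update(self, slam_pos, scale):
        """Correct with the ORB-SLAM2 position after scale conversion."""
        z = scale * slam_pos                         # metric position
        K = self.P @ np.linalg.inv(self.P + self.R)  # Kalman gain (H = I)
        self.x = self.x + K @ (z - self.x)
        self.P = (np.eye(3) - K) @ self.P
```

In a loosely coupled design like this, each estimator (IMU integration, visual SLAM) runs independently and only their pose outputs are fused, which keeps the filter small compared with tightly coupled approaches that fuse raw feature measurements.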
