Binocular Vision SLAM Based on Event Camera
Graphical Abstract
Abstract
Most existing simultaneous localization and mapping (SLAM) methods based on event cameras rely on accumulating event streams over long periods and cannot effectively use the correlation between event streams, resulting in poor mapping and positioning accuracy. To address this problem, a binocular vision SLAM system based on an event camera is explored in this paper. By matching events with the nearest timestamps on the left and right time surfaces, feature depth information is calculated; the depth information is then matched against the time-surface information with the Lucas-Kanade algorithm to compute small pose increments for tracking. Next, the IRLS (iterative reweighted least squares) algorithm is used to accelerate the fusion of depth information based on a Student's t probability model, yielding a semi-dense map. Open-source datasets are used to simulate extreme environments and test the performance of the algorithm, and an error analysis of the mapping and tracking results is carried out. The experimental results show that, compared with state-of-the-art methods, the proposed method produces better maps and reduces pose error on different datasets by 48.3%, demonstrating higher positioning accuracy. Meanwhile, it works robustly in low-light and highly dynamic scenes.
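The depth-fusion step described above can be illustrated with a minimal sketch: IRLS under a Student's t model down-weights observations with large residuals, so outlier depth measurements contribute little to the fused estimate. The function name, the degrees-of-freedom `nu`, the scale `sigma`, and the iteration count are illustrative assumptions, not the paper's actual parameters.

```python
import numpy as np

def student_t_irls_fuse(depths, nu=5.0, sigma=0.1, iters=10):
    """Fuse noisy depth observations of a single map point by IRLS
    under a Student's t residual model (hypothetical parameters).

    Each iteration computes robust weights w_i = (nu + 1) / (nu + (r_i / sigma)^2)
    from the current residuals r_i, then re-estimates the depth as the
    weighted mean. Outliers receive near-zero weight.
    """
    depths = np.asarray(depths, dtype=float)
    mu = np.median(depths)  # robust initialization
    for _ in range(iters):
        r = depths - mu
        w = (nu + 1.0) / (nu + (r / sigma) ** 2)  # Student's t IRLS weights
        mu = np.sum(w * depths) / np.sum(w)
    return mu
```

With a gross outlier among consistent observations, e.g. `student_t_irls_fuse([1.0, 1.02, 0.98, 1.01, 5.0])`, the fused depth stays near 1.0 because the outlier's weight collapses, whereas a plain mean would be pulled to about 1.8.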