RGB-D Visual Odometry in Dynamic Environments Using Line Features

Cited by: 0
Authors
Zhang H. [1 ,2 ,3 ]
Fang Z. [2 ,3 ]
Yang G. [2 ,3 ]
Affiliations
[1] University of Chinese Academy of Sciences, Beijing
[2] Ningbo Institute of Material Technology and Engineering, Chinese Academy of Sciences, Ningbo
[3] Zhejiang Key Laboratory of Robotics and Intelligent Manufacturing Equipment Technology, Ningbo
Source
Jiqiren/Robot | 2019, Vol. 41, No. 1
Keywords
Dynamic environment; Line feature; RGB-depth; Simultaneous localization and mapping; Visual odometry
DOI
10.13973/j.cnki.robot.180020
Abstract
Most RGB-D SLAM (simultaneous localization and mapping) methods assume that the environment is static. However, real-world environments often contain dynamic objects, which degrade SLAM performance. To solve this problem, a line-feature-based RGB-D (RGB-depth) visual odometry method is proposed. It computes a static weight for each line feature to filter out dynamic line features, and uses the remaining line features to estimate the camera pose. The proposed method not only reduces the influence of dynamic objects, but also avoids tracking failures caused by a scarcity of point features. Experiments are carried out on a public dataset. Compared with state-of-the-art methods such as the ORB (oriented FAST and rotated BRIEF) based method, the proposed method reduces the tracking error by about 30% and improves the accuracy and robustness of visual odometry in dynamic environments. © 2019, Science Press. All rights reserved.
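The abstract outlines the core pipeline: assign each extracted line feature a static weight, discard the lines flagged as dynamic, and estimate the camera pose from the remaining lines only. The following is a minimal Python sketch of that filtering step; the weighting function (a Cauchy kernel over per-line reprojection residuals), the scale c, the threshold w_min, and all function names are illustrative assumptions, since the paper's actual formulation is not given in this record.

```python
import numpy as np

def static_weights(residuals, c=1.0):
    # Map each line's reprojection residual (its disagreement with the
    # current camera-motion hypothesis across frames) to a weight in (0, 1]:
    # motion-consistent (likely static) lines -> ~1, inconsistent (likely
    # dynamic) lines -> ~0. The Cauchy kernel and scale c are assumptions.
    r = np.asarray(residuals, dtype=float)
    return 1.0 / (1.0 + (r / c) ** 2)

def filter_dynamic_lines(lines, residuals, w_min=0.5):
    # Keep only lines whose static weight clears the (assumed) threshold;
    # the surviving (line, weight) pairs would feed a weighted pose solver.
    w = static_weights(residuals)
    return [(line, wi) for line, wi in zip(lines, w) if wi >= w_min]

# Toy usage: three lines, the last with a large residual (e.g., on a
# moving object); it is filtered out before pose estimation.
lines = ["line_0", "line_1", "line_2"]
residuals = [0.1, 0.3, 5.0]
print(filter_dynamic_lines(lines, residuals))  # drops "line_2"
```

A soft weight rather than a hard inlier/outlier decision is a natural choice here, since borderline lines can still contribute to the pose estimate with reduced influence; the hard threshold above is only the simplest reading of "filter out dynamic line features".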
Pages: 75-82