Direct Depth SLAM: Sparse Geometric Feature Enhanced Direct Depth SLAM System for Low-Texture Environments

Cited by: 13
Authors
Zhao, Shibo [1 ]
Fang, Zheng [1 ]
Affiliations
[1] Northeastern Univ, Fac Robot Sci & Engn, Shenyang 110819, Liaoning, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
SLAM; depth vision; sparse geometric features; pose graph; VISUAL ODOMETRY; ROBUST; VERSATILE;
DOI
10.3390/s18103339
CLC Number
O65 [Analytical Chemistry];
Subject Classification Codes
070302; 081704;
Abstract
This paper presents a real-time, robust and low-drift depth-only SLAM (simultaneous localization and mapping) method for depth cameras that utilizes both dense range flow and sparse geometric features from sequential depth images. The proposed method is composed of three optimization layers, namely the Direct Depth layer, the ICP (iterative closest point) Refined layer and the Graph Optimization layer. The Direct Depth layer uses a range flow constraint equation to solve the fast 6-DOF (six degrees of freedom) frame-to-frame pose estimation problem. The ICP Refined layer then reduces local drift by applying a local-map-based motion estimation strategy. After that, we propose a loop closure detection algorithm that extracts and matches sparse geometric features, and we construct a pose graph for global pose optimization. We evaluate the performance of our method on benchmark datasets and real scene data. Experimental results show that our front-end algorithm clearly outperforms classic methods, and that our back-end algorithm robustly finds loop closures and reduces global drift.
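The Direct Depth layer's core idea, as the abstract describes it, is that each pixel of a depth image contributes one linear range flow constraint on the sensor motion, so the frame-to-frame pose can be found by least squares. The sketch below is a deliberately reduced illustration, not the paper's implementation: a 1-D depth scanline under an assumed orthographic camera, with only two motion unknowns (a lateral translation `tx` and a depth translation `tz`) in place of the paper's full 6-DOF twist. All variable names and the synthetic scene are illustrative assumptions.

```python
import numpy as np

# Toy range flow constraint: for a depth scanline Z(x, t), a camera
# translation (tx, tz) satisfies, per pixel,  Z_x * tx - tz = Z_t,
# where Z_x is the spatial depth gradient and Z_t the temporal change.
# (2-DOF orthographic reduction of the 6-DOF problem, for illustration.)

x = np.arange(-50, 51, dtype=float)
Z0 = 0.01 * x**2 + 2.0                         # depth scanline at time t
tx_true, tz_true = 0.5, 0.1                    # ground-truth camera motion
Z1 = 0.01 * (x + tx_true)**2 + 2.0 - tz_true   # depth scanline at time t+1

Zt = Z1 - Z0                           # temporal depth change per pixel
Zx = np.gradient((Z0 + Z1) / 2.0, x)   # spatial gradient of the mean image

# Stack one linear constraint per pixel and solve in the least-squares sense.
A = np.column_stack([Zx, -np.ones_like(x)])
tx_est, tz_est = np.linalg.lstsq(A, Zt, rcond=None)[0]
print(tx_est, tz_est)   # close to (0.5, 0.1)
```

In the actual system the same principle is applied densely over a 2-D depth image with a projective camera model, yielding an overdetermined linear system in the full 6-DOF twist; the ICP Refined and Graph Optimization layers then correct the residual drift of this frame-to-frame estimate.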
Pages: 21