Accurate RGB-D SLAM in dynamic environments based on dynamic visual feature removal

Cited by: 0
Authors
Chenxin Liu
Jiahu Qin
Shuai Wang
Lei Yu
Yaonan Wang
Institutions
[1] University of Science and Technology of China, Department of Automation
[2] Hefei Comprehensive National Science Center, Institute of Artificial Intelligence
[3] Hunan University, College of Electrical and Information Engineering
[4] Hunan University, National Engineering Research Center of Robot Visual Perception and Control Technology
Keywords
SLAM; dynamic environments; indoor localization; graph-cut; robot navigation;
DOI
Not available
Abstract
Visual localization is considered an essential capability in robotics and has attracted increasing interest over the past few years. However, most visual localization systems assume that the surrounding environment is static, an assumption that is difficult to maintain in real-world scenarios due to the presence of moving objects. In this paper, we present DFR-SLAM, a real-time, accurate RGB-D SLAM system based on ORB-SLAM2 that achieves satisfactory performance in a variety of challenging dynamic scenarios. At the core of our system lie a motion consensus filtering algorithm that estimates the initial camera pose and a graph-cut optimization framework that combines long-term observations, prior information, and spatial coherence to jointly distinguish dynamic from static visual features. Whereas other systems for dynamic environments detect dynamic components using information from frames spanning a short time, our system draws on observations accumulated over a long sequence of keyframes. We evaluate our system on dynamic sequences from the public TUM dataset; the evaluation demonstrates that the proposed system significantly outperforms the original ORB-SLAM2. In addition, compared to closely related SLAM systems designed for dynamic environments, our system provides competitive localization accuracy with satisfactory real-time performance.
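The graph-cut step described in the abstract lends itself to a short illustration. Below is a minimal sketch, not the authors' implementation, of how per-feature static/dynamic labeling can be posed as an s-t min-cut: unary capacities encode long-term observation statistics and prior information, while pairwise capacities enforce spatial coherence between nearby features. The function name `label_features_graphcut` and all cost definitions are hypothetical assumptions; the cut itself is computed with networkx's `minimum_cut`.

```python
# Minimal sketch of static/dynamic feature labeling via s-t min-cut, in the
# spirit of the paper's graph-cut optimization. Every name and cost term
# below is an illustrative assumption, not the authors' implementation.
import networkx as nx
import numpy as np

def label_features_graphcut(p_static, prior_static, pairs, pair_w, lam=1.0):
    """Return a boolean array: True = static, False = dynamic.

    p_static     : per-feature probability of being static, assumed to be
                   accumulated from long-term keyframe observations.
    prior_static : per-feature prior probability of being static
                   (e.g., from semantic or class priors).
    pairs        : (i, j) index pairs of spatially nearby features.
    pair_w       : smoothness weight per pair (spatial coherence).
    lam          : global trade-off between unary and pairwise terms.
    """
    eps = 1e-6
    p = np.asarray(p_static, dtype=float)
    q = np.asarray(prior_static, dtype=float)
    n = len(p)

    # Unary costs as negative log-likelihoods of each label.
    cost_static = -np.log(np.clip(p * q, eps, 1.0))
    cost_dynamic = -np.log(np.clip((1.0 - p) * (1.0 - q), eps, 1.0))

    G = nx.DiGraph()
    for i in range(n):
        # Cutting S->i places i on the sink side ("dynamic"), paying cost_dynamic.
        G.add_edge("S", i, capacity=float(cost_dynamic[i]))
        # Cutting i->T keeps i on the source side ("static"), paying cost_static.
        G.add_edge(i, "T", capacity=float(cost_static[i]))
    for (i, j), w in zip(pairs, pair_w):
        # Pairwise capacities penalize nearby features taking different labels.
        G.add_edge(i, j, capacity=lam * float(w))
        G.add_edge(j, i, capacity=lam * float(w))

    _, (source_side, _) = nx.minimum_cut(G, "S", "T")
    return np.array([i in source_side for i in range(n)])
```

Features labeled dynamic would then be excluded from tracking and bundle adjustment. Under this reading, the motion consensus filter that seeds the initial pose could also supply `p_static`, e.g., as the fraction of recent keyframes in which a feature's reprojection agreed with the consensus camera motion; that coupling is our assumption rather than a detail stated in the abstract.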
Related papers (50 in total; entries [41]-[50] shown below)
  • [41] RGB-D Visual SLAM Based on Yolov4-Tiny in Indoor Dynamic Environment
    Chang, Zhanyuan
    Wu, Honglin
    Sun, Yunlong
    Li, Chuanjiang
    MICROMACHINES, 2022, 13 (02)
  • [42] DRSO-SLAM: A Dynamic RGB-D SLAM Algorithm for Indoor Dynamic Scenes
    Yu, Naigong
    Gan, Mengzhe
    Yu, Hejie
    Yang, Kang
    PROCEEDINGS OF THE 33RD CHINESE CONTROL AND DECISION CONFERENCE (CCDC 2021), 2021, : 1052 - 1058
  • [43] A Compatible Framework for RGB-D SLAM in Dynamic Scenes
    Zhao, Lili
    Liu, Zhili
    Chen, Jianwen
    Cai, Weitong
    Wang, Wenyi
    Zeng, Liaoyuan
    IEEE ACCESS, 2019, 7 : 75604 - 75614
  • [44] DIG-SLAM: an accurate RGB-D SLAM based on instance segmentation and geometric clustering for dynamic indoor scenes
    Liang, Rongguang
    Yuan, Jie
    Kuang, Benfa
    Liu, Qiang
    Guo, Zhenyu
    MEASUREMENT SCIENCE AND TECHNOLOGY, 2024, 35 (01)
  • [45] Dynamic Objects Recognizing and Masking for RGB-D SLAM
    Li, Xiangcheng
    Wu, Huaiyu
    Chen, Zhihuan
    2021 4TH INTERNATIONAL CONFERENCE ON INTELLIGENT AUTONOMOUS SYSTEMS (ICOIAS 2021), 2021, : 169 - 174
  • [46] Dynamic RGB-D SLAM Based on Static Probability and Observation Number
    Liu, Yu
    Wu, Yilin
    Pan, Wenzhao
IEEE TRANSACTIONS ON INSTRUMENTATION AND MEASUREMENT, 2021, 70
  • [47] RGB-D Visual Odometry in Dynamic Environments Using Line Features
    Zhang H.
    Fang Z.
    Yang G.
Jiqiren/Robot, 2019, 41 (01): 75 - 82
  • [48] FlowFusion: Dynamic Dense RGB-D SLAM Based on Optical Flow
    Zhang, Tianwei
    Zhang, Huayan
    Li, Yang
    Nakamura, Yoshihiko
    Zhang, Lei
    2020 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION (ICRA), 2020, : 7322 - 7328
  • [49] RGB-D SLAM Method Based on Enhanced Segmentation in Dynamic Environment
    Wang H.
    Lu D.
    Fang B.
Jiqiren/Robot, 2022, 44 (04): 418 - 430
  • [50] MSSD-SLAM: Multifeature Semantic RGB-D Inertial SLAM With Structural Regularity for Dynamic Environments
    Wang, Yanan
    Tian, Yaobin
    Chen, Jiawei
    Chen, Cheng
    Xu, Kun
    Ding, Xilun
    IEEE TRANSACTIONS ON INSTRUMENTATION AND MEASUREMENT, 2025, 74