RoDyn-SLAM: Robust Dynamic Dense RGB-D SLAM With Neural Radiance Fields

Cited by: 2
Authors
Jiang, Haochen [1 ]
Xu, Yueming [2 ]
Li, Kejie [3 ]
Feng, Jianfeng [2 ]
Zhang, Li [1 ]
Affiliations
[1] Fudan Univ, Sch Data Sci, Shanghai 200433, Peoples R China
[2] Fudan Univ, Inst Sci & Technol Brain Inspired Intelligence, Shanghai 200433, Peoples R China
[3] ByteDance, Seattle, WA USA
Funding
Natural Science Foundation of Shanghai; National Natural Science Foundation of China; National Key Research and Development Program of China;
Keywords
Simultaneous localization and mapping; Dynamics; Pose estimation; Cameras; Robustness; Optimization; Geometry; Deep learning methods; dynamic scene; NeRF; pose estimation; RGB-D SLAM; TRACKING;
DOI
10.1109/LRA.2024.3427554
CLC Number
TP24 [Robotics];
Subject Classification
080202 ; 1405 ;
Abstract
Leveraging neural implicit representations for dense RGB-D SLAM has been studied in recent years. However, this approach relies on a static-environment assumption and does not work robustly in dynamic environments due to inconsistent observations of geometry and photometry. To address the challenges posed by dynamic environments, we propose a novel dynamic SLAM framework with neural radiance fields. Specifically, we introduce a motion mask generation method to filter out invalid sampled rays. This design effectively fuses the optical flow mask and the semantic mask to enhance the precision of the motion mask. To further improve the accuracy of pose estimation, we design a divide-and-conquer pose optimization algorithm that distinguishes between keyframes and non-keyframes. The proposed edge warp loss effectively enhances the geometric constraints between adjacent frames. Extensive experiments are conducted on two challenging datasets, and the results show that RoDyn-SLAM achieves state-of-the-art performance among recent neural RGB-D methods in both accuracy and robustness. Our implementation of RoDyn-SLAM will be open-sourced to benefit the community.
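The abstract describes fusing an optical-flow-based motion mask with a semantic mask, then using the fused mask to filter out invalid sampled rays before optimization. A minimal sketch of that idea follows; the function names, the simple union rule for fusion, and the boolean-mask representation are illustrative assumptions, not details taken from the paper:

```python
import numpy as np

def fuse_motion_mask(flow_mask: np.ndarray, semantic_mask: np.ndarray) -> np.ndarray:
    """Fuse an optical-flow motion mask with a semantic mask.

    Both inputs are boolean HxW arrays where True marks pixels suspected
    to belong to dynamic objects. Here the fusion is a simple union:
    a pixel is treated as dynamic if either cue flags it.
    """
    assert flow_mask.shape == semantic_mask.shape
    return np.logical_or(flow_mask, semantic_mask)

def filter_sampled_rays(ray_pixels: np.ndarray, motion_mask: np.ndarray) -> np.ndarray:
    """Keep only sampled rays whose pixel falls in the static region.

    ray_pixels: (N, 2) integer array of (row, col) sample locations.
    Rays landing on dynamic (True) pixels are discarded.
    """
    rows, cols = ray_pixels[:, 0], ray_pixels[:, 1]
    static = ~motion_mask[rows, cols]
    return ray_pixels[static]
```

In practice the fusion rule could be more elaborate (e.g. morphological cleanup of the flow mask, or restricting the semantic mask to movable object classes), but the union captures the basic filtering behavior the abstract describes.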
Pages: 7509 - 7516
Page count: 8
Related Papers
50 records in total
  • [1] Robust and Efficient RGB-D SLAM in Dynamic Environments
    Yang, Xin
    Yuan, Zikang
    Zhu, Dongfu
    Chi, Cheng
    Li, Kun
    Liao, Chunyuan
    IEEE TRANSACTIONS ON MULTIMEDIA, 2021, 23 : 4208 - 4219
  • [2] A robust RGB-D SLAM algorithm
    Hu, Gibson
    Huang, Shoudong
    Zhao, Liang
    Alempijevic, Alen
    Dissanayake, Gamini
    2012 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS (IROS), 2012, : 1714 - 1719
  • [3] PoseFusion: Dense RGB-D SLAM in Dynamic Human Environments
    Zhang, Tianwei
    Nakamura, Yoshihiko
    PROCEEDINGS OF THE 2018 INTERNATIONAL SYMPOSIUM ON EXPERIMENTAL ROBOTICS, 2020, 11 : 772 - 780
  • [4] Dense RGB-D SLAM with Multiple Cameras
    Meng, Xinrui
    Gao, Wei
    Hu, Zhanyi
    SENSORS, 2018, 18 (07)
  • [5] Dense Visual SLAM for RGB-D Cameras
    Kerl, Christian
    Sturm, Juergen
    Cremers, Daniel
    2013 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS (IROS), 2013, : 2100 - 2106
  • [6] Robust RGB-D SLAM in Dynamic Environments for Autonomous Vehicles
    Ji, Tete
    Yuan, Shenghai
    Xie, Lihua
    2022 17TH INTERNATIONAL CONFERENCE ON CONTROL, AUTOMATION, ROBOTICS AND VISION (ICARCV), 2022, : 665 - 671
  • [7] Towards Dense Moving Object Segmentation based Robust Dense RGB-D SLAM in Dynamic Scenarios
    Wang, Youbing
    Huang, Shoudong
    2014 13TH INTERNATIONAL CONFERENCE ON CONTROL AUTOMATION ROBOTICS & VISION (ICARCV), 2014, : 1841 - 1846
  • [8] StaticFusion: Background Reconstruction for Dense RGB-D SLAM in Dynamic Environments
    Scona, Raluca
    Jaimez, Mariano
    Petillot, Yvan R.
    Fallon, Maurice
    Cremers, Daniel
    2018 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION (ICRA), 2018, : 3849 - 3856
  • [9] Speed and Memory Efficient Dense RGB-D SLAM in Dynamic Scenes
    Canovas, Bruce
    Rombaut, Michele
    Negre, Amaury
    Pellerin, Denis
    Olympieff, Serge
    2020 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS (IROS), 2020, : 4996 - 5001
  • [10] Dense RGB-D SLAM for Humanoid Robots in the Dynamic Humans Environment
    Zhang, Tianwei
    Uchiyama, Emiko
    Nakamura, Yoshihiko
    2018 IEEE-RAS 18TH INTERNATIONAL CONFERENCE ON HUMANOID ROBOTS (HUMANOIDS), 2018, : 732 - 738