KO-Fusion: Dense Visual SLAM with Tightly-Coupled Kinematic and Odometric Tracking

Cited by: 0
Authors
Houscago, Charlie [1 ]
Bloesch, Michael [1 ]
Leutenegger, Stefan [1 ]
Affiliations
[1] Imperial Coll London, Dept Comp, Imperial Coll, Dyson Robot Lab, London, England
Funding
UK Engineering and Physical Sciences Research Council (EPSRC);
Keywords
DOI
10.1109/icra.2019.8793471
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology];
Discipline Code
0812;
Abstract
Dense visual SLAM methods estimate the 3D structure of an environment and locate the observer within it. They estimate camera motion by matching visual information between consecutive frames, and are thus prone to failure under extreme motion or when observing texture-poor regions. Integrating additional sensor modalities has shown great promise in improving the robustness and accuracy of such SLAM systems. In contrast to the popular use of inertial measurements, we propose to tightly couple a dense RGB-D SLAM system with kinematic and odometric measurements from a wheeled robot equipped with a manipulator. The system runs in real time on a GPU and optimizes the camera pose by considering the geometric alignment of the map as well as kinematic and odometric data from the robot. Through real-world experiments, we show that the system is more robust on challenging trajectories featuring fast and loopy motion than the equivalent system without the additional kinematic and odometric knowledge, whilst retaining performance comparable to the equivalent RGB-D-only system on easy trajectories.
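The abstract describes optimizing the camera pose against both the geometric alignment of the map and the robot's kinematic/odometric measurements. As a loose intuition for how a tightly-coupled objective balances the confidence of each sensor, here is a minimal information-weighted least-squares fusion sketch; the function name, the planar (x, y, yaw) state, and all covariance values are illustrative assumptions, not the paper's actual GPU-based cost.

```python
import numpy as np

def fuse_pose(xi_cam, Sigma_cam, xi_odom, Sigma_odom):
    """Information-weighted least-squares fusion of two pose estimates
    (x, y, yaw). Each estimate contributes a quadratic residual weighted
    by its inverse covariance; the minimizer solves H @ xi = b."""
    W_cam = np.linalg.inv(Sigma_cam)    # information from dense visual alignment
    W_odom = np.linalg.inv(Sigma_odom)  # information from wheel odometry
    H = W_cam + W_odom                  # combined information matrix
    b = W_cam @ xi_cam + W_odom @ xi_odom
    return np.linalg.solve(H, b)

# Illustrative numbers: the camera term is confident in translation but
# noisy in yaw; wheel odometry is the reverse.
xi_cam = np.array([1.0, 0.0, 0.2])
Sigma_cam = np.diag([0.01, 0.01, 1.0])
xi_odom = np.array([0.8, 0.0, 0.0])
Sigma_odom = np.diag([0.25, 0.25, 1e-4])

fused = fuse_pose(xi_cam, Sigma_cam, xi_odom, Sigma_odom)
# Translation lands near the camera estimate, yaw near odometry.
```

The fused pose is pulled toward whichever sensor is more certain in each dimension, which is the basic reason the tightly-coupled system survives fast, loopy motion that breaks pure frame-to-model tracking.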
Pages: 4054-4060
Page count: 7
Related Papers
50 items in total
  • [31] Visual-inertial odometry based on tightly-coupled encoder
    Hu, Zhangfang
    Guo, Zhenqian
    Luo, Yuan
    Chen, Jian
    OPTOELECTRONIC IMAGING AND MULTIMEDIA TECHNOLOGY IX, 2022, 12317
  • [32] TC2LI-SLAM: A Tightly-Coupled Camera-LiDAR-Inertial SLAM System
    Tong, Yunze
    Zhang, Xuebo
    Wang, Runhua
    Song, Zhixing
    Wu, Songyang
    Zhang, Shiyong
    Wang, Youwei
    Yuan, Jing
    IEEE ROBOTICS AND AUTOMATION LETTERS, 2024, 9 (9): 7421 - 7428
  • [33] BodySLAM++: Fast and Tightly-Coupled Visual-Inertial Camera and Human Motion Tracking
    Henning, Dorian F.
    Choi, Christopher
    Schaefer, Simon
    Leutenegger, Stefan
    2023 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS, IROS, 2023, : 3781 - 3788
  • [34] LVIO-Fusion: Tightly-Coupled LiDAR-Visual-Inertial Odometry and Mapping in Degenerate Environments
    Zhang, Hongkai
    Du, Liang
    Bao, Sheng
    Yuan, Jianjun
    Ma, Shugen
    IEEE ROBOTICS AND AUTOMATION LETTERS, 2024, 9 (4): 3783 - 3790
  • [35] Tightly-coupled Fusion of Global Positional Measurements in Optimization-based Visual-Inertial Odometry
    Cioffi, Giovanni
    Scaramuzza, Davide
    2020 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS (IROS), 2020, : 5089 - 5095
  • [36] VINS-Motion: Tightly-coupled Fusion of VINS and Motion Constraint
    Yu, Zhelin
    Zhu, Lidong
    Lu, Guoyu
    2021 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION (ICRA 2021), 2021, : 7672 - 7678
  • [37] Analysis of SINS/GPS Tightly-coupled Receiver Carrier Tracking Loop
    Lv, Peng
    Lu, Mingquan
    Feng, Zhenming
    CSNC 2011: 2ND CHINA SATELLITE NAVIGATION CONFERENCE, VOLS 1-3, 2011, : 1443 - 1446
  • [38] Tightly-Coupled SLAM Integrating LiDAR and INS for Unmanned Vehicle Navigation in Campus Environments
    Zhang, Linshuai
    Wang, Qian
    Gu, Shuoxin
    Jiang, Tao
    Jiang, Shiqi
    Liu, Jiajia
    Luo, Shuang
    Yan, Gongjun
    IEEE ACCESS, 2024, 12 : 26441 - 26456
  • [39] In-Batch Negatives for Knowledge Distillation with Tightly-Coupled Teachers for Dense Retrieval
    Lin, Sheng-Chieh
    Yang, Jheng-Hong
    Lin, Jimmy
    REPL4NLP 2021: PROCEEDINGS OF THE 6TH WORKSHOP ON REPRESENTATION LEARNING FOR NLP, 2021, : 163 - 173
  • [40] Tightly-Coupled Visual-Inertial Localization and 3-D Rigid-Body Target Tracking
    Eckenhoff, Kevin
    Yang, Yulin
    Geneva, Patrick
    Huang, Guoquan
    IEEE ROBOTICS AND AUTOMATION LETTERS, 2019, 4 (02) : 1541 - 1548