KO-Fusion: Dense Visual SLAM with Tightly-Coupled Kinematic and Odometric Tracking

Cited by: 0
Authors
Houseago, Charlie [1]
Bloesch, Michael [1]
Leutenegger, Stefan [1]
Affiliations
[1] Imperial College London, Department of Computing, Dyson Robotics Lab, London, England
Funding
UK Engineering and Physical Sciences Research Council (EPSRC)
Keywords
DOI
10.1109/icra.2019.8793471
Chinese Library Classification (CLC)
TP [Automation & Computer Technology]
Discipline Code
0812
Abstract
Dense visual SLAM methods estimate the 3D structure of an environment and locate the observer within it. They estimate the motion of a camera by matching visual information between consecutive frames, and are thus prone to failure under extreme motion or when observing texture-poor regions. Integrating additional sensor modalities has shown great promise in improving the robustness and accuracy of such SLAM systems. In contrast to the popular use of inertial measurements, we propose to tightly couple a dense RGB-D SLAM system with kinematic and odometry measurements from a wheeled robot equipped with a manipulator. The system runs in real time on a GPU. It optimizes the camera pose by considering the geometric alignment of the map as well as kinematic and odometric data from the robot. Through real-world experiments, we show that the system is more robust to challenging trajectories featuring fast and loopy motion than the equivalent system without the additional kinematic and odometric knowledge, while retaining performance comparable to the equivalent RGB-D-only system on easy trajectories.
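The tightly-coupled tracking described in the abstract jointly optimizes the camera pose against the dense map and the robot's odometry. A minimal illustrative sketch of such a joint cost (not the paper's actual energy function, which also includes kinematic terms and runs on GPU), assuming a point-to-plane geometric residual and a simple odometry translation prior with hypothetical weights `w_geo` and `w_odom`:

```python
import numpy as np

def joint_cost(pose, map_points, map_normals, cam_points, odom_pose,
               w_geo=1.0, w_odom=0.1):
    """Toy joint cost for tightly-coupled tracking: geometric
    point-to-plane alignment plus an odometry pose prior.
    `pose` and `odom_pose` are 4x4 homogeneous camera poses;
    point/normal arrays are Nx3 and assumed pre-associated."""
    # Geometric term: point-to-plane residuals of camera points
    # transformed into the world frame against map points/normals.
    pts_w = (pose[:3, :3] @ cam_points.T).T + pose[:3, 3]
    r_geo = np.einsum('ij,ij->i', pts_w - map_points, map_normals)
    # Odometry term: penalize deviation from the wheel-odometry pose
    # (translation part only, to keep the sketch short).
    r_odom = pose[:3, 3] - odom_pose[:3, 3]
    return w_geo * np.sum(r_geo ** 2) + w_odom * np.sum(r_odom ** 2)
```

In a real system this cost would be minimized over the pose (e.g. with Gauss-Newton on the SE(3) manifold) every frame; the odometry term keeps tracking constrained when the geometric term degenerates, e.g. in texture-poor regions.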
Pages: 4054-4060
Page count: 7
Related papers
50 records
  • [11] PLI-SLAM: A Tightly-Coupled Stereo Visual-Inertial SLAM System with Point and Line Features
    Teng, Zhaoyu
    Han, Bin
    Cao, Jie
    Hao, Qun
    Tang, Xin
    Li, Zhaoyang
    REMOTE SENSING, 2023, 15 (19)
  • [12] A Visual SLAM With Tightly Coupled Integration of Multiobject Tracking for Production Workshop
    Gou, Rongsong
    Chen, Guangzhu
    Pu, Xin
    Liao, Xiaojuan
    Chen, Runji
    IEEE INTERNET OF THINGS JOURNAL, 2024, 11 (11): : 19949 - 19962
  • [13] Tightly-coupled ultra-wideband-aided monocular visual SLAM with degenerate anchor configurations
    Thien Hoang Nguyen
    Thien-Minh Nguyen
    Xie, Lihua
    AUTONOMOUS ROBOTS, 2020, 44 (08) : 1519 - 1534
  • [15] Tightly-Coupled Visual-DVL Fusion For Accurate Localization of Underwater Robots
    Huang, Yupei
    Li, Peng
    Yan, Shuaizheng
    Ou, Yaming
    Wu, Zhengxing
    Tan, Min
    Yu, Junzhi
    2023 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS (IROS), 2023, : 8090 - 8095
  • [16] Detection-first tightly-coupled LiDAR-Visual-Inertial SLAM in dynamic environments
    Xu, Xiaobin
    Hu, Jinchao
    Zhang, Lei
    Cao, Chenfei
    Yang, Jian
    Ran, Yingying
    Tan, Zhiying
    Xu, Linsen
    Luo, Minzhou
    MEASUREMENT, 2025, 239
  • [17] GVIL: Tightly-Coupled GNSS PPP/Visual/INS/LiDAR SLAM Based on Graph Optimization
    Liao J.
    Li X.
    Feng S.
    Wuhan Daxue Xuebao (Xinxi Kexue Ban)/Geomatics and Information Science of Wuhan University, 2023, 48 (07): : 1204 - 1215
  • [18] IMM-SLAMMOT: Tightly-Coupled SLAM and IMM-Based Multi-Object Tracking
    Ying, Zhuoye
    Li, Hao
    IEEE TRANSACTIONS ON INTELLIGENT VEHICLES, 2024, 9 (02): : 3964 - 3974
  • [19] A Visual-Inertial Dynamic Object Tracking SLAM Tightly Coupled System
    Zhang, Hanxuan
    Wang, Dingyi
    Huo, Ju
    IEEE SENSORS JOURNAL, 2023, 23 (17) : 19905 - 19917
  • [20] P3-VINS: Tightly-Coupled PPP/INS/Visual SLAM Based on Optimization Approach
    Li, Tao
    Pei, Ling
    Xiang, Yan
    Yu, Wenxian
    Truong, Trieu-Kien
    IEEE ROBOTICS AND AUTOMATION LETTERS, 2022, 7 (03): : 7021 - 7027