KO-Fusion: Dense Visual SLAM with Tightly-Coupled Kinematic and Odometric Tracking

Cited by: 0
Authors
Houseago, Charlie [1]
Bloesch, Michael [1]
Leutenegger, Stefan [1]
Affiliations
[1] Imperial College London, Dept. of Computing, Dyson Robotics Lab, London, England
Funding
UK Engineering and Physical Sciences Research Council (EPSRC)
Keywords
DOI
10.1109/icra.2019.8793471
CLC Classification Number
TP [Automation Technology, Computer Technology]
Subject Classification Code
0812
Abstract
Dense visual SLAM methods estimate the 3D structure of an environment and locate the observer within it. They estimate camera motion by matching visual information between consecutive frames, and are thus prone to failure under extreme motion or when observing texture-poor regions. Integrating additional sensor modalities has shown great promise in improving the robustness and accuracy of such SLAM systems. In contrast to the popular use of inertial measurements, we propose to tightly couple a dense RGB-D SLAM system with kinematic and odometry measurements from a wheeled robot equipped with a manipulator. The system runs in real time on a GPU, and optimizes the camera pose by considering the geometric alignment of the map as well as kinematic and odometric data from the robot. Through real-world experiments, we show that the system is more robust on challenging trajectories featuring fast and loopy motion than the equivalent system without the additional kinematic and odometric knowledge, whilst retaining comparable performance to the equivalent RGB-D-only system on easy trajectories.
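The tightly-coupled formulation the abstract describes — jointly optimizing the camera pose against both the geometric alignment of the map and an odometric prior — can be illustrated with a minimal sketch. This is not the authors' implementation: it is a toy 2D Gauss-Newton solver with hypothetical names (`joint_pose_estimate`, `w_odo`), in which simple point-to-point alignment stands in for dense geometric alignment and a wheel-odometry pose estimate enters as an extra weighted residual in the same least-squares problem.

```python
import numpy as np

def se2(x, y, th):
    """Build a 2D rigid-body transform (rotation + translation)."""
    c, s = np.cos(th), np.sin(th)
    return np.array([[c, -s, x], [s, c, y], [0.0, 0.0, 1.0]])

def transform(T, pts):
    """Apply an SE(2) transform to an (N, 2) array of points."""
    return (T[:2, :2] @ pts.T).T + T[:2, 2]

def joint_pose_estimate(src, dst, odo_pose, w_odo=1.0, iters=20):
    """Gauss-Newton over pose p = (x, y, theta), minimizing
    |R(theta) src + t - dst|^2  +  w_odo * |p - odo_pose|^2,
    i.e. geometric alignment tightly coupled with an odometry prior."""
    odo = np.asarray(odo_pose, dtype=float)
    p = odo.copy()  # initialize from odometry, as a real system might
    for _ in range(iters):
        x, y, th = p
        c, s = np.cos(th), np.sin(th)
        R = np.array([[c, -s], [s, c]])
        # Geometric residuals: transformed source minus target points.
        pred = (R @ src.T).T + np.array([x, y])
        r_geo = (pred - dst).ravel()
        # Jacobian of each 2D point residual w.r.t. (x, y, theta).
        J = np.zeros((2 * len(src), 3))
        J[0::2, 0] = 1.0
        J[1::2, 1] = 1.0
        dR = np.array([[-s, -c], [c, -s]])  # dR/dtheta
        dpt = (dR @ src.T).T
        J[0::2, 2] = dpt[:, 0]
        J[1::2, 2] = dpt[:, 1]
        # Odometry residual: deviation of the pose from the wheel estimate.
        r_odo = np.sqrt(w_odo) * (p - odo)
        J_odo = np.sqrt(w_odo) * np.eye(3)
        # Stack both residual sets and take one Gauss-Newton step.
        J_full = np.vstack([J, J_odo])
        r_full = np.concatenate([r_geo, r_odo])
        dp = np.linalg.lstsq(J_full, -r_full, rcond=None)[0]
        p = p + dp
        if np.linalg.norm(dp) < 1e-10:
            break
    return p
```

The weight `w_odo` plays the role of the relative uncertainty between the two sensor modalities: a large value trusts the wheel odometry (useful when the camera sees texture-poor regions), while a small value lets the geometric alignment dominate.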
Pages: 4054 - 4060
Page count: 7
Related Papers
50 items in total
  • [1] Tightly-Coupled Monocular Visual-Odometric SLAM Using Wheels and a MEMS Gyroscope
    Quan, Meixiang
    Piao, Songhao
    Tan, Minglang
    Huang, Shi-Sheng
    IEEE ACCESS, 2019, 7 : 97374 - 97389
  • [2] Tightly-coupled fusion of iGPS measurements in optimization-based visual SLAM
    Yang, Ze
    Li, Yanyan
    Lin, Jiarui
    Sun, Yanbiao
    Zhu, Jigui
    OPTICS EXPRESS, 2023, 31 (04) : 5910 - 5926
  • [3] Visual-Inertial SLAM with Tightly-Coupled Dropout-Tolerant GPS Fusion
    Boche, Simon
    Zuo, Xingxing
    Schaefer, Simon
    Leutenegger, Stefan
    2022 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS (IROS), 2022, : 7020 - 7027
  • [4] Efficient and Accurate Tightly-Coupled Visual-Lidar SLAM
    Chou, Chih-Chung
    Chou, Cheng-Fu
    IEEE TRANSACTIONS ON INTELLIGENT TRANSPORTATION SYSTEMS, 2022, 23 (09) : 14509 - 14523
  • [5] Tightly-coupled SLAM algorithm integrating LiDAR/IMU/vehicle kinematic constraints
    Yang, Xiujian
    Yan, Shaoxiang
    Huang, Jialong
    Zhongguo Guanxing Jishu Xuebao/Journal of Chinese Inertial Technology, 2024, 32 (06) : 547 - 554
  • [6] DynaSLAM II: Tightly-Coupled Multi-Object Tracking and SLAM
    Bescos, Berta
    Campos, Carlos
    Tardos, Juan D.
    Neira, Jose
    IEEE ROBOTICS AND AUTOMATION LETTERS, 2021, 6 (03) : 5191 - 5198
  • [7] Sensor Synchronization for Android Phone Tightly-Coupled Visual-Inertial SLAM
    Feng, Zheyu
    Li, Jianwen
    Dai, Taogao
    CHINA SATELLITE NAVIGATION CONFERENCE (CSNC) 2018 PROCEEDINGS, VOL III, 2018, 499 : 601 - 612
  • [8] DiT-SLAM: Real-Time Dense Visual-Inertial SLAM with Implicit Depth Representation and Tightly-Coupled Graph Optimization
    Zhao, Mingle
    Zhou, Dingfu
    Song, Xibin
    Chen, Xiuwan
    Zhang, Liangjun
    SENSORS, 2022, 22 (09)
  • [9] VIP-SLAM: An Efficient Tightly-Coupled RGB-D Visual Inertial Planar SLAM
    Chen, Danpeng
    Wang, Shuai
    Xie, Weijian
    Zhai, Shangjin
    Wang, Nan
    Bao, Hujun
    Zhang, Guofeng
    2022 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION (ICRA 2022), 2022, : 5615 - 5621
  • [10] Tightly-coupled stereo visual-inertial-LiDAR SLAM based on graph optimization
    Wang X.
    Li X.
    Liao J.
    Feng S.
    Li S.
    Zhou Y.
    Cehui Xuebao/Acta Geodaetica et Cartographica Sinica, 2022, 51 (08) : 1744 - 1756