KO-Fusion: Dense Visual SLAM with Tightly-Coupled Kinematic and Odometric Tracking

Cited by: 0
Authors:
Houscago, Charlie [1 ]
Bloesch, Michael [1 ]
Leutenegger, Stefan [1 ]
Affiliation:
[1] Dyson Robotics Lab, Department of Computing, Imperial College London, London, England
Funding:
UK Engineering and Physical Sciences Research Council (EPSRC)
DOI: 10.1109/icra.2019.8793471
Chinese Library Classification (CLC): TP [Automation Technology, Computer Technology]
Subject Classification Code: 0812
Abstract:
Dense visual SLAM methods are able to estimate the 3D structure of an environment and locate the observer within it. They estimate the motion of a camera by matching visual information between consecutive frames, and are thus prone to failure under extreme motion conditions or when observing texture-poor regions. The integration of additional sensor modalities has shown great promise in improving the robustness and accuracy of such SLAM systems. In contrast to the popular use of inertial measurements, we propose to tightly couple a dense RGB-D SLAM system with kinematic and odometric measurements from a wheeled robot equipped with a manipulator. The system runs in real time on a GPU. It optimizes the camera pose by considering the geometric alignment of the map as well as kinematic and odometric data from the robot. Through real-world experiments, we show that the system is more robust than the equivalent system without the additional kinematic and odometric knowledge on challenging trajectories featuring fast and loopy motion, whilst retaining comparable performance to the equivalent RGB-D-only system on easy trajectories.
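As a rough illustration of how such tight coupling can be formulated (a sketch under assumed notation; the paper's exact residuals and weighting are not given in this record), the per-frame camera pose T_{WC_k} could be estimated by jointly minimising a dense geometric alignment term against the map and a relative-motion term built from wheel odometry and the manipulator's forward kinematics:

E(T_{WC_k}) \;=\; \sum_i \rho\!\left( \mathbf{n}_i^{\top}\!\left( T_{WC_k}\,\mathbf{p}_i - \mathbf{q}_i \right) \right)
\;+\; \left\| \log\!\left( \hat{T}_{B_{k-1}B_k}^{-1}\; T_{WB_{k-1}}^{-1}\, T_{WC_k}\, T_{CB}(\boldsymbol{\theta}_k) \right)^{\vee} \right\|_{\Sigma}^{2}

Here the first term is a point-to-plane alignment of the current depth frame against the map (points p_i with map correspondences q_i, normals n_i, and robust kernel rho); \hat{T}_{B_{k-1}B_k} is the base motion predicted by the wheel odometry; T_{CB}(theta_k) is the camera-to-base transform given by the arm's forward kinematics at joint configuration theta_k, so that T_{WB_k} = T_{WC_k} T_{CB}(theta_k); and Sigma weights the kinematic-odometric residual against the dense term. All symbols and weights here are illustrative assumptions, not the authors' published formulation.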
Pages: 4054-4060
Number of pages: 7
Related Papers (50 in total)
  • [21] Tightly-Coupled Monocular Visual-Inertial Fusion for Autonomous Flight of Rotorcraft MAVs
    Shen, Shaojie
    Michael, Nathan
    Kumar, Vijay
    2015 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION (ICRA), 2015: 5303-5310
  • [22] A tightly-coupled LIDAR-IMU SLAM method for quadruped robots
    Zhou, Zhifeng
    Zhang, Chunyan
    Li, Chenchen
    Zhang, Yi
    Shi, Yun
    Zhang, Wei
    MEASUREMENT & CONTROL, 2024, 57 (07): 1004-1013
  • [23] Keyframe-Based Tightly-Coupled SLAM with RGBD Camera and IMU
    Zhang, Yiming
    Li, Kui
    Wang, Wei
    PROCEEDINGS OF THE 2016 4TH INTERNATIONAL CONFERENCE ON MACHINERY, MATERIALS AND INFORMATION TECHNOLOGY APPLICATIONS, 2016, 71: 481-488
  • [24] A Tightly-coupled Semantic SLAM System with Visual, Inertial and Surround-view Sensors for Autonomous Indoor Parking
    Shao, Xuan
    Zhang, Lin
    Zhang, Tianjun
    Shen, Ying
    Li, Hongyu
    Zhou, Yicong
    MM '20: PROCEEDINGS OF THE 28TH ACM INTERNATIONAL CONFERENCE ON MULTIMEDIA, 2020: 2691-2699
  • [25] Tightly-Coupled Model Aided Visual-Inertial Fusion for Quadrotor Micro Air Vehicles
    Abeywardena, Dinuka
    Dissanayake, Gamini
    FIELD AND SERVICE ROBOTICS, 2015, 105: 153-166
  • [26] Tightly-Coupled Magneto-Visual-Inertial Fusion for Long Term Localization in Indoor Environment
    Coulin, Jade
    Guillemard, Richard
    Gay-Bellile, Vincent
    Joly, Cyril
    de la Fortelle, Arnaud
    IEEE ROBOTICS AND AUTOMATION LETTERS, 2022, 7 (02): 952-959
  • [27] Tightly-Coupled Visual-Inertial-Pressure Fusion Using Forward and Backward IMU Preintegration
    Hu, Chao
    Zhu, Shiqiang
    Liang, Yiming
    Song, Wei
    IEEE ROBOTICS AND AUTOMATION LETTERS, 2022, 7 (03): 6790-6797
  • [28] Hybrid-VINS: Underwater Tightly Coupled Hybrid Visual Inertial Dense SLAM for AUV
    Ou, Yaming
    Fan, Junfeng
    Zhou, Chao
    Zhang, Pengju
    Hou, Zeng-Guang
    IEEE TRANSACTIONS ON INDUSTRIAL ELECTRONICS, 2025, 72 (03): 2821-2831
  • [29] A tightly-coupled dense monocular Visual-Inertial Odometry system with lightweight depth estimation network
    Wang, Xin
    Zhang, Zuoming
    Li, Luchen
    APPLIED SOFT COMPUTING, 2025, 171
  • [30] Tightly-Coupled Fusion of VINS and Motion Constraint for Autonomous Vehicle
    Yu, Zhelin
    Zhu, Lidong
    Lu, Guoyu
    IEEE TRANSACTIONS ON VEHICULAR TECHNOLOGY, 2022, 71 (06): 5799-5810