Visual-Inertial Mapping With Non-Linear Factor Recovery

Cited by: 136
Authors
Usenko, Vladyslav [1 ]
Demmel, Nikolaus [1 ]
Schubert, David [1 ]
Stueckler, Joerg [2 ]
Cremers, Daniel [1 ]
Affiliations
[1] Tech Univ Munich, D-80333 Munich, Germany
[2] MPI Intelligent Syst, D-72076 Tubingen, Germany
Keywords
Simultaneous localization and mapping; sensor fusion; Kalman filter
DOI
10.1109/LRA.2019.2961227
CLC number
TP24 [Robotics]
Discipline codes
080202; 1405
Abstract
Cameras and inertial measurement units are complementary sensors for ego-motion estimation and environment mapping. Their combination makes visual-inertial odometry (VIO) systems more accurate and robust. For globally consistent mapping, however, combining visual and inertial information is not straightforward. To estimate motion and geometry from a set of images, large baselines are required. Because of that, most systems operate on keyframes that have large time intervals between each other. Inertial data, on the other hand, quickly degrades over such intervals; after several seconds of integration, it typically contains little useful information. In this letter, we propose to extract relevant information for visual-inertial mapping from visual-inertial odometry using non-linear factor recovery. We reconstruct a set of non-linear factors that optimally approximate the information on the trajectory accumulated by VIO. To obtain a globally consistent map, we combine these factors with loop-closing constraints using bundle adjustment. The VIO factors make the roll and pitch angles of the global map observable and improve the robustness and accuracy of the mapping. In experiments on a public benchmark, we demonstrate superior performance of our method over state-of-the-art approaches.
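The abstract's core idea is that marginalizing intermediate VIO states produces dense information between the remaining keyframe states, which factor recovery then approximates with a sparse set of relative factors. The toy sketch below (not the paper's implementation; the scalar states, information values, and variable names are illustrative assumptions) shows how Schur-complement marginalization of a middle state creates exactly the kind of off-diagonal coupling that a recovered relative factor would encode:

```python
import numpy as np

# Toy Gaussian factor graph over three scalar states x0, x1, x2:
# a weak prior on x0 (information 0.1) plus unit-information
# odometry factors between (x0, x1) and (x1, x2).
H = np.array([
    [1.1, -1.0,  0.0],   # prior on x0 + odometry x0-x1
    [-1.0, 2.0, -1.0],   # odometry x0-x1 and x1-x2
    [0.0, -1.0,  1.0],   # odometry x1-x2
])

# Marginalize x1 via the Schur complement onto the kept states {x0, x2}.
keep, marg = [0, 2], [1]
H_kk = H[np.ix_(keep, keep)]
H_km = H[np.ix_(keep, marg)]
H_mm = H[np.ix_(marg, marg)]
H_marg = H_kk - H_km @ np.linalg.inv(H_mm) @ H_km.T

# Before marginalization x0 and x2 were not directly coupled (H[0,2] == 0);
# afterwards H_marg has a non-zero off-diagonal term, i.e. an effective
# relative factor between the kept states.
print(H_marg)  # ≈ [[0.6, -0.5], [-0.5, 0.5]]
```

Non-linear factor recovery generalizes this linear picture: it fits a small set of non-linear factors whose combined information best matches the dense marginal accumulated by VIO (including, per the abstract, factors that keep the map's roll and pitch observable), so the subsequent global bundle adjustment stays sparse.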
Pages: 422-429 (8 pages)
Related papers
50 records in total
  • [21] Superpixels Using Binary Images for Monocular Visual-Inertial Dense Mapping
    Yathirajam, Bharadwaja
    Sevoor Meenakshisundaram, Vaitheeswaran
    Challaghatta Muniyappa, Ananda
JOURNAL OF SIGNAL PROCESSING SYSTEMS FOR SIGNAL IMAGE AND VIDEO TECHNOLOGY, 2022, 94 (12) : 1485 - 1505
  • [22] Cooperative Visual-Inertial Odometry
    Zhu, Pengxiang
    Yang, Yulin
    Ren, Wei
    Huang, Guoquan
    2021 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION (ICRA 2021), 2021, : 13135 - 13141
  • [23] A Linear-Complexity EKF for Visual-Inertial Navigation with Loop Closures
    Geneva, Patrick
    Eckenhoff, Kevin
    Huang, Guoquan
    2019 INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION (ICRA), 2019, : 3535 - 3541
  • [24] GNSS Reconstrainted Visual-Inertial Odometry System Using Factor Graphs
    Chen, Yu
    Xu, Bo
    Wang, Bin
    Na, Jiaming
    Yang, Pei
    IEEE GEOSCIENCE AND REMOTE SENSING LETTERS, 2023, 20
  • [25] ON NON-LINEAR INTERACTION OF INERTIAL MODES
    GREENSPAN, HP
    JOURNAL OF FLUID MECHANICS, 1969, 36 : 257 - +
  • [26] Robust Visual-Inertial Odometry Based on a Kalman Filter and Factor Graph
    Wang, Zhiwei
    Pang, Bao
    Song, Yong
    Yuan, Xianfeng
    Xu, Qingyang
    Li, Yibin
    IEEE TRANSACTIONS ON INTELLIGENT TRANSPORTATION SYSTEMS, 2023, 24 (07) : 7048 - 7060
  • [28] Model-Aided Monocular Visual-Inertial State Estimation and Dense Mapping
    Qiu, Kejie
    Shen, Shaojie
    2017 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS (IROS), 2017, : 1783 - 1789
  • [29] Three Tiered Visual-Inertial Tracking and Mapping for Augmented Reality in Urban Settings
    Calloway, Thomas
    Megherbi, Dalila B.
    2020 IEEE INTERNATIONAL CONFERENCE ON COMPUTATIONAL INTELLIGENCE AND VIRTUAL ENVIRONMENTS FOR MEASUREMENT SYSTEMS AND APPLICATIONS (CIVEMSA 2020), 2020,
  • [30] On Data Sharing Strategy for Decentralized Collaborative Visual-Inertial Simultaneous Localization And Mapping
    Dubois, Rodolphe
    Eudes, Alexandre
    Fremont, Vincent
    2019 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS (IROS), 2019, : 2123 - 2130