Lightweight hybrid visual-inertial odometry with closed-form zero velocity update

Cited by: 0
Authors
QIU Xiaochen [1,2]
ZHANG Hai [1,2]
FU Wenxing [3]
Affiliations
[1] School of Automation Science and Electrical Engineering, Beihang University
[2] Science and Technology on Aircraft Control Laboratory
[3] Science and Technology on Complex System Control and Intelligent Agent Cooperation
Keywords: (not listed)
DOI: not available
CLC numbers: TP212 [Transmitters (transducers), sensors]; TN713 [Filtering techniques, filters]
Discipline code: 080202
Abstract
Visual-Inertial Odometry (VIO) fuses measurements from a camera and an Inertial Measurement Unit (IMU) to achieve accumulative performance that is better than using either sensor individually. Hybrid VIO is an extended Kalman filter-based solution which augments features with long tracking length into the state vector of the Multi-State Constraint Kalman Filter (MSCKF). In this paper, a novel hybrid VIO is proposed, which focuses on utilizing low-cost sensors while considering both computational efficiency and positioning precision. The proposed algorithm introduces several novel contributions. First, by deducing an analytical error transition equation, one-dimensional inverse depth parametrization is utilized to parametrize the augmented feature state. This modification is shown to significantly improve computational efficiency and numerical robustness, and as a result achieves higher precision. Second, for better handling of static scenes, a novel closed-form Zero velocity UPdaTe (ZUPT) method is proposed. ZUPT is modeled as a measurement update for the filter rather than crudely halting propagation, which has the advantage of correcting the overall state through the correlations in the filter covariance matrix. Furthermore, online spatial and temporal calibration is also incorporated. Experiments are conducted on both a public dataset and real data. The results demonstrate the effectiveness of the proposed solution by showing that its performance surpasses both the baseline and state-of-the-art algorithms in terms of efficiency and precision. The related software is open-sourced to benefit the community.
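As a concrete illustration of the ZUPT idea described in the abstract (zero velocity treated as a pseudo-measurement rather than a propagation freeze), the following Python sketch shows a generic EKF measurement update against a zero-velocity observation. It is not the authors' implementation: the 9-state error-state layout, the noise level sigma_zupt, and the function name zupt_update are illustrative assumptions only; in the paper the update would act on the full MSCKF state.

    # Minimal sketch, assuming a 9-dimensional error state [attitude(3), velocity(3), position(3)].
    # NOT the paper's implementation; state layout, noise values and names are hypothetical.
    import numpy as np

    VEL = slice(3, 6)  # assumed location of the velocity block in the state vector

    def zupt_update(x, P, sigma_zupt=0.01):
        """Treat 'velocity = 0' as a pseudo-measurement z = v + n, n ~ N(0, R)."""
        H = np.zeros((3, x.size))
        H[:, VEL] = np.eye(3)                  # measurement picks out the velocity block
        R = (sigma_zupt ** 2) * np.eye(3)      # pseudo-measurement noise (assumed value)
        z = np.zeros(3)                        # zero-velocity "observation"

        y = z - H @ x                          # innovation
        S = H @ P @ H.T + R                    # innovation covariance
        K = P @ H.T @ np.linalg.inv(S)         # Kalman gain over the FULL state
        x_new = x + K @ y
        P_new = (np.eye(x.size) - K @ H) @ P
        return x_new, P_new

    if __name__ == "__main__":
        # Toy usage: a correlated covariance lets ZUPT also nudge attitude and position.
        x = np.zeros(9)
        x[VEL] = np.array([0.05, -0.02, 0.01])  # small spurious velocity drift
        P = 0.1 * np.eye(9) + 0.02              # off-diagonal terms model cross-correlation
        x_upd, P_upd = zupt_update(x, P)
        print("velocity before:", x[VEL], "after:", x_upd[VEL])

Because the gain K = P H^T S^{-1} uses the full covariance, a nonzero velocity innovation also corrects the other states wherever P carries cross-correlations, which is the advantage the abstract claims over simply forbidding propagation.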
Pages: 3344 - 3359
Number of pages: 16
Related papers
50 records in total (items 21-30 shown)
  • [21] Lightweight omnidirectional visual-inertial odometry for MAVs based on improved keyframe tracking and marginalization
    Gao, Bo
    Lian, Baowang
    Tang, Chengkai
    TELECOMMUNICATION SYSTEMS, 2024, 87 (03) : 723 - 730
  • [22] LRPL-VIO: A Lightweight and Robust Visual-Inertial Odometry with Point and Line Features
    Zheng, Feixiang
    Zhou, Lu
    Lin, Wanbiao
    Liu, Jingyang
    Sun, Lei
    SENSORS, 2024, 24 (04)
  • [23] Direct Visual-Inertial Odometry with Stereo Cameras
    Usenko, Vladyslav
    Engel, Jakob
    Stueckler, Joerg
    Cremers, Daniel
    2016 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION (ICRA), 2016, : 1885 - 1892
  • [24] Visual-Inertial Odometry with Point and Line Features
    Yang, Yulin
    Geneva, Patrick
    Eckenhoff, Kevin
    Huang, Guoquan
    2019 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS (IROS), 2019, : 2447 - 2454
  • [25] The First Attempt of SAR Visual-Inertial Odometry
    Liu, Junbin
    Qiu, Xiaolan
    Ding, Chibiao
IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING, 2021, 59 (01) : 287 - 304
  • [26] Monocular Visual-Inertial Odometry for Agricultural Environments
    Song, Kaiyu
    Li, Jingtao
    Qiu, Run
    Yang, Gaidi
    IEEE ACCESS, 2022, 10 : 103975 - 103986
  • [27] ATVIO: ATTENTION GUIDED VISUAL-INERTIAL ODOMETRY
    Liu, Li
    Li, Ge
    Li, Thomas H.
    2021 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP 2021), 2021, : 4125 - 4129
  • [28] Aerial Visual-Inertial Odometry Performance Evaluation
    Carson, Daniel J.
    Raquet, John F.
    Kauffman, Kyle J.
    PROCEEDINGS OF THE ION 2017 PACIFIC PNT MEETING, 2017, : 137 - 154
  • [29] Pose estimation by Omnidirectional Visual-Inertial Odometry
    Ramezani, Milad
    Khoshelham, Kourosh
    Fraser, Clive
    ROBOTICS AND AUTONOMOUS SYSTEMS, 2018, 105 : 26 - 37
  • [30] Challenges of Dynamic Environment for Visual-Inertial Odometry
    Zhu, Tao
    Ma, Huimin
    2018 3RD INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION ENGINEERING (ICRAE), 2018, : 82 - 86