Cubic B-Spline-Based Feature Tracking for Visual-Inertial Odometry With Event Camera

Times Cited: 0
Authors
Liu, Xinghua [1 ]
Xue, Hanjun [1 ]
Gao, Xiang [1 ]
Liu, Han [2 ]
Chen, Badong [3 ]
Ge, Shuzhi Sam [4 ]
Affiliations
[1] Xian Univ Technol, Sch Elect Engn, Xian 710048, Peoples R China
[2] Xian Univ Technol, Sch Automat & Informat Engn, Xian 710048, Peoples R China
[3] Xi An Jiao Tong Univ, Sch Elect & Informat Engn, Xian 710049, Peoples R China
[4] Natl Univ Singapore, Sch Elect & Comp Engn, Singapore 117583, Singapore
Funding
National Natural Science Foundation of China
Keywords
Cubic B-spline; dynamic and active-pixel vision sensor (DAVIS) camera; inertial measurement unit (IMU) data; trajectory estimation; visual-inertial odometry (VIO); OBSERVABILITY ANALYSIS; ROBUST; IMU; VERSATILE; SLAM;
DOI
10.1109/TIM.2023.3325508
Chinese Library Classification (CLC)
TM [Electrical Technology]; TN [Electronic Technology, Communication Technology]
Discipline Codes
0808; 0809
Abstract
It is challenging to obtain accurate trajectories with standard-camera visual odometry (VO) in environments with weak textures and illumination changes. This article introduces a novel approach, cubic B-spline-based visual-inertial odometry (CB-VIO), that uses the dynamic and active-pixel vision sensor (DAVIS) camera. In the proposed CB-VIO method, a matching mechanism between images and events is designed to improve the success rate of event tracking; the resulting template points from the events are then used to construct a cubic B-spline-based event-tracking model on SE(3) within a continuous spatiotemporal window. Because this tracking model can interpolate poses at any time instant, an inertial measurement unit (IMU) measurement model is built on top of it to fuse data from asynchronous and synchronous sensors running at different rates. Compared with spline-based visual-inertial odometry (Spline-VIO) and event-based VO (EVO), the proposed continuous spatiotemporal window effectively resolves the data-association problem of EVO and the fixed-interval continuous-time trajectory limitation of Spline-VIO. Experimental results on public datasets covering multiple scenes demonstrate the superior accuracy and robustness of CB-VIO (translation error ≤ 1.3% and rotation error ≤ 2°).
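The abstract describes a continuous-time trajectory model built from a cumulative cubic B-spline over control poses, which lets the method interpolate a pose at any time instant and thereby fuse asynchronous events with IMU measurements. The sketch below is only an illustration of that general idea, not the paper's implementation: it blends rotation on SO(3) and translation in R^3 separately (a common simplification of the SE(3) spline used in CB-VIO), and the control poses, function names, and the Lovegrove-style uniform-knot cumulative basis matrix are assumptions introduced here for clarity.

```python
# Minimal sketch of cumulative cubic B-spline pose interpolation (split SO(3) x R^3),
# assuming uniform knot spacing and four consecutive control poses.
import numpy as np
from scipy.spatial.transform import Rotation as R

# Cumulative blending matrix for a uniform cubic B-spline: B~(u) = C_TILDE @ [1, u, u^2, u^3].
C_TILDE = (1.0 / 6.0) * np.array([
    [6.0, 0.0,  0.0,  0.0],
    [5.0, 3.0, -3.0,  1.0],
    [1.0, 3.0,  3.0, -2.0],
    [0.0, 0.0,  0.0,  1.0],
])

def cumulative_basis(u: float) -> np.ndarray:
    """Return the four cumulative blending weights B~_0..B~_3 at u in [0, 1]."""
    return C_TILDE @ np.array([1.0, u, u * u, u ** 3])

def interpolate_pose(rotations, translations, u):
    """Interpolate a pose at normalized time u from four consecutive control poses.

    rotations    : list of four scipy Rotation objects (control orientations)
    translations : list of four (3,) numpy arrays (control positions)
    """
    b = cumulative_basis(u)            # b[0] is always 1
    rot = rotations[0]
    trans = translations[0].copy()
    for j in range(1, 4):
        # Relative motion between consecutive control poses, scaled by the weight b[j].
        delta_rotvec = (rotations[j - 1].inv() * rotations[j]).as_rotvec()
        rot = rot * R.from_rotvec(b[j] * delta_rotvec)
        trans = trans + b[j] * (translations[j] - translations[j - 1])
    return rot, trans

# Usage with made-up control poses spanning one knot interval:
ctrl_R = [R.from_euler("z", a, degrees=True) for a in (0, 10, 20, 30)]
ctrl_t = [np.array([float(k), 0.0, 0.0]) for k in range(4)]
rot_u, trans_u = interpolate_pose(ctrl_R, ctrl_t, u=0.5)
print(rot_u.as_euler("zyx", degrees=True), trans_u)
```

Evaluating this interpolant at the timestamps of individual events or IMU samples is what makes a rate-agnostic, continuous-time fusion of the two sensor streams possible; the actual CB-VIO formulation parameterizes the spline on SE(3) rather than splitting rotation and translation.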
Pages: 15
Related Papers
50 records in total
  • [1] Balancing the Budget: Feature Selection and Tracking for Multi-Camera Visual-Inertial Odometry
    Zhang, Lintong
    Wisth, David
    Camurri, Marco
    Fallon, Maurice
    IEEE ROBOTICS AND AUTOMATION LETTERS, 2022, 7 (02) : 1182 - 1189
  • [2] Stereo Event-Based Visual-Inertial Odometry
    Wang, Kunfeng
    Zhao, Kaichun
    Lu, Wenshuai
    You, Zheng
    SENSORS, 2025, 25 (03)
  • [3] Event-based feature tracking in a visual inertial odometry framework
    Ribeiro-Gomes, Jose
    Gaspar, Jose
    Bernardino, Alexandre
    FRONTIERS IN ROBOTICS AND AI, 2023, 10
  • [4] Continuous-Time Spline Visual-Inertial Odometry
    Mo, Jiawei
    Sattar, Junaed
    2022 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION, ICRA 2022, 2022, : 9492 - 9498
  • [5] ESVIO: Event-Based Stereo Visual-Inertial Odometry
    Liu, Zhe
    Shi, Dianxi
    Li, Ruihao
    Yang, Shaowu
    SENSORS, 2023, 23 (04)
  • [6] Contrast Maximization-Based Feature Tracking for Visual Odometry with an Event Camera
    Gao, Xiang
    Xue, Hanjun
    Liu, Xinghua
    PROCESSES, 2022, 10 (10)
  • [7] A fast initialization method of Visual-Inertial Odometry based on monocular camera
    Huang, Lixiao
    Pan, Shuguo
    Wang, Shuai
    Zeng, Pan
    Ye, Fei
    PROCEEDINGS OF 5TH IEEE CONFERENCE ON UBIQUITOUS POSITIONING, INDOOR NAVIGATION AND LOCATION-BASED SERVICES (UPINLBS), 2018, : 70 - 74
  • [8] Fast Visual-Inertial Odometry with Adaptive Feature Coupling
    Ma, Zekun
    Xiao, Jiazheng
    Wang, Fei
    Jiang, Peilin
    INTELLIGENT ROBOTICS AND APPLICATIONS, ICIRA 2024, PT VI, 2025, 15206 : 171 - 186
  • [9] Dense Visual-Inertial Odometry for Tracking of Aggressive Motions
    Ling, Yonggen
    Shen, Shaojie
    2015 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND BIOMIMETICS (ROBIO), 2015, : 576 - 583
  • [10] Continuous-Time Visual-Inertial Odometry for Event Cameras
    Mueggler, Elias
    Gallego, Guillermo
    Rebecq, Henri
    Scaramuzza, Davide
    IEEE TRANSACTIONS ON ROBOTICS, 2018, 34 (06) : 1425 - 1440