A Tightly Coupled Visual-Inertial GNSS State Estimator Based on Point-Line Feature

Times Cited: 4
Authors
Dong, Bo [1 ]
Zhang, Kai [1 ,2 ]
Affiliations
[1] Tsinghua Univ, Tsinghua Shenzhen Int Grad Sch, Shenzhen 518055, Peoples R China
[2] Res Inst Tsinghua, Guangzhou 510530, Guangdong, Peoples R China
Keywords
GNSS-VIO; line feature; carrier phase smoothed pseudorange; parameter calibration; observability; VINS; VERSATILE; SLAM;
DOI
10.3390/s22093391
Chinese Library Classification
O65 [Analytical Chemistry];
Discipline Classification Codes
070302; 081704;
Abstract
Visual-inertial odometry (VIO) is known to suffer from drift and can only provide local coordinates. In this paper, we propose a tightly coupled GNSS-VIO system based on point-line features for robust and drift-free state estimation. Point-feature-based methods are not robust in challenging areas such as those with weak or repetitive textures. To address this problem, we additionally extract line features, which carry more environmental structure information. Furthermore, to eliminate the accumulated drift of VIO, we tightly fuse GNSS measurements with visual and inertial information. GNSS pseudorange measurements are real-time and unambiguous but suffer from large errors, whereas GNSS carrier phase measurements can achieve centimeter-level positioning accuracy but require a complex and time-consuming whole-cycle ambiguity resolution that degrades the real-time performance of a state estimator. To combine the advantages of both measurements, we use the carrier phase smoothed pseudorange instead of the raw pseudorange for state estimation. In addition, the presence of both a GNSS receiver and an IMU makes extrinsic parameter calibration crucial; our proposed system calibrates the extrinsic translation parameter between the GNSS receiver and the IMU in real time. Finally, we show that the states represented in the ECEF frame are fully observable and that the tightly coupled GNSS-VIO state estimator is consistent. Experiments on public datasets demonstrate that our system improves positioning precision while remaining robust and real-time.
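The carrier phase smoothing mentioned in the abstract is commonly realized with a Hatch-style recursive filter: the precise epoch-to-epoch carrier phase delta propagates the previous smoothed value, which is then blended with the noisy but unambiguous pseudorange. The sketch below is an illustrative implementation under that assumption (the paper's exact formulation may differ); the function name, variable names, and the `window` cap are hypothetical.

```python
def hatch_filter(pseudoranges, carrier_phases, window=100):
    """Carrier-phase smoothing of pseudoranges (Hatch-filter sketch).

    pseudoranges, carrier_phases: per-epoch measurements in meters
    window: maximum averaging window, capping the filter's memory so
            slowly diverging errors (e.g. ionospheric) stay bounded
    """
    smoothed = [pseudoranges[0]]  # initialize with the raw pseudorange
    for k in range(1, len(pseudoranges)):
        n = min(k + 1, window)
        # Propagate the previous smoothed value with the precise
        # carrier-phase delta (the ambiguity cancels in the difference),
        # then blend in the current noisy but unambiguous pseudorange.
        propagated = smoothed[-1] + (carrier_phases[k] - carrier_phases[k - 1])
        smoothed.append(pseudoranges[k] / n + propagated * (n - 1) / n)
    return smoothed
```

In practice the filter must be reset on cycle slips, since a slip corrupts the carrier-phase delta used for propagation.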
Pages: 24
Related Papers
50 records
  • [1] A Visual-Inertial Navigation Coupled Localization Method Based on Adaptive Point-Line Feature Extraction
    He, Ziqi
    Li, Gang
    IEEE SENSORS JOURNAL, 2023, 23 (20) : 25096 - 25104
  • [2] Point-Line Visual-Inertial Odometry With Optimized Line Feature Processing
    Si, Hanqian
    Yu, Huai
    Chen, Kuangyi
    Yang, Wen
    IEEE TRANSACTIONS ON INSTRUMENTATION AND MEASUREMENT, 2024, 73
  • [3] EM-LSD-Based Visual-Inertial Odometry With Point-Line Feature
    Hu, Chunhe
    Zhang, Xu
    Li, Kai
    Wu, Kun
    Dong, Ruifang
    IEEE SENSORS JOURNAL, 2023, 23 (24) : 30794 - 30804
  • [4] Real-Time Visual-Inertial Odometry Based on Point-Line Feature Fusion
    Yang G.
    Meng W.D.
    Hou G.D.
    Feng N.N.
    Gyroscopy and Navigation, 2023, 14 (4) : 339 - 352
  • [5] A Visual-Inertial Localization with Point-Line Joint Constraint
    Wei, Hongyu
    Zhang, Tao
    2022 7TH INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION ENGINEERING, ICRAE, 2022, : 65 - 69
  • [6] Mobile robot localization method based on point-line feature visual-inertial SLAM algorithm
    Xu, Jintao
    Fang, Yu
    Gao, Weiwei
    Liu, Xintian
    Shi, Juanjuan
    Yang, Hao
    INDUSTRIAL ROBOT-THE INTERNATIONAL JOURNAL OF ROBOTICS RESEARCH AND APPLICATION, 2024,
  • [7] PLI-VINS: Visual-Inertial SLAM Based on Point-Line Feature Fusion in Indoor Environment
    Zhao, Zhangzhen
    Song, Tao
    Xing, Bin
    Lei, Yu
    Wang, Ziqin
    SENSORS, 2022, 22 (14)
  • [8] A Fast Point-Line Visual-Inertial Odometry with Structural Regularity
    Liu, Xuefeng
    Wang, Huimin
    Yang, Shijie
    2023 IEEE 2ND INDUSTRIAL ELECTRONICS SOCIETY ANNUAL ON-LINE CONFERENCE, ONCON, 2023,
  • [9] Tightly-coupled GNSS-aided Visual-Inertial Localization
    Lee, Woosik
    Geneva, Patrick
    Yang, Yulin
    Huang, Guoquan
    2022 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION, ICRA 2022, 2022, : 9484 - 9491
  • [10] Visual-inertial fusion positioning and mapping method based on point-line features
    Feng, Qinghua
    INTERNATIONAL JOURNAL OF COMPUTER APPLICATIONS IN TECHNOLOGY, 2022, 70 (02) : 113 - 119