LIGHTWEIGHT AND FAST MATCHING METHOD FOR LIDAR-INERTIAL ODOMETRY AND MAPPING

Cited: 0
Authors
Li, Chuanjiang [1 ]
Hu, Ziwei [1 ]
Zhu, Yanfei [1 ]
Ji, Xingzhao [1 ]
Zhang, Chongming [1 ]
Qi, Ziming [2 ]
Affiliations
[1] Shanghai Normal Univ, Coll Informat Mech & Elect Engn, Shanghai, Peoples R China
[2] Manukau Inst Technol, New Zealand Maritime Sch, Auckland, New Zealand
Keywords
SLAM; location; keyframe; mapping
DOI
10.2316/J.2024.206-0880
Chinese Library Classification (CLC)
TP [Automation Technology; Computer Technology]
Discipline Classification Code
0812
Abstract
This paper presents a lightweight and fast Lidar-inertial odometry (LF-LIO) system for real-time pose estimation of a robot in an unknown, complex environment. The system comprises prediction, odometry, mapping, and trajectory-optimisation modules. In the prediction module, an initial estimate of the odometry motion is computed from inertial measurement unit (IMU) pre-integration and the state at the previous moment. The odometry module then employs a scan-to-submap matching method, based on the ground segmentation and optimisation proposed in this paper, to estimate the pose transformation between consecutive frames. To ensure real-time performance, a keyframe map is built instead of a full map, which improves the efficiency of incremental map updates; in addition, an efficient dynamic sliding window is proposed to manage submaps. We compare LF-LIO with three methods on the KITTI datasets: Lidar odometry and mapping in real time (LOAM), lightweight and ground-optimised Lidar odometry and mapping on variable terrain (LeGO-LOAM), and fast direct Lidar-inertial odometry (Fast-LIO2). The results indicate that the proposed LF-LIO achieves better accuracy with a reduced computational burden.
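As a rough illustration of the dynamic sliding-window submap management described in the abstract, the following is a minimal sketch. The class and parameter names (`SlidingWindowSubmap`, `window_size`) are hypothetical, not from the paper; the paper's actual window policy may differ.

```python
from collections import deque


class SlidingWindowSubmap:
    """Hypothetical sketch of a keyframe submap held in a sliding window.

    Only the most recent `window_size` keyframes are retained, so
    scan-to-submap matching runs against a bounded, locally consistent
    point set rather than the full map.
    """

    def __init__(self, window_size=10):
        # deque with maxlen evicts the oldest keyframe automatically
        self.window = deque(maxlen=window_size)

    def add_keyframe(self, pose, points):
        # Insert a new keyframe; if the window is full, the oldest
        # keyframe is dropped, keeping the submap lightweight.
        self.window.append((pose, points))

    def submap_points(self):
        # Concatenate the points of all keyframes in the window;
        # this set is the target for scan-to-submap matching.
        return [p for _, pts in self.window for p in pts]


# Usage: insert 12 keyframes into a window of 10; the first 2 are evicted.
sm = SlidingWindowSubmap(window_size=10)
for i in range(12):
    sm.add_keyframe(pose=i, points=[i])
print(len(sm.window))      # 10 keyframes retained
print(sm.submap_points())  # points of keyframes 2..11
```

The fixed-size window bounds both memory and matching cost, which is consistent with the abstract's claim of reduced computational burden compared with matching against a full map.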
Pages: 338-348
Page count: 11