A Fusion SLAM Method for LiDAR, Vision, and IMU Based on Factor Graph Elimination Optimization

Citations: 0
Authors
Yuan G.-S. [1 ]
Qi Y.-S. [1 ,2 ,3 ]
Liu L.-Q. [1 ,2 ,3 ]
Su J.-Q. [1 ,2 ]
Zhang L.-J. [1 ,2 ,3 ]
Affiliations
[1] School of Electric Power, Inner Mongolia University of Technology, Hohhot, Inner Mongolia
[2] Center for Intelligent Energy Technology and Equipment Engineering, Inner Mongolia University, Hohhot, Inner Mongolia
[3] Engineering Research Center of Large Energy Storage Technology, Ministry of Education, Hohhot, Inner Mongolia
Funding
National Natural Science Foundation of China
Keywords
complex scene; factor graph optimization; IMU odometry; LiDAR; multi-sensor fusion; simultaneous localization and mapping
DOI
10.12263/DZXB.20230209
Abstract
Addressing the limitations of single-sensor SLAM (Simultaneous Localization And Mapping) techniques, namely degraded perception and poor reliability in complex environments, this paper proposes a multi-factor graph fusion SLAM algorithm with the IMU as the dominant system (ID-MFG-SLAM). First, a multi-factor graph model is constructed with the IMU (Inertial Measurement Unit) as the primary system and the visual and LiDAR sensors as secondary systems. This novel structure incorporates observation factors from the secondary systems to constrain IMU biases and integrates IMU odometry factors for motion prediction and fusion. To reduce the optimization cost after fusion, a sliding-window mechanism is introduced to backtrack historical state information. Additionally, a QR-decomposition elimination method based on the Householder transformation is employed to convert the factor graph into a Bayesian network, simplifying the graph's structure and improving computational efficiency. Furthermore, an adaptive interpolation algorithm that switches between quaternion spherical linear interpolation and linear interpolation is introduced; it projects LiDAR point clouds onto a unit sphere, enabling depth estimation of visual feature points. Experimental results show that, compared with other classical algorithms, the proposed method achieves absolute trajectory errors of about 0.68 m and 0.24 m in complex large and small scenes, respectively, with higher accuracy and reliability. © 2023 Chinese Institute of Electronics. All rights reserved.
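The interpolation and depth-association step in the abstract can be illustrated with a short sketch. The following Python fragment is a hedged illustration, not the authors' code: the function names adaptive_quat_interp and depth_from_unit_sphere are hypothetical, the small-angle threshold eps is an assumed criterion for switching from SLERP to LERP (the abstract does not state the exact rule), and the inverse-angle weighting over k angular neighbours is an assumed association scheme.

```python
# Sketch (assumptions as stated above): adaptive SLERP/LERP between unit
# quaternions, plus unit-sphere projection of LiDAR points to estimate the
# depth of a visual feature along its bearing direction.
import numpy as np

def adaptive_quat_interp(q0, q1, t, eps=1e-3):
    """Interpolate unit quaternions q0 -> q1 at fraction t in [0, 1]."""
    q0 = q0 / np.linalg.norm(q0)
    q1 = q1 / np.linalg.norm(q1)
    dot = np.dot(q0, q1)
    if dot < 0.0:                      # take the shorter arc
        q1, dot = -q1, -dot
    if dot > 1.0 - eps:                # nearly parallel: LERP is cheap and stable
        q = (1.0 - t) * q0 + t * q1
    else:                              # otherwise spherical linear interpolation
        theta = np.arccos(np.clip(dot, -1.0, 1.0))
        q = (np.sin((1.0 - t) * theta) * q0 + np.sin(t * theta) * q1) / np.sin(theta)
    return q / np.linalg.norm(q)

def depth_from_unit_sphere(feature_dir, lidar_points, k=3):
    """Estimate a feature's depth from the k angularly nearest LiDAR points
    after projecting both the feature bearing and the points onto the unit sphere."""
    feature_dir = feature_dir / np.linalg.norm(feature_dir)
    ranges = np.linalg.norm(lidar_points, axis=1)
    dirs = lidar_points / ranges[:, None]           # unit-sphere projection
    ang = np.arccos(np.clip(dirs @ feature_dir, -1.0, 1.0))
    nearest = np.argsort(ang)[:k]
    w = 1.0 / (ang[nearest] + 1e-6)                 # inverse-angle weights
    return float(np.sum(w * ranges[nearest]) / np.sum(w))
```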
Pages: 3042-3052
Number of pages: 10
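The elimination step named in the abstract can likewise be sketched. Below is a minimal NumPy illustration, under stated assumptions rather than the paper's implementation: after linearization, the factor graph's stacked whitened Jacobian A and residual b are reduced by Householder-based QR (np.linalg.qr calls LAPACK routines built on Householder reflections) to an upper-triangular R, the square-root information form whose rows correspond to the conditionals of the resulting Bayesian network; back-substitution then recovers the state update. The helper eliminate_and_solve and the two-state toy factor graph are illustrative inventions.

```python
# Sketch: Householder-QR elimination of a linearized factor graph.
import numpy as np

def eliminate_and_solve(A, b):
    """Solve min ||A x - b||^2 via Householder QR."""
    Q, R = np.linalg.qr(A)        # R is the upper-triangular (Bayes-net) form
    d = Q.T @ b                   # rotate the residual into the R basis
    return np.linalg.solve(R, d)  # back-substitution on the triangular system

# Toy factor graph over states [x0, x1]: a prior on x0, an odometry-like
# factor x1 - x0, and a direct observation of x1.
A = np.array([[ 1.0, 0.0],
              [-1.0, 1.0],
              [ 0.0, 1.0]])
b = np.array([0.0, 1.0, 1.1])
print(eliminate_and_solve(A, b))  # least-squares estimate of [x0, x1]
```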