26 references in total
- [1] BAILEY T, DURRANT-WHYTE H., Simultaneous localization and mapping (SLAM): Part II, IEEE Robotics & Automation Magazine, 13, 3, pp. 108-117, (2006)
- [2] ZHANG J, SINGH S., Low-drift and real-time LiDAR odometry and mapping, Autonomous Robots, 41, 2, pp. 401-416, (2017)
- [3] SHAN T X, ENGLOT B., LeGO-LOAM: Lightweight and ground-optimized lidar odometry and mapping on variable terrain, 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), pp. 4758-4765, (2019)
- [4] MUR-ARTAL R, MONTIEL J M M, TARDOS J D., ORB-SLAM: A versatile and accurate monocular SLAM system, IEEE Transactions on Robotics, 31, 5, pp. 1147-1163, (2015)
- [5] MUR-ARTAL R, TARDOS J D., ORB-SLAM2: An open-source SLAM system for monocular, stereo, and RGB-D cameras, IEEE Transactions on Robotics, 33, 5, pp. 1255-1262, (2017)
- [6] TUBMAN R, POTGIETER J, ARIF K M., Efficient robotic SLAM by fusion of RatSLAM and RGBD-SLAM, 2016 23rd International Conference on Mechatronics and Machine Vision in Practice (M2VIP), pp. 1-6, (2017)
- [7] LIU Z X, XIE C X, XIE M, et al., Mobile robot positioning method based on multi-sensor information fusion laser SLAM, Cluster Computing, 22, 2, pp. 5055-5061, (2019)
- [8] LIN J R, ZHENG C R, XU W, et al., R<sup>2</sup>LIVE: A robust, real-time, LiDAR-inertial-visual tightly-coupled state estimator and mapping, IEEE Robotics and Automation Letters, 6, 4, pp. 7469-7476, (2021)
- [9] LIN J R, ZHANG F., R<sup>3</sup>LIVE: A robust, real-time, RGB-colored, LiDAR-inertial-visual tightly-coupled state estimation and mapping package, 2022 International Conference on Robotics and Automation (ICRA), pp. 10672-10678, (2022)
- [10] SHAN T X, ENGLOT B, MEYERS D, et al., LIO-SAM: Tightly-coupled lidar inertial odometry via smoothing and mapping, 2020 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), pp. 5135-5142, (2021)