Analysis on Pose Estimation Uncertainty and Observation Parametrization for RGB-D Cameras

Cited by: 0
Authors
Ma X. [1]
Liang X. [1]
Affiliations
[1] School of Aeronautics and Astronautics, Shanghai Jiao Tong University, Shanghai
Source
Jiqiren/Robot | 2021, Vol. 43, No. 1
Keywords
Observation parametrization; Pose estimation uncertainty; RGB-D; Simultaneous localization and mapping (SLAM);
DOI
10.13973/j.cnki.robot.200026
Abstract
For pose estimation with RGB-D cameras, the pose estimation uncertainties of the ICP (iterative closest point) and PnP (perspective-n-point) algorithms are analyzed according to maximum likelihood estimation and the covariance propagation law, and the differences between the two algorithms in practical applications are examined from the perspective of data association. In addition, the influence of different parametrization forms of feature observations on the localization accuracy of visual odometry is compared, and a parametrization method is proposed. Based on the differences and connections between the ICP and PnP algorithms, a strategy for switching between the two algorithms in an RGB-D SLAM (simultaneous localization and mapping) system is proposed, and each error term in the optimization problem is weighted according to the uncertainty of its observation. Experiments on two public datasets show that, compared with mainstream RGB-D SLAM algorithms, the proposed algorithm achieves higher localization accuracy and robustness in a variety of scenes with low texture, fast camera motion, or dynamic objects. Meanwhile, the time efficiency of the proposed algorithm is about 10% higher than that of ORB-SLAM2. © 2021, Science Press. All rights reserved.
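As an illustration of the uncertainty-weighted optimization described in the abstract, the following Python sketch (not the paper's implementation; the pinhole intrinsics and noise levels are hypothetical) back-projects a pixel-and-depth measurement into a 3D point, propagates the measurement covariance with the covariance propagation law Sigma_p = J * Sigma_m * J^T, and uses the inverse of the propagated covariance to weight a point-to-point error term.

import numpy as np

# Assumed pinhole intrinsics and noise levels (hypothetical values).
fx, fy, cx, cy = 525.0, 525.0, 319.5, 239.5
sigma_uv = 1.0    # assumed pixel noise std. dev. (pixels)
sigma_z = 0.01    # assumed depth noise std. dev. (meters)

def backproject_with_cov(u, v, z):
    """Back-project a (u, v, z) measurement and propagate its covariance."""
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    p = np.array([x, y, z])
    # Jacobian of the back-projected point (x, y, z) w.r.t. the measurement (u, v, z)
    J = np.array([
        [z / fx, 0.0,    (u - cx) / fx],
        [0.0,    z / fy, (v - cy) / fy],
        [0.0,    0.0,    1.0],
    ])
    Sigma_m = np.diag([sigma_uv**2, sigma_uv**2, sigma_z**2])  # measurement covariance
    Sigma_p = J @ Sigma_m @ J.T                                # covariance propagation law
    return p, Sigma_p

def weighted_point_error(p_obs, p_pred, Sigma_p):
    """Mahalanobis (inverse-covariance weighted) point-to-point error."""
    r = p_obs - p_pred
    W = np.linalg.inv(Sigma_p)   # information matrix used as the weight
    return float(r @ W @ r)

if __name__ == "__main__":
    p, Sigma = backproject_with_cov(400.0, 300.0, 2.0)
    print("back-projected point:", p)
    print("propagated covariance:\n", Sigma)
    print("weighted error:", weighted_point_error(p + np.array([0.01, 0.0, 0.02]), p, Sigma))

In a full SLAM back end, such information matrices would serve as the per-term weights of the ICP and reprojection residuals during nonlinear least-squares optimization, so that more uncertain observations contribute less to the pose estimate.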
Pages: 54-65, 73
References (26 in total)
  • [1] Klein G, Murray D., Parallel tracking and mapping for small AR workspaces, IEEE and ACM International Symposium on Mixed and Augmented Reality, pp. 225-234, (2007)
  • [2] Mur-Artal R, Montiel J M M, Tardos J D., ORB-SLAM: A versatile and accurate monocular SLAM system, IEEE Transactions on Robotics, 31, 5, pp. 1147-1163, (2015)
  • [3] Engel J, Schops T, Cremers D., LSD-SLAM: Large-scale direct monocular SLAM, Lecture Notes in Computer Science, 8690, pp. 834-849, (2014)
  • [4] Engel J, Koltun V, Cremers D., Direct sparse odometry, IEEE Transactions on Pattern Analysis and Machine Intelligence, 40, 3, pp. 611-625, (2017)
  • [5] Kerl C, Sturm J, Cremers D., Robust odometry estimation for RGB-D cameras, IEEE International Conference on Robotics and Automation, pp. 3748-3754, (2013)
  • [6] Kerl C, Sturm J, Cremers D., Dense visual SLAM for RGB-D cameras, IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 2100-2106, (2013)
  • [7] Newcombe R A, Izadi S, Hilliges O, et al., KinectFusion: Real-time dense surface mapping and tracking, 10th IEEE International Symposium on Mixed and Augmented Reality, pp. 127-136, (2011)
  • [8] Endres F, Hess J, Sturm J, et al., 3-D mapping with an RGB-D camera, IEEE Transactions on Robotics, 30, 1, pp. 177-187, (2014)
  • [9] Whelan T, Salas-Moreno R F, Glocker B, et al., ElasticFusion: Real-time dense SLAM and light source estimation, International Journal of Robotics Research, 35, 14, pp. 1697-1716, (2016)
  • [10] Liu H M, Li C, Chen G J, et al., Robust keyframe-based dense SLAM with an RGB-D camera