Sample Pose Augmentation and Adaptive Weight-Based Refinement for 3-D LiDAR-Camera Extrinsic Calibration Using an Orthogonal Trihedron

Cited by: 2
Authors
Choi, Yeongyu [1 ]
Park, Ju H. [2 ]
Jung, Ho-Youl [1 ]
Affiliations
[1] Yeungnam Univ, Dept Informat & Commun Engn, Gyongsan 38541, South Korea
[2] Yeungnam Univ, Dept Elect Engn, Gyongsan 38541, South Korea
Funding
National Research Foundation of Singapore
Keywords
Extrinsic calibration; M-estimator sample consensus (MSAC); random sample consensus (RANSAC); sensor fusion; signal processing algorithms; ON-ORBIT CALIBRATION; STAR SENSOR CALIBRATION;
DOI
10.1109/TIM.2023.3336440
CLC Classification
TM [Electrical technology]; TN [Electronic and communication technology]
Discipline Codes
0808; 0809
Abstract
Light detection and ranging (LiDAR) and cameras are core sensors used in autonomous vehicles and industrial robots. LiDAR-camera fusion systems require an accurate estimate of the relative pose to integrate data from the two sensors. We propose an offline method for 3-D LiDAR-camera extrinsic calibration using an orthogonal trihedron with checkered patterns on each plane. Our approach to LiDAR pose estimation consists of four steps: background rejection, perpendicularity enforcement, dominant pose decision, and refinement. In the iterations of the first and second steps, several poses are sampled. The sampled poses are evaluated and augmented, and the highest-scoring sample is selected as the dominant pose. For the refinement, a new loss function with adaptive weights is introduced, formulated as the minimization of the sum of squared distances between points and the nearest plane on the target. The relative pose is estimated by solving the perspective-n-point (PnP) problem. Experimental results from simulations under various noise scenarios show that the proposed method estimates relative poses with higher accuracy and stability than existing methods, in terms of the mean and standard deviation of the errors. The source code is available at https://github.com/ygchoi11/3DLiDAR-Camera_Calibration.
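The refinement objective described in the abstract, the sum of squared distances between LiDAR points and the nearest target plane, can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: planes are given as unit-normal coefficient rows `[a, b, c, d]`, and the paper's adaptive weighting scheme is replaced here by a uniform placeholder.

```python
import numpy as np

def point_to_plane_loss(points, planes, weights=None):
    """Weighted sum of squared distances from points to their nearest plane.

    points  : (N, 3) LiDAR points on the trihedron target.
    planes  : (M, 4) plane coefficients [a, b, c, d] with unit normals,
              so the signed distance of point p is a*x + b*y + c*z + d.
    weights : optional (N,) per-point weights; uniform by default
              (a placeholder for the paper's adaptive weights).
    """
    points = np.asarray(points, dtype=float)
    planes = np.asarray(planes, dtype=float)
    if weights is None:
        weights = np.ones(len(points))
    # Signed distance of every point to every plane: shape (N, M).
    dists = points @ planes[:, :3].T + planes[:, 3]
    # Assign each point to its nearest plane and keep the squared distance.
    nearest_sq = np.min(dists**2, axis=1)
    return float(np.sum(weights * nearest_sq))
```

For an orthogonal trihedron aligned with the coordinate axes, the three planes are simply x = 0, y = 0, and z = 0; minimizing this loss over a rigid transform of the points would yield the refined LiDAR pose.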
Pages: 1-14