INF: Implicit Neural Fusion for LiDAR and Camera

Cited by: 3
Authors
Zhou, Shuyi [1 ,2 ]
Xie, Shuxiang [1 ,2 ]
Ishikawa, Ryoichi [1 ]
Sakurada, Ken [2 ]
Onishi, Masaki [2 ]
Oishi, Takeshi [1 ]
Affiliations
[1] Univ Tokyo, Inst Ind Sci, Tokyo, Japan
[2] Natl Inst Adv Ind Sci & Technol AIST Tokyo, Tokyo, Japan
Source
2023 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS (IROS) | 2023
Keywords
CALIBRATION
DOI
10.1109/IROS55552.2023.10341648
Chinese Library Classification: TP18 [Artificial Intelligence Theory]
Discipline codes: 081104; 0812; 0835; 1405
Abstract
Sensor fusion has become a popular topic in robotics. However, conventional fusion methods face many difficulties, such as differences in data representation, sensor variations, and extrinsic calibration. For example, the calibration methods used for LiDAR-camera fusion often require manual operation and auxiliary calibration targets. Implicit neural representations (INRs) have been developed for 3D scenes, and the volume density distribution in an INR unifies the scene information obtained by different types of sensors. We therefore propose implicit neural fusion (INF) for LiDAR and camera. INF first trains a neural density field of the target scene using LiDAR frames; a separate neural color field is then trained using camera images and the trained density field. During training, INF jointly estimates LiDAR poses and optimizes the extrinsic parameters. Our experiments demonstrate the high accuracy and stable performance of the proposed method.
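The abstract's central idea is that a single volume density field can explain both sensors: LiDAR supervises the depth a ray renders to, while the camera supervises the color the same ray renders to. As a toy illustration (a sketch only, not the paper's implementation; the sample positions, densities, and colors below are invented), the shared volume-rendering weights can be computed like this:

```python
import numpy as np

def render_weights(sigmas, deltas):
    """Standard volume-rendering weights w_i = T_i * (1 - exp(-sigma_i * delta_i))."""
    # alpha_i: opacity contributed by sample i over its interval delta_i
    alphas = 1.0 - np.exp(-sigmas * deltas)
    # T_i: transmittance, the probability the ray survives to sample i
    trans = np.cumprod(np.concatenate(([1.0], 1.0 - alphas[:-1])))
    return trans * alphas

# One toy ray with a dense "surface" at the third sample (all values invented).
t = np.array([1.0, 2.0, 3.0, 4.0])        # sample depths along the ray
deltas = np.array([1.0, 1.0, 1.0, 1.0])   # spacing between samples
sigmas = np.array([0.0, 0.0, 50.0, 0.0])  # density spike at t = 3
colors = np.array([0.1, 0.2, 0.9, 0.3])   # per-sample radiance (grayscale)

w = render_weights(sigmas, deltas)
depth = np.sum(w * t)       # a LiDAR return would supervise this expected depth
color = np.sum(w * colors)  # a camera pixel would supervise this expected color
```

Because both the rendered depth and the rendered color are differentiable in the same weights, gradients from LiDAR and camera losses can flow into one density field, and into pose/extrinsic parameters, which is what makes the joint optimization described in the abstract possible.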
Pages: 10918-10925
Page count: 8
Related papers (50 total)
  • [21] G-Fusion: LiDAR and Camera Feature Fusion on the Ground Voxel Space
    Cheng, Shuai; Ning, Zuotao; Hu, Jun; Liu, Jiaxin; Yang, Wenxing; Wang, Luyang; Yu, Hongfei; Liu, Wei
    IEEE ACCESS, 2024, 12: 4127-4138
  • [22] Implicit camera calibration by using resilient neural networks
    Civicioglu, Pinar; Besdok, Erkan
    NEURAL INFORMATION PROCESSING, PT 2, PROCEEDINGS, 2006, 4233: 632-640
  • [23] Implicit camera calibration using an artificial neural network
    Woo, Dong-Min; Park, Dong-Chul
    NEURAL INFORMATION PROCESSING, PT 2, PROCEEDINGS, 2006, 4233: 641-650
  • [24] Camera-Lidar sensor fusion for drivable area detection in winter weather using convolutional neural networks
    Rawashdeh, Nathir A.; Bos, Jeremy P.; Abu-Alrub, Nader J.
    OPTICAL ENGINEERING, 2023, 62 (03)
  • [25] SLAM Mapping of Information Fusion between Lidar and Depth Camera
    Wang, Zhihao; Hao, Weidong; Huang, Yiren; Wu, Hao
    2022 INTERNATIONAL CONFERENCE ON IMAGE PROCESSING, COMPUTER VISION AND MACHINE LEARNING (ICICML), 2022: 142-145
  • [26] Target Localization and Tracking Method Based on Camera and LiDAR Fusion
    Zhang Pu; Liu Jinqing; Xiao Jinchao; Xiong Junfeng; Feng Tianwei; Wang Zhongze
    LASER & OPTOELECTRONICS PROGRESS, 2024, 61 (08)
  • [27] Object Tracking Based on the Fusion of Roadside LiDAR and Camera Data
    Wang, Shujian; Pi, Rendong; Li, Jian; Guo, Xinming; Lu, Youfu; Li, Tao; Tian, Yuan
    IEEE TRANSACTIONS ON INSTRUMENTATION AND MEASUREMENT, 2022, 71
  • [28] LiDAR-Camera Fusion for Depth Enhanced Unsupervised Odometry
    Fetic, Naida; Aydemir, Eren; Unel, Mustafa
    2022 IEEE 95TH VEHICULAR TECHNOLOGY CONFERENCE (VTC2022-SPRING), 2022
  • [29] Camera and LIDAR fusion for mapping of actively illuminated subterranean voids
    Wong, U.; Garney, B.; Whittaker, W.; Whittaker, R.
    SPRINGER TRACTS IN ADVANCED ROBOTICS, 2010, 62: 421-430
  • [30] Raw fusion of camera and sparse LiDAR for detecting distant objects
    Rovid, Andras; Remeli, Viktor; Szalay, Zsolt
    AT-AUTOMATISIERUNGSTECHNIK, 2020, 68 (05): 337-346