Generation and Validation of Virtual Point Cloud Data for Automated Driving Systems

Cited by: 0
Authors:
Hanke, Timo [1 ,2 ]
Schaermann, Alexander [1 ,2 ]
Geiger, Matthias [1 ]
Weiler, Konstantin [1 ]
Hirsenkorn, Nils [2 ]
Rauch, Andreas [1 ]
Schneider, Stefan-Alexander [3 ]
Biebl, Erwin [2 ]
Affiliations:
[1] BMW AG, D-80788 Munich, Germany
[2] Tech Univ Munich, Associate Professorship Microwave Engn, Arcisstr 21, D-80333 Munich, Germany
[3] Kempten Univ Appl Sci, Fac Elect Engn, Bahnhofstr 61, D-87435 Kempten, Germany
Keywords: (none listed)
DOI: (not available)
Chinese Library Classification (CLC): TP18 [Theory of artificial intelligence]
Subject classification codes: 081104; 0812; 0835; 1405
Abstract:
The performance of an automated driving system is crucially affected by its environmental perception. The vehicle's perception of its environment provides the foundation for the automated responses computed by the system's logic algorithms. As perception relies on the vehicle's sensors, simulating sensor behavior in a virtual world constitutes virtual environmental perception; this is the task performed by sensor models. In this work, we introduce a real-time capable model of the measurement process of an automotive lidar sensor that employs a ray tracing approach. The output of the model is point cloud data based on the geometry and material properties of the virtual scene. With this low-level sensor data as input, a vehicle-internal representation of the environment is constructed by means of an occupancy grid mapping algorithm. Using a virtual environment constructed from high-fidelity measurements of a real-world scenario, we establish a direct link between real- and virtual-world sensor data. By directly comparing the resulting sensor output and environment representations of both cases, we quantitatively assess the validity and fidelity of the proposed sensor measurement model.
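The abstract describes a ray tracing based lidar measurement model that produces point cloud data from the geometry of a virtual scene. The paper itself provides no code; the following Python sketch only illustrates the underlying principle under strong simplifications (brute-force ray casting against a triangle list, no material properties, no noise model). Names such as ray_triangle_intersect and scan_lidar are hypothetical and not taken from the paper.

```python
# Simplified, illustrative lidar ray-casting sketch (not the authors' model):
# one ray per scan direction, nearest triangle hit becomes a point cloud point.
import numpy as np

def ray_triangle_intersect(origin, direction, v0, v1, v2, eps=1e-9):
    """Moeller-Trumbore ray/triangle test; returns hit distance t or None."""
    e1, e2 = v1 - v0, v2 - v0
    p = np.cross(direction, e2)
    det = np.dot(e1, p)
    if abs(det) < eps:                      # ray parallel to triangle plane
        return None
    inv_det = 1.0 / det
    s = origin - v0
    u = np.dot(s, p) * inv_det
    if u < 0.0 or u > 1.0:
        return None
    q = np.cross(s, e1)
    v = np.dot(direction, q) * inv_det
    if v < 0.0 or u + v > 1.0:
        return None
    t = np.dot(e2, q) * inv_det
    return t if t > eps else None

def scan_lidar(origin, triangles, azimuths_deg, elevations_deg, max_range=120.0):
    """Cast one ray per (azimuth, elevation) pair and keep the nearest hit point."""
    points = []
    for el in np.deg2rad(elevations_deg):
        for az in np.deg2rad(azimuths_deg):
            d = np.array([np.cos(el) * np.cos(az),
                          np.cos(el) * np.sin(az),
                          np.sin(el)])
            t_min = max_range
            for v0, v1, v2 in triangles:    # brute force; real models use BVH/GPU
                t = ray_triangle_intersect(origin, d, v0, v1, v2)
                if t is not None and t < t_min:
                    t_min = t
            if t_min < max_range:
                points.append(origin + t_min * d)
    return np.asarray(points)

# Usage example with a single hypothetical ground triangle:
tri = (np.array([5.0, -5.0, 0.0]), np.array([5.0, 5.0, 0.0]), np.array([30.0, 0.0, 0.0]))
cloud = scan_lidar(np.array([0.0, 0.0, 1.7]), [tri],
                   azimuths_deg=np.arange(-30, 30, 1.0),
                   elevations_deg=[-5.0, -2.5, 0.0])
```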
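The second processing step mentioned in the abstract, building a vehicle-internal environment representation via occupancy grid mapping and comparing real against virtual results, could look roughly like the following 2-D log-odds sketch. This is an assumed, simplified illustration rather than the algorithm or validation metric used in the paper; point_cloud_to_grid and grid_agreement are made-up names.

```python
# Illustrative 2-D occupancy grid mapping from a point cloud (log-odds update)
# plus a simple cell-wise agreement score between two grids; a sketch only.
import numpy as np

def point_cloud_to_grid(points_xy, sensor_xy, size=100, resolution=0.5,
                        l_occ=0.85, l_free=-0.4):
    """Endpoints raise occupancy; cells along each ray are lowered (coarse traversal)."""
    grid = np.zeros((size, size))                   # log-odds, 0 = unknown
    origin_cell = np.array([size // 2, size // 2])  # sensor at grid center
    for p in points_xy:
        end_cell = origin_cell + np.round((p - sensor_xy) / resolution).astype(int)
        n = int(np.linalg.norm(end_cell - origin_cell)) + 1
        for s in np.linspace(0.0, 1.0, n, endpoint=False):
            c = np.round(origin_cell + s * (end_cell - origin_cell)).astype(int)
            if 0 <= c[0] < size and 0 <= c[1] < size:
                grid[c[0], c[1]] += l_free          # free space along the beam
        if 0 <= end_cell[0] < size and 0 <= end_cell[1] < size:
            grid[end_cell[0], end_cell[1]] += l_occ # measured reflection point
    return grid

def grid_agreement(grid_real, grid_virtual, threshold=0.0):
    """Fraction of cells classified identically (occupied vs. not occupied)."""
    return np.mean((grid_real > threshold) == (grid_virtual > threshold))
```

A comparison along these lines, applied to grids built from the real and the virtual point clouds of the same scenario, would yield a single scalar agreement value per scan; the paper's actual evaluation may differ.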
Pages: 6
Related Papers (50 total)
  • [31] AI-Based Point Cloud Upsampling for Autonomous Driving Systems
    Salomon, Nicolas
    Delrieux, Claudio A.
    Borgnino, Leandro E.
    Morero, Damian A.
    2024 L LATIN AMERICAN COMPUTER CONFERENCE, CLEI 2024, 2024,
  • [32] A Location Cloud for Highly Automated Driving
    Redzic, Ogi
    Rabel, Dietmar
    Road Vehicle Automation 2, 2015, : 49 - 60
  • [33] Point Cloud Information Modeling: Deep Learning Based Automated Information Modeling Framework for Point Cloud Data
    Park, Jisoo
    Cho, Yong K.
    JOURNAL OF CONSTRUCTION ENGINEERING AND MANAGEMENT, 2022, 148 (02)
  • [34] AUTOMATED GENERATION OF VIRTUAL ROADWAYS BASED ON GEOGRAPHIC INFORMATION SYSTEMS
    Kreft, Sven
    Gausemeier, Juergen
    Grafe, Michael
    Hassan, Bassem
    PROCEEDINGS OF THE ASME INTERNATIONAL DESIGN ENGINEERING TECHNICAL CONFERENCES AND COMPUTERS AND INFORMATION IN ENGINEERING CONFERENCE, 2011, VOL 2, PTS A AND B, 2012, : 1525 - 1531
  • [35] Virtual Test of Automated Driving Functions
    Butz, Torsten
    Paleduhn, Simon
    Merkel, Alexander
    Bohner, Christian
    ATZ worldwide, 2020, 122 (05) : 16 - 21
  • [36] Virtual access point to the Cloud
    Braham, Othmen
    Pujolle, Guy
    2012 IEEE 1ST INTERNATIONAL CONFERENCE ON CLOUD NETWORKING (CLOUDNET), 2012,
  • [37] Model-Based Virtual Visual Servoing With Point Cloud Data
    Kingkan, Cherdsak
    Ito, Shogo
    Arai, Shogo
    Nammoto, Takashi
    Hashimoto, Koichi
    2016 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS (IROS 2016), 2016, : 5549 - 5555
  • [38] Vehicle simulation model chain for virtual testing of automated driving functions and systems
    Bartolozzi, R.
    Landersheim, V.
    Stoll, G.
    Holzmann, H.
    Moller, R.
    Atzrodt, H.
    2022 IEEE INTELLIGENT VEHICLES SYMPOSIUM (IV), 2022, : 1054 - 1059
  • [39] Bilateral filter denoising of Lidar point cloud data in automatic driving scene
    Wen, Guoqiang
    Zhang, Hongxia
    Guan, Zhiwei
    Su, Wei
    Jia, Dagong
    INFRARED PHYSICS & TECHNOLOGY, 2023, 131
  • [40] Path Planning of Urban Autonomous Driving Using Laser Point Cloud Data
    Guo X.-M.
    Li B.-J.
    Long J.-Y.
    Xu H.-D.
    Lu Z.
    Zhongguo Gonglu Xuebao/China Journal of Highway and Transport, 2020, 33 (04): 182 - 190