GADA-SegNet: gated attentive domain adaptation network for semantic segmentation of LiDAR point clouds

Cited: 0
Authors
Xin Kong
Shifeng Xia
Ningzhong Liu
Mingqing Wei
Affiliations
[1] Nanjing University of Aeronautics and Astronautics,College of Computer Science and Technology
[2] MIIT Key Laboratory of Pattern Analysis and Machine Intelligence, Collaborative Innovation Center of Novel Software Technology and Industrialization
Keywords
Point cloud semantic segmentation; Unsupervised domain adaptation; Adversarial learning
DOI
Not available
Abstract
We propose GADA-SegNet, a gated attentive domain adaptation network for semantic segmentation of LiDAR point clouds. Unlike most existing methods, which learn fully from point-wise annotations, our GADA-SegNet learns from labeled data first and then transfers smoothly to unlabeled data. We make three key contributions to bridge the domain gap between the labeled data and the unlabeled, unseen data. First, we design a new gated connection module that filters out noise and domain-private features from the low-level features, enabling better fusion of high- and low-level features. Second, we introduce a multi-scale attention module that eases the large-scale variation of objects and the class imbalance in complex scenes, reducing the class-level domain gap. Third, we develop a shared domain discriminator that implements class-level domain discrimination for large-scale LiDAR point clouds. Experiments on both synthetic-to-real and real-to-real scenarios show clear improvements of our GADA-SegNet over its competitors.
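The gated connection idea described above can be sketched in a few lines. The following is an illustrative numpy sketch only, not the paper's actual implementation: the choice of a sigmoid gate computed from the high-level features, applied channel-wise to the low-level features before additive fusion, is an assumption about how such a gate is commonly built, and the function name `gated_fusion` is hypothetical.

```python
import numpy as np

def gated_fusion(low: np.ndarray, high: np.ndarray) -> np.ndarray:
    """Illustrative gated skip connection (hypothetical sketch).

    A sigmoid gate derived from the high-level features scales each
    low-level feature into [0, 1] before fusion, so channels the gate
    drives toward 0 (e.g. noisy or domain-private ones) contribute little.
    """
    gate = 1.0 / (1.0 + np.exp(-high))  # sigmoid gate in [0, 1]
    return high + gate * low            # gated additive fusion

# Toy per-channel features: where `high` is strongly negative, the gate
# is near 0 and the corresponding low-level channel is suppressed.
low = np.array([1.0, -2.0, 0.5])
high = np.array([0.0, 3.0, -3.0])
fused = gated_fusion(low, high)
```

In a real network the gate would be produced by a learned layer rather than by the raw high-level activations; the sketch only shows the fusion mechanics.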
Pages: 2471–2481 (10 pages)