DGMLP: Deformable Gating MLP Sharing for Multi-Task Learning

Cited: 0
Authors
Xu, Yangyang [1 ]
Zhang, Lefei [1 ]
Affiliations
[1] Wuhan Univ, Sch Comp Sci, Wuhan, Peoples R China
Source
ARTIFICIAL INTELLIGENCE, CICAI 2022, PT I | 2022, Vol. 13604
Keywords
Scene understanding; Multi-task learning; Dense prediction
DOI
10.1007/978-3-031-20497-5_10
Chinese Library Classification
TP18 [Theory of Artificial Intelligence]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Advances in deep learning have significantly improved many dense prediction tasks. In this work, we introduce DGMLP, a model that jointly learns multiple dense prediction tasks in a unified, end-to-end trained multi-task learning architecture. Specifically, DGMLP consists of (i) a spatial deformable MLP that captures valuable spatial information for the different tasks and (ii) a spatial gating MLP that learns features shared across all tasks. The deformable MLP adaptively adjusts the receptive field and samples informative locations, while the gating MLP learns task-relevant features for each task. Combining these two components yields a new MLP-like architecture that is simple yet effective for multiple visual dense prediction tasks. Extensive experiments and evaluations verify the advantages of our approach and demonstrate its superiority over state-of-the-art methods.
Pages: 117-128 (12 pages)
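
The record itself contains no code; purely as an illustrative sketch of the two components named in the abstract, the PyTorch snippet below implements a deformable spatial MLP (per-pixel offsets followed by bilinear resampling and a channel MLP) and a gMLP-style spatial gating unit. The module names, the offset parameterization, and the tensor shapes are assumptions made for illustration and are not taken from the authors' implementation.

# Minimal PyTorch sketch (illustrative assumption, not the DGMLP authors' code).
import torch
import torch.nn as nn
import torch.nn.functional as F


class DeformableSpatialMLP(nn.Module):
    """Predicts per-pixel (dx, dy) offsets in normalized coordinates, resamples the
    feature map at the offset locations, then mixes channels with a small MLP."""

    def __init__(self, dim):
        super().__init__()
        self.offset = nn.Conv2d(dim, 2, kernel_size=3, padding=1)   # (dx, dy) per pixel
        self.mlp = nn.Sequential(nn.Linear(dim, dim), nn.GELU(), nn.Linear(dim, dim))

    def forward(self, x):                                           # x: (B, C, H, W)
        b, c, h, w = x.shape
        # Base sampling grid in the [-1, 1] range expected by F.grid_sample.
        ys, xs = torch.meshgrid(
            torch.linspace(-1, 1, h, device=x.device),
            torch.linspace(-1, 1, w, device=x.device),
            indexing="ij",
        )
        base = torch.stack((xs, ys), dim=-1).expand(b, h, w, 2)     # (B, H, W, 2)
        offsets = self.offset(x).permute(0, 2, 3, 1)                # learned offsets
        sampled = F.grid_sample(x, base + offsets, align_corners=True)
        tokens = sampled.flatten(2).transpose(1, 2)                 # (B, H*W, C)
        return self.mlp(tokens).transpose(1, 2).reshape(b, c, h, w)


class SpatialGatingMLP(nn.Module):
    """gMLP-style spatial gating: split channels, project one half along the token
    (spatial) axis, and use it to gate the other half."""

    def __init__(self, dim, num_tokens):
        super().__init__()
        self.norm = nn.LayerNorm(dim // 2)
        self.spatial_proj = nn.Linear(num_tokens, num_tokens)

    def forward(self, tokens):                                      # tokens: (B, N, C)
        u, v = tokens.chunk(2, dim=-1)
        v = self.spatial_proj(self.norm(v).transpose(1, 2)).transpose(1, 2)
        return u * v                                                # (B, N, C // 2)


if __name__ == "__main__":
    feat = torch.randn(2, 64, 16, 16)                # stand-in for shared backbone features
    deform = DeformableSpatialMLP(64)
    gate = SpatialGatingMLP(64, num_tokens=16 * 16)
    shared = deform(feat)                            # adaptive spatial sampling
    gated = gate(shared.flatten(2).transpose(1, 2))  # gated shared representation
    print(gated.shape)                               # torch.Size([2, 256, 32])

In an actual multi-task head, one such gated branch per task followed by a task-specific decoder would be a natural way to wire these blocks together, but that arrangement is likewise an assumption rather than the published design.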
Related Papers (50 in total)
  • [1] Multi-task learning with deformable convolution. Li, Jie; Huang, Lei; Wei, Zhiqiang; Zhang, Wenfeng; Qin, Qibing. JOURNAL OF VISUAL COMMUNICATION AND IMAGE REPRESENTATION, 2021, 77.
  • [2] Fitting and sharing multi-task learning. Piao, Chengkai; Wei, Jinmao. APPLIED INTELLIGENCE, 2024, 54 (9-10): 6918-6929.
  • [3] Task Adaptive Parameter Sharing for Multi-Task Learning. Wallingford, Matthew; Li, Hao; Achille, Alessandro; Ravichandran, Avinash; Fowlkes, Charless; Bhotika, Rahul; Soatto, Stefano. 2022 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR), 2022: 7551-7560.
  • [4] DeMT: Deformable Mixer Transformer for Multi-Task Learning of Dense Prediction. Xu, Yangyang; Yang, Yibo; Zhang, Lefei. THIRTY-SEVENTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, VOL 37 NO 3, 2023: 3072-3080.
  • [5] Multi-task gradient descent for multi-task learning. Bai, Lu; Ong, Yew-Soon; He, Tiantian; Gupta, Abhishek. MEMETIC COMPUTING, 2020, 12 (04): 355-369.
  • [6] Conservative Data Sharing for Multi-Task Offline Reinforcement Learning. Yu, Tianhe; Kumar, Aviral; Chebotar, Yevgen; Hausman, Karol; Levine, Sergey; Finn, Chelsea. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 34 (NEURIPS 2021), 2021, 34.
  • [7] Adaptively sharing multi-levels of distributed representations in multi-task learning. Wang, Tianxin; Zhuang, Fuzhen; Sun, Ying; Zhang, Xiangliang; Lin, Leyu; Xia, Feng; He, Lei; He, Qing. INFORMATION SCIENCES, 2022, 591: 226-234.
  • [8] Discriminating Information of Modality Contributions Network by Gating Mechanism and Multi-Task Learning. Zhang, Qiongan; Shi, Lei; Liu, Peiyu; Xu, Liancheng. 2023 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2023.
  • [9] Multi-Task Learning for Sentiment Analysis with Hard-Sharing and Task Recognition Mechanisms. Zhang, Jian; Yan, Ke; Mo, Yuchang. INFORMATION, 2021, 12 (05).