Multi-Task Learning with Task-Specific Feature Filtering in Low-Data Condition

Cited by: 2
|
Authors
Lee, Sang-woo [1 ]
Lee, Ryong [2 ]
Seo, Min-seok [1 ]
Park, Jong-chan [3 ]
Noh, Hyeon-cheol [1 ]
Ju, Jin-gi [1 ]
Jang, Rae-young [2 ]
Lee, Gun-woo [2 ]
Choi, Myung-seok [2 ]
Choi, Dong-geol [1 ]
Affiliations
[1] Hanbat Natl Univ, Dept Informat & Commun Engn, Daejeon 34014, South Korea
[2] Korea Inst Sci & Technol Informat KISTI, Dept Machine Learning Data Res, Daejeon 34141, South Korea
[3] Lunit Inc, Seoul 06241, South Korea
Keywords
deep learning; multi-task learning; convolutional neural network; TIME SEMANTIC SEGMENTATION; NETWORK; MODEL;
DOI
10.3390/electronics10212691
Chinese Library Classification
TP [Automation Technology, Computer Technology];
Subject Classification Code
0812;
Abstract
Multi-task learning (MTL) is a computationally efficient way to solve multiple tasks with one multi-task model instead of several single-task models. MTL is expected to learn visual features that are both diverse and shareable across multiple datasets. In practice, however, MTL often fails to outperform single-task learning, and recent MTL methods tend to rely on heavy task-specific heads with large overheads to generate task-specific features. In this work, we (1) validate the efficacy of MTL in low-data conditions with early-exit architectures, and (2) propose a simple feature filtering module with minimal overhead to generate task-specific features. We assume that, in low-data conditions, a model cannot learn useful low-level features from the limited amount of data, and we empirically show that MTL significantly improves performance on all tasks under such conditions. We further optimize the early-exit architecture by a sweep search over the optimal feature for each task, and we propose a feature filtering module that selects features for each task. Using the optimized early-exit architecture with the feature filtering module, we improve accuracy by 15.937% on ImageNet and 4.847% on Places365 under a low-data condition where only 5% of the original datasets are available. Our method is empirically validated on various backbones and in various MTL settings.
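The abstract's core idea of a low-overhead filtering module that turns shared backbone features into task-specific ones can be illustrated with a minimal sketch. The paper does not specify the exact implementation here, so the class name `TaskFilter`, the per-task sigmoid channel gate, and the zero-initialized logits below are all assumptions chosen to match the stated goal of "minimal overhead" (one scalar per channel per task, versus a full task-specific head):

```python
import math

def sigmoid(x):
    """Logistic function, used to squash gate logits into (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

class TaskFilter:
    """Hypothetical per-task channel gate over shared backbone features.

    Parameter cost is num_tasks * num_channels scalars, which is tiny
    compared to a convolutional task-specific head.
    """

    def __init__(self, num_channels, num_tasks):
        # Gate logits start at 0, so every gate opens at sigmoid(0) = 0.5
        # and training can push channels toward fully on or fully off.
        self.logits = [[0.0] * num_channels for _ in range(num_tasks)]

    def forward(self, shared_features, task_id):
        """Scale each shared channel by that task's learned gate."""
        gates = [sigmoid(l) for l in self.logits[task_id]]
        return [g * f for g, f in zip(gates, shared_features)]
```

For example, a two-task filter whose first task has learned large positive and negative logits will pass some channels nearly unchanged and suppress others, while every other task keeps its own independent view of the same shared features.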
Pages: 14
Related Papers
50 records in total
  • [1] A Kernel Approach to Multi-Task Learning with Task-Specific Kernels
    Wu, Wei
    Li, Hang
    Hu, Yun-Hua
    Jin, Rong
    JOURNAL OF COMPUTER SCIENCE AND TECHNOLOGY, 2012, 27 (06) : 1289 - 1301
  • [2] A Kernel Approach to Multi-Task Learning with Task-Specific Kernels
    Wu, Wei
    Li, Hang
    Hu, Yun-Hua
    Jin, Rong
    Journal of Computer Science & Technology, 2012, 27 (06) : 1289 - 1301
  • [3] A Kernel Approach to Multi-Task Learning with Task-Specific Kernels
    Wei Wu
    Hang Li
    Yun-Hua Hu
    Rong Jin
    Journal of Computer Science and Technology, 2012, 27 : 1289 - 1301
  • [4] Projected Task-Specific Layers for Multi-Task Reinforcement Learning
    Roberts, Josselin Somerville
    Di, Julia
    2024 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION, ICRA 2024, 2024, : 2887 - 2893
  • [5] DEEP MULTI-TASK AND TASK-SPECIFIC FEATURE LEARNING NETWORK FOR ROBUST SHAPE PRESERVED ORGAN SEGMENTATION
    Tan, Chaowei
    Zhao, Liang
    Yan, Zhennan
    Li, Kang
    Metaxas, Dimitris
    Zhan, Yiqiang
    2018 IEEE 15TH INTERNATIONAL SYMPOSIUM ON BIOMEDICAL IMAGING (ISBI 2018), 2018, : 1221 - 1224
  • [6] Multi-task feature selection with sparse regularization to extract common and task-specific features
    Zhang, Jiashuai
    Miao, Jianyu
    Zhao, Kun
    Tian, Yingjie
    NEUROCOMPUTING, 2019, 340 : 76 - 89
  • [7] T3S: Improving Multi-Task Reinforcement Learning with Task-Specific Feature Selector and Scheduler
    Yu, Yuanqiang
    Yang, Tianpei
    Lv, Yongliang
    Zheng, Yan
    Hao, Jianye
    2023 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS, IJCNN, 2023,
  • [8] SC-LSTM: Learning Task-Specific Representations in Multi-Task Learning for Sequence Labeling
    Lu, Peng
    Bai, Ting
    Langlais, Philippe
    2019 CONFERENCE OF THE NORTH AMERICAN CHAPTER OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS: HUMAN LANGUAGE TECHNOLOGIES (NAACL HLT 2019), VOL. 1, 2019, : 2396 - 2406
  • [9] TSMV: TASK-SPECIFIC MULTI-VIEW FEATURE LEARNING
    Zhang, Chengyue
    Han, Yahong
    8TH INTERNATIONAL CONFERENCE ON INTERNET MULTIMEDIA COMPUTING AND SERVICE (ICIMCS2016), 2016, : 39 - 42
  • [10] Learning Task Relational Structure for Multi-Task Feature Learning
    Wang, De
    Nie, Feiping
    Huang, Heng
    2016 IEEE 16TH INTERNATIONAL CONFERENCE ON DATA MINING (ICDM), 2016, : 1239 - 1244