Metric-based Regularization and Temporal Ensemble for Multi-task Learning using Heterogeneous Unsupervised Tasks

Cited by: 2
Authors
Kim, Dae Ha [1 ]
Lee, Seung Hyun [1 ]
Song, Byung Cheol [1 ]
Affiliations
[1] Inha Univ, 100 Inha Ro, Incheon 22212, South Korea
Funding
National Research Foundation, Singapore;
Keywords
DOI
10.1109/ICCVW.2019.00352
CLC number
TP18 [Artificial Intelligence Theory];
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
One way to improve the performance of a target task is to transfer the abundant knowledge of a pre-trained network. However, training such a network requires high computational capability and large-scale labeled datasets. To mitigate the burden of large-scale labeling, learning in an unsupervised or self-supervised manner can be a solution. In addition, unsupervised multi-task learning can yield a generalized feature representation. However, unsupervised multi-task learning can become biased toward a specific task. To overcome this problem, we propose a metric-based regularization term and a temporal task ensemble (TTE) for multi-task learning. Because these two techniques prevent the entire network from being skewed toward any single task, the network can learn a generalized feature representation that appropriately reflects the characteristics of each task without bias. Experimental results on three target tasks, namely classification, object detection, and embedding clustering, show that the TTE-based multi-task framework is more effective than state-of-the-art (SOTA) methods at improving the performance of a target task.
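To make the abstract's idea concrete, below is a minimal, hypothetical sketch in PyTorch of the two ingredients it names: a metric-based regularization term that keeps task-specific features from drifting apart, and a temporal ensemble of per-task losses, approximated here by an exponential moving average, that re-weights tasks so no single one dominates training. The network structure, the choice of pairwise L2 as the metric, and all names such as `metric_regularizer` and `train_step` are illustrative assumptions, not the paper's exact formulation.

```python
# Minimal sketch (not the authors' exact method): shared encoder, several
# unsupervised task heads, a metric-based regularizer over task features,
# and an EMA-based temporal weighting of per-task losses.
import torch
import torch.nn as nn
import torch.nn.functional as F


class MultiTaskModel(nn.Module):
    def __init__(self, num_tasks: int, feat_dim: int = 128):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Flatten(), nn.Linear(3 * 32 * 32, feat_dim), nn.ReLU()
        )
        # One lightweight head per unsupervised task (e.g., rotation, colorization).
        self.heads = nn.ModuleList(
            [nn.Linear(feat_dim, feat_dim) for _ in range(num_tasks)]
        )

    def forward(self, x):
        z = self.encoder(x)
        return [head(z) for head in self.heads]


def metric_regularizer(task_feats):
    """Hypothetical metric-based term: mean pairwise L2 distance between the
    batch-mean features of every pair of tasks, discouraging the shared
    representation from drifting toward any single task."""
    means = [f.mean(dim=0) for f in task_feats]
    reg, pairs = 0.0, 0
    for i in range(len(means)):
        for j in range(i + 1, len(means)):
            reg = reg + F.mse_loss(means[i], means[j])
            pairs += 1
    return reg / max(pairs, 1)


def train_step(model, optimizer, x, task_loss_fns, loss_ema, momentum=0.9, lam=0.1):
    """One update: per-task losses are balanced by an exponential moving
    average (a simple stand-in for a temporal task ensemble) and combined
    with the metric-based regularizer."""
    task_feats = model(x)
    losses = torch.stack([fn(f) for fn, f in zip(task_loss_fns, task_feats)])

    # Temporal ensemble of losses: tasks whose EMA loss is already low are
    # down-weighted, so the network does not over-commit to them.
    loss_ema.mul_(momentum).add_((1 - momentum) * losses.detach())
    weights = loss_ema / loss_ema.sum()

    total = (weights * losses).sum() + lam * metric_regularizer(task_feats)
    optimizer.zero_grad()
    total.backward()
    optimizer.step()
    return total.item()


if __name__ == "__main__":
    num_tasks = 3
    model = MultiTaskModel(num_tasks)
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
    # Placeholder self-supervised objectives; real ones depend on the chosen tasks.
    task_loss_fns = [lambda f: f.pow(2).mean() for _ in range(num_tasks)]
    loss_ema = torch.ones(num_tasks)
    x = torch.randn(16, 3, 32, 32)
    for _ in range(5):
        print(train_step(model, optimizer, x, task_loss_fns, loss_ema))
```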
Pages: 2903 - 2912
Number of pages: 10
Related papers
50 records in total
  • [21] Geometry preserving multi-task metric learning
    Yang, Peipei
    Huang, Kaizhu
    Liu, Cheng-Lin
    MACHINE LEARNING, 2013, 92 (01) : 133 - 175
  • [22] Heterogeneous multi-task feature learning with mixed l2,1 regularization
    Zhong, Yuan
    Xu, Wei
    Gao, Xin
    MACHINE LEARNING, 2024, 113 (02) : 891 - 932
  • [24] Multi-Task Ensemble Learning for Affect Recognition
    Gjoreski, Martin
    Lustrek, Mitja
    Gams, Matjaz
    PROCEEDINGS OF THE 2018 ACM INTERNATIONAL JOINT CONFERENCE ON PERVASIVE AND UBIQUITOUS COMPUTING AND PROCEEDINGS OF THE 2018 ACM INTERNATIONAL SYMPOSIUM ON WEARABLE COMPUTERS (UBICOMP/ISWC'18 ADJUNCT), 2018, : 553 - 558
  • [25] Unsupervised domain adaptation: A multi-task learning-based method
    Zhang, Jing
    Li, Wanqing
    Ogunbona, Philip
    KNOWLEDGE-BASED SYSTEMS, 2019, 186
  • [26] Task Aware Multi-task Learning for Speech to Text Tasks
    Indurthi, Sathish
    Zaidi, Mohd Abbas
    Lakumarapu, Nikhil Kumar
    Lee, Beomseok
    Han, Hyojung
    Ahn, Seokchan
    Kim, Sangha
    Kim, Chanwoo
    Hwang, Inchul
    2021 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP 2021), 2021, : 7723 - 7727
  • [27] Using Multi-task and Transfer Learning to Solve Working Memory Tasks
    Jayram, T. S.
    Kornuta, Tomasz
    McAvoy, Ryan L.
    Ozcan, Ahmet S.
    2018 17TH IEEE INTERNATIONAL CONFERENCE ON MACHINE LEARNING AND APPLICATIONS (ICMLA), 2018, : 256 - 263
  • [28] Drug sensitivity prediction framework using ensemble and multi-task learning
    Sharma, Aman
    Rani, Rinkle
    INTERNATIONAL JOURNAL OF MACHINE LEARNING AND CYBERNETICS, 2020, 11 (06) : 1231 - 1240
  • [30] Unsupervised Multi-task Learning with Hierarchical Data Structure
    Cao, Wenming
    Qian, Sheng
    Wu, Si
    Wong, Hau-San
    PATTERN RECOGNITION, 2019, 86 : 248 - 264