Task-Agnostic Vision Transformer for Distributed Learning of Image Processing

Cited by: 6
Authors
Kim, Boah [1 ]
Kim, Jeongsol [1 ]
Ye, Jong Chul [2 ]
Affiliations
[1] Korea Adv Inst Sci & Technol KAIST, Dept Bio & Brain Engn, Daejeon 34141, South Korea
[2] Korea Adv Inst Sci & Technol KAIST, Kim Jaechul Grad Sch Artificial Intelligence AI, Daejeon 34141, South Korea
Keywords
Distributed learning; transformer; image processing; task-agnostic learning; quality assessment
DOI
10.1109/TIP.2022.3226892
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
Recently, distributed learning approaches have been studied to exploit data from multiple sources without sharing them, but they are usually not suitable for applications in which each client carries out a different task. Meanwhile, the Transformer has been widely explored in computer vision thanks to its ability to learn a common representation through global attention. Leveraging these advantages of the Transformer, we present a new distributed learning framework for multiple image processing tasks that allows each client to learn a distinct task with its own local data. The framework builds on a disentangled representation of local and non-local features using task-specific head/tail networks and a task-agnostic Vision Transformer. Each client learns a translation from its own task to a common representation using its task-specific networks, while the Transformer body on the server learns global attention between the features embedded in that representation. To enable the decomposition between the task-specific and common representations, we propose an alternating training strategy between the clients and the server. Experimental results on distributed learning for various tasks show that our method synergistically improves the performance of each client with its own data.
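The abstract describes a model split into client-side task-specific head/tail networks and a server-side task-agnostic Transformer body, trained with an alternating client/server schedule. The sketch below is a minimal PyTorch-style illustration of that idea, not the authors' implementation: all module names (TaskHead, TaskTail, SharedBody), layer sizes, the L1 loss, and the two-phase update shown here are assumptions made for this example only.

import torch
import torch.nn as nn
import torch.nn.functional as F

class TaskHead(nn.Module):
    # Client-side head: embeds a task-specific image into a common token sequence.
    def __init__(self, in_ch=3, dim=256, patch=8):
        super().__init__()
        self.embed = nn.Conv2d(in_ch, dim, kernel_size=patch, stride=patch)

    def forward(self, x):                          # x: (B, C, H, W)
        t = self.embed(x)                          # (B, dim, H/p, W/p)
        return t.flatten(2).transpose(1, 2)        # (B, N, dim) tokens

class TaskTail(nn.Module):
    # Client-side tail: maps the common representation back to an image.
    def __init__(self, out_ch=3, dim=256, patch=8):
        super().__init__()
        self.proj = nn.Linear(dim, out_ch * patch * patch)
        self.patch = patch

    def forward(self, tokens, hw):                 # tokens: (B, N, dim)
        h, w = hw
        x = self.proj(tokens).transpose(1, 2)      # (B, C*p*p, N)
        x = x.reshape(x.size(0), -1, h // self.patch, w // self.patch)
        return F.pixel_shuffle(x, self.patch)      # (B, C, H, W)

class SharedBody(nn.Module):
    # Server-side task-agnostic Transformer body shared by all clients.
    def __init__(self, dim=256, depth=4, heads=8):
        super().__init__()
        layer = nn.TransformerEncoderLayer(d_model=dim, nhead=heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=depth)

    def forward(self, tokens):
        return self.encoder(tokens)

def recon_loss(head, body, tail, x, y):
    # Head -> shared body -> tail, compared against the clean target with L1.
    return F.l1_loss(tail(body(head(x)), x.shape[-2:]), y)

if __name__ == "__main__":
    head, tail, body = TaskHead(), TaskTail(), SharedBody()
    opt_client = torch.optim.Adam(list(head.parameters()) + list(tail.parameters()), lr=1e-4)
    opt_server = torch.optim.Adam(body.parameters(), lr=1e-4)
    x = torch.randn(2, 3, 64, 64)                  # dummy degraded input for one client task
    y = torch.randn(2, 3, 64, 64)                  # dummy clean target

    # Phase 1 (client update): train the head/tail while the shared body is frozen.
    for p in body.parameters():
        p.requires_grad_(False)
    opt_client.zero_grad()
    recon_loss(head, body, tail, x, y).backward()
    opt_client.step()

    # Phase 2 (server update): train the shared body while the head/tail are frozen.
    for p in body.parameters():
        p.requires_grad_(True)
    for p in list(head.parameters()) + list(tail.parameters()):
        p.requires_grad_(False)
    opt_server.zero_grad()
    recon_loss(head, body, tail, x, y).backward()
    opt_server.step()

In the paper's distributed setting the two phases would run on separate machines, with only the intermediate token features and their gradients exchanged; the sketch runs both phases in one process purely for readability.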
Pages: 203-218
Number of pages: 16