Population-Based Hyperparameter Tuning With Multitask Collaboration

Cited by: 5
Authors
Li, Wendi [1 ]
Wang, Ting [2 ]
Ng, Wing W. Y. [1 ]
Affiliations
[1] South China Univ Technol, Guangdong Prov Key Lab Computat Intelligence & Cy, Sch Comp Sci & Engn, Guangzhou 510006, Peoples R China
[2] South China Univ Technol, Guangzhou Peoples Hosp 1, Sch Med, Dept Radiol, Guangzhou 510006, Peoples R China
Funding
National Natural Science Foundation of China; China Postdoctoral Science Foundation;
Keywords
Task analysis; Tuning; Collaboration; Statistics; Sociology; Optimization; Deep learning; Deep neural network; hyperparameter (HP) tuning; multitask learning; NEURAL-NETWORKS; OPTIMIZATION; SELECTION; SEARCH;
DOI
10.1109/TNNLS.2021.3130896
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Population-based optimization methods are widely used for hyperparameter (HP) tuning on a given specific task. In this work, we propose population-based hyperparameter tuning with multitask collaboration (PHTMC), a general multitask collaborative framework with parallel and sequential phases for population-based HP tuning methods. In the parallel HP tuning phase, a shared population is maintained for all tasks, and intertask relatedness is exploited both to yield better generalization ability and to avoid biasing the population toward a single task. In the sequential HP tuning phase, a surrogate model is built for each newly added task, so that metainformation from the existing tasks can be extracted and used to initialize the search for the new task. Experimental results show significant improvements in the generalization abilities of neural networks trained with the PHTMC, as well as better performance achieved by multitask metalearning. Moreover, visualizations of the solution distributions and the autoencoder reconstructions of both the PHTMC and a single-task population-based HP tuning method are compared to analyze the properties of the multitask collaboration.
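The abstract's parallel phase, a shared population scored across all tasks so that no single task dominates selection, can be illustrated with a minimal sketch. This is not the authors' PHTMC implementation: the quadratic "task losses", the two toy hyperparameters (`lr`, `momentum`), and the exploit/explore loop are all simplified stand-ins, chosen only to show how a single population can be selected on mean fitness over several tasks.

```python
import random

def make_task(opt_lr, opt_mom):
    # Toy task: "loss" is the squared distance of the HP configuration
    # from a task-specific optimum (a stand-in for a real training run).
    def loss(hp):
        return (hp["lr"] - opt_lr) ** 2 + (hp["momentum"] - opt_mom) ** 2
    return loss

def shared_population_tuning(tasks, pop_size=20, generations=30, seed=0):
    rng = random.Random(seed)
    # One population shared by all tasks, as in the parallel phase.
    pop = [{"lr": rng.uniform(0.0, 1.0), "momentum": rng.uniform(0.0, 1.0)}
           for _ in range(pop_size)]
    for _ in range(generations):
        # Shared fitness: mean loss over all tasks, so selection does not
        # bias the population toward any single task (lower is better).
        scored = sorted(pop, key=lambda hp: sum(t(hp) for t in tasks) / len(tasks))
        survivors = scored[: pop_size // 2]
        # Exploit + explore: copy a surviving configuration and perturb it.
        children = []
        for _ in range(pop_size - len(survivors)):
            parent = rng.choice(survivors)
            children.append({k: min(1.0, max(0.0, v + rng.gauss(0, 0.05)))
                             for k, v in parent.items()})
        pop = survivors + children
    return min(pop, key=lambda hp: sum(t(hp) for t in tasks) / len(tasks))

# Two related tasks with nearby optima; the shared population converges
# toward a region that generalizes across both.
tasks = [make_task(0.3, 0.9), make_task(0.4, 0.8)]
best = shared_population_tuning(tasks)
```

The sequential phase described in the abstract (fitting a surrogate model per new task to transfer metainformation) is omitted here; this sketch covers only the shared-population selection idea.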
Pages: 5719 - 5731
Page count: 13
Related Papers
50 items in total
  • [21] Maximal Margin Multi-Classifier based on SVM Hyperparameter Tuning
    Mehra, Neha
    Gupta, Surendra
    2015 INTERNATIONAL CONFERENCE ON COMPUTER, COMMUNICATION AND CONTROL (IC4), 2015,
  • [22] Anomaly Detection of Storage Battery Based on Isolation Forest and Hyperparameter Tuning
    Lee, Chun-Hsiang
    Lu, Xu
    Lin, Xiunao
    Tao, Hongfeng
    Xue, Yaolei
    Wu, Chao
    2020 5TH INTERNATIONAL CONFERENCE ON MATHEMATICS AND ARTIFICIAL INTELLIGENCE (ICMAI 2020), 2020, : 229 - 233
  • [23] Multi-objective machine training based on Bayesian hyperparameter tuning
    Zufiria, Pedro J.
    Borrajo, Carlos
    Taibo, Miguel
    2023 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS, IJCNN, 2023,
  • [25] A MOM-based ensemble method for robustness, subsampling and hyperparameter tuning
    Kwon, Joon
    Lecue, Guillaume
    Lerasle, Matthieu
    ELECTRONIC JOURNAL OF STATISTICS, 2021, 15 (01): : 1202 - 1227
  • [26] Fast Hyperparameter Tuning for Ising Machines
    Parizy, Matthieu
    Kakuko, Norihiro
    Togawa, Nozomu
    2023 IEEE INTERNATIONAL CONFERENCE ON CONSUMER ELECTRONICS, ICCE, 2023,
  • [27] Refining the ONCE Benchmark With Hyperparameter Tuning
    Golyadkin, Maksim
    Gambashidze, Alexander
    Nurgaliev, Ildar
    Makarov, Ilya
IEEE ACCESS, 2024, 12 : 3805 - 3814
  • [28] Hyperparameter Tuning in Offline Reinforcement Learning
    Tittaferrante, Andrew
    Yassine, Abdulsalam
    2022 21ST IEEE INTERNATIONAL CONFERENCE ON MACHINE LEARNING AND APPLICATIONS, ICMLA, 2022, : 585 - 590
  • [29] Population-Based Health Requires Population-Based Change
    Elliott, Daniel J.
    Weintraub, William S.
    JOURNAL OF PEDIATRICS, 2011, 158 (02): : 181 - 184
  • [30] Efficient Deep Learning Hyperparameter Tuning using Cloud Infrastructure: Intelligent Distributed Hyperparameter Tuning with Bayesian Optimization in the Cloud
    Ranjit, Mercy Prasanna
    Ganapathy, Gopinath
    Sridhar, Kalaivani
    Arumugham, Vikram
    2019 IEEE 12TH INTERNATIONAL CONFERENCE ON CLOUD COMPUTING (IEEE CLOUD 2019), 2019, : 520 - 522