Population-Based Hyperparameter Tuning With Multitask Collaboration

Cited by: 5
Authors
Li, Wendi [1 ]
Wang, Ting [2 ]
Ng, Wing W. Y. [1 ]
Affiliations
[1] South China Univ Technol, Guangdong Prov Key Lab Computat Intelligence & Cy, Sch Comp Sci & Engn, Guangzhou 510006, Peoples R China
[2] South China Univ Technol, Guangzhou Peoples Hosp 1, Sch Med, Dept Radiol, Guangzhou 510006, Peoples R China
Funding
National Natural Science Foundation of China; China Postdoctoral Science Foundation;
Keywords
Task analysis; Tuning; Collaboration; Statistics; Sociology; Optimization; Deep learning; Deep neural network; hyperparameter (HP) tuning; multitask learning; NEURAL-NETWORKS; OPTIMIZATION; SELECTION; SEARCH;
DOI
10.1109/TNNLS.2021.3130896
CLC Number
TP18 [Theory of Artificial Intelligence];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Population-based optimization methods are widely used for hyperparameter (HP) tuning on a given specific task. In this work, we propose population-based hyperparameter tuning with multitask collaboration (PHTMC), a general multitask collaborative framework with parallel and sequential phases for population-based HP tuning methods. In the parallel HP tuning phase, a shared population is maintained for all tasks, and intertask relatedness is exploited both to yield better generalization ability and to avoid bias toward the data of any single task. In the sequential HP tuning phase, a surrogate model is built for each newly added task so that metainformation from the existing tasks can be extracted and used to help initialize the search for the new task. Experimental results show significant improvements in the generalization abilities of neural networks trained with the PHTMC and better performance achieved through multitask metalearning. Moreover, visualizations of the solution distributions and autoencoder reconstructions under both the PHTMC and a single-task population-based HP tuning method are compared to analyze the properties of the multitask collaboration.
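The two phases described in the abstract can be illustrated with a toy sketch. This is a minimal illustration under our own assumptions (toy quadratic "tasks" standing in for real training runs, a simple evolutionary loop, and warm-starting from the shared solution in place of the paper's surrogate model); it is not the authors' actual PHTMC algorithm:

```python
import random

# Toy "tasks": each maps a scalar hyperparameter (e.g., a learning rate)
# to a validation score. These stand in for real training runs.
tasks = {
    "task_a": lambda lr: -(lr - 0.10) ** 2,
    "task_b": lambda lr: -(lr - 0.15) ** 2,
}

def multitask_fitness(lr):
    # Shared-population idea: score a candidate by its average
    # performance over all tasks, discouraging bias toward one task.
    return sum(score(lr) for score in tasks.values()) / len(tasks)

def tune(fitness, pop=None, size=12, gens=20, seed=0):
    # Generic population-based search: keep the best half each
    # generation and refill with Gaussian-mutated copies.
    rng = random.Random(seed)
    if pop is None:
        pop = [rng.uniform(0.0, 1.0) for _ in range(size)]
    for _ in range(gens):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: size // 2]
        pop = survivors + [max(0.0, s + rng.gauss(0, 0.02)) for s in survivors]
    return max(pop, key=fitness)

# Parallel phase: one shared population is tuned across both tasks.
shared_best = tune(multitask_fitness)

# Sequential phase (simplified): warm-start a new task's population from
# the shared solution instead of fitting a surrogate model.
new_task = lambda lr: -(lr - 0.12) ** 2
rng = random.Random(1)
warm_pop = [shared_best + rng.gauss(0, 0.01) for _ in range(12)]
new_best = tune(new_task, pop=warm_pop)
print(f"shared optimum ~ {shared_best:.3f}, new-task optimum ~ {new_best:.3f}")
```

In this sketch the shared population converges near the compromise optimum of the two tasks, and the new task's search starts close to that solution rather than from scratch, mirroring the parallel/sequential split at a very small scale.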
Pages: 5719-5731
Page count: 13