Meta-Learning with Neural Bandit Scheduler

Cited by: 0
Authors
Qi, Yunzhe [1]
Ban, Yikun [1]
Wei, Tianxin [1]
Zou, Jiaru [1]
Yao, Huaxiu [2]
He, Jingrui [1]
Affiliations
[1] Univ Illinois, Champaign, IL 61820 USA
[2] Univ North Carolina Chapel Hill, Chapel Hill, NC USA
Source
ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 36 (NEURIPS 2023) | 2023
Funding
U.S. National Institute of Food and Agriculture; U.S. National Science Foundation
Keywords
DOI
Not available
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
Meta-learning has proven to be an effective learning paradigm for training machine learning models with good generalization ability. Apart from the common practice of uniformly sampling meta-training tasks, existing task scheduling strategies mainly rely on pre-defined sampling protocols or assumed task-model correlations and make scheduling decisions greedily, which can create performance bottlenecks that leave the meta-model sub-optimal. In this paper, we propose BASS, a novel task scheduling framework under the Contextual Bandits setting, which directly optimizes the task scheduling strategy based on the status of the meta-model. By balancing exploitation and exploration in meta-learning task scheduling, BASS helps tackle the challenge of limited knowledge about the task distribution during the early stage of meta-training, while exploring potential benefits for forthcoming meta-training iterations through an adaptive exploration strategy. Theoretical analysis and extensive experiments demonstrate the effectiveness of the proposed framework.
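The abstract describes scheduling meta-training tasks with a contextual bandit that scores candidate tasks from the meta-model's status and balances exploitation with adaptive exploration. The sketch below is a minimal, hypothetical illustration of that general idea, not the authors' BASS implementation; all names (TaskScorer, select_tasks, update_scorer), the Gaussian exploration bonus, and the reward definition are assumptions made for illustration.

```python
# Minimal, hypothetical sketch of a neural contextual-bandit task scheduler.
# NOT the authors' BASS implementation; names and design choices here are
# illustrative assumptions only.
import torch
import torch.nn as nn

class TaskScorer(nn.Module):
    """Small network estimating the expected benefit (reward) of training on a
    candidate task, given a context vector derived from the meta-model status."""
    def __init__(self, context_dim: int, hidden: int = 32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(context_dim, hidden), nn.ReLU(), nn.Linear(hidden, 1)
        )

    def forward(self, ctx: torch.Tensor) -> torch.Tensor:
        return self.net(ctx).squeeze(-1)

def select_tasks(scorer: TaskScorer, contexts: torch.Tensor, k: int,
                 explore_coef: float = 0.1) -> torch.Tensor:
    """Pick the top-k tasks by estimated reward plus an exploration bonus.
    A Gaussian perturbation stands in for the paper's adaptive exploration."""
    with torch.no_grad():
        scores = scorer(contexts)
    bonus = explore_coef * torch.randn_like(scores)   # exploration term
    return torch.topk(scores + bonus, k).indices

def update_scorer(scorer: TaskScorer, optimizer: torch.optim.Optimizer,
                  contexts: torch.Tensor, rewards: torch.Tensor) -> float:
    """Regress the scorer toward observed rewards, e.g. the drop in
    meta-validation loss after a meta-update on the selected tasks."""
    optimizer.zero_grad()
    loss = nn.functional.mse_loss(scorer(contexts), rewards)
    loss.backward()
    optimizer.step()
    return loss.item()

# One illustrative scheduling step with a pool of 50 candidate tasks.
scorer = TaskScorer(context_dim=8)
opt = torch.optim.Adam(scorer.parameters(), lr=1e-3)
pool_contexts = torch.randn(50, 8)         # placeholder task contexts
chosen = select_tasks(scorer, pool_contexts, k=4)
# ... run the meta-update on the chosen tasks and measure their reward ...
rewards = torch.rand(len(chosen))           # placeholder observed rewards
update_scorer(scorer, opt, pool_contexts[chosen], rewards)
```

Defining the reward as the improvement in meta-validation loss is only one plausible signal; the paper's actual context construction, exploration strategy, and theoretical guarantees are detailed in the full text.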
Pages: 35
Related Papers
50 records
  • [1] Meta-Learning Adversarial Bandit Algorithms
    Khodak, Mikhail
    Osadchiy, Ilya
    Harris, Keegan
    Balcan, Maria-Florina
    Levy, Kfir Y.
    Meir, Ron
    Wu, Zhiwei Steven
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 36 (NEURIPS 2023), 2023,
  • [2] Meta-learning with an Adaptive Task Scheduler
    Yao, Huaxiu
    Wang, Yu
    Wei, Ying
    Zhao, Peilin
    Mahdavi, Mehrdad
    Lian, Defu
    Finn, Chelsea
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 34 (NEURIPS 2021), 2021, 34
  • [3] Meta-Learning in Neural Networks: A Survey
    Hospedales, Timothy
    Antoniou, Antreas
    Micaelli, Paul
    Storkey, Amos
    IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2022, 44 (09) : 5149 - 5169
  • [4] Meta-Learning Neural Bloom Filters
    Rae, Jack W.
    Bartunov, Sergey
    Lillicrap, Timothy P.
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 97, 2019, 97
  • [5] Meta-learning approach to neural network optimization
    Kordik, Pavel
    Koutnik, Jan
    Drchal, Jan
    Kovarik, Oleg
    Cepek, Miroslav
    Snorek, Miroslav
    NEURAL NETWORKS, 2010, 23 (04) : 568 - 582
  • [6] Meta-learning Sparse Implicit Neural Representations
    Lee, Jaeho
    Tack, Jihoon
    Lee, Namhoon
    Shin, Jinwoo
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 34 (NEURIPS 2021), 2021, 34
  • [7] Trapezoidal Step Scheduler for Model-Agnostic Meta-Learning in Medical Imaging
    Voon, Wingates
    Hum, Yan Chai
    Tee, Yee Kai
    Yap, Wun-She
    Lai, Khin Wee
    Nisar, Humaira
    Mokayed, Hamam
    PATTERN RECOGNITION, 2025, 161
  • [8] Meta-learning Hyperparameter Performance Prediction with Neural Processes
    Wei, Ying
    Zhao, Peilin
    Huang, Junzhou
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 139, 2021, 139
  • [9] Dual Enhanced Meta-Learning With Adaptive Task Scheduler for Cold-Start Recommendation
    He, Dongxiao
    Cui, Jiaqi
    Wang, Xiaobao
    Song, Guojie
    Huang, Yuxiao
    Wu, Lingfei
    IEEE TRANSACTIONS ON KNOWLEDGE AND DATA ENGINEERING, 2025, 37 (04) : 1728 - 1741
  • [10] Employment of neural network and rough set in meta-learning
    Salama, Mostafa A.
    Hassanien, Aboul Ella
    Revett, Kenneth
    MEMETIC COMPUTING, 2013, 5 (03) : 165 - 177