Minimax Lower Bounds for Transfer Learning with Linear and One-hidden Layer Neural Networks

Cited by: 0
Authors
Kalan, Seyed Mohammadreza Mousavi [1 ]
Fabian, Zalan [1 ]
Avestimehr, Salman [1 ]
Soltanolkotabi, Mahdi [1 ]
Affiliations
[1] Univ Southern Calif, Ming Hsieh Dept Elect Engn, Los Angeles, CA 90089 USA
DOI: not available
Chinese Library Classification (CLC): TP18 [Artificial intelligence theory]
Discipline codes: 081104; 0812; 0835; 1405
Abstract
Transfer learning has emerged as a powerful technique for improving the performance of machine learning models on new domains where labeled training data may be scarce. In this approach, a model trained for a source task, where plenty of labeled training data is available, is used as a starting point for training a model on a related target task with only a few labeled training examples. Despite the recent empirical success of transfer learning approaches, the benefits and fundamental limits of transfer learning are poorly understood. In this paper we develop a statistical minimax framework to characterize the fundamental limits of transfer learning in the context of regression with linear and one-hidden-layer neural network models. Specifically, we derive a lower bound on the target generalization error achievable by any algorithm as a function of the number of labeled source and target samples, as well as appropriate notions of similarity between the source and target tasks. Our lower bound provides new insights into the benefits and limitations of transfer learning. We further corroborate our theoretical findings with various experiments.
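The setting the abstract describes can be illustrated with a small simulation: a linear-regression source task with abundant data, a nearby target task with only a few labeled samples, and a transfer estimator that shrinks toward the source solution. The code below is a minimal sketch of that setting only, not the paper's actual estimator or bound; the dimensions, noise level, and the ridge-style correction step are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 20          # feature dimension
n_source = 500  # plenty of labeled source data
n_target = 10   # only a few labeled target samples (n_target < d)
delta = 0.1     # task similarity: scale of ||theta_t - theta_s||

# Source and target parameters differ by a small perturbation.
theta_s = rng.normal(size=d) / np.sqrt(d)
theta_t = theta_s + delta * rng.normal(size=d) / np.sqrt(d)

def make_data(theta, n, noise=0.1):
    """Draw n noisy linear-regression samples y = X @ theta + noise."""
    X = rng.normal(size=(n, d))
    y = X @ theta + noise * rng.normal(size=n)
    return X, y

Xs, ys = make_data(theta_s, n_source)
Xt, yt = make_data(theta_t, n_target)

def ridge(X, y, lam=1e-3):
    """Ridge-regularized least squares (keeps the system well-posed)."""
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

# Target-only estimate: underdetermined (10 samples, 20 dims).
theta_target_only = ridge(Xt, yt)

# Transfer estimate: fit the source task, then regress the residual
# on the target data with strong shrinkage toward the source solution.
theta_source = ridge(Xs, ys)
correction = np.linalg.solve(
    Xt.T @ Xt + 1.0 * np.eye(d),
    Xt.T @ (yt - Xt @ theta_source),
)
theta_transfer = theta_source + correction

err_target_only = np.linalg.norm(theta_target_only - theta_t)
err_transfer = np.linalg.norm(theta_transfer - theta_t)
print(f"target-only error: {err_target_only:.3f}")
print(f"transfer error:    {err_transfer:.3f}")
```

When the tasks are close (small `delta`), the transfer estimate has much lower parameter error than the target-only fit; increasing `delta` erodes this advantage, which mirrors the role task similarity plays in the lower bound the abstract describes.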
Pages: 11
Related papers (50 items)
  • [31] Analysis of one-hidden-layer Neural Networks via the Resolvent Method
    Piccolo, Vanessa
    Schroder, Dominik
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 34 (NEURIPS 2021), 2021, 34
  • [32] Distributed Parameter Estimation in Randomized One-hidden-layer Neural Networks
    Wang, Yinsong
    Shahrampour, Shahin
    2020 AMERICAN CONTROL CONFERENCE (ACC), 2020, : 737 - 742
  • [34] Episodic Reinforcement Learning in Finite MDPs: Minimax Lower Bounds Revisited
    Domingues, Omar Darwiche
    Menard, Pierre
    Kaufmann, Emilie
    Valko, Michal
    ALGORITHMIC LEARNING THEORY, VOL 132, 2021, 132
  • [35] Transfer bounds for linear feature learning
    Maurer, Andreas
    MACHINE LEARNING, 2009, 75 (03) : 327 - 350
  • [37] Lower bounds for approximation by MLP neural networks
    Maiorov, V
    Pinkus, A
    NEUROCOMPUTING, 1999, 25 (1-3) : 81 - 91
  • [38] Layer Removal for Transfer Learning with Deep Convolutional Neural Networks
    Zhi, Weiming
    Chen, Zhenghao
    Yueng, Henry Wing Fung
    Lu, Zhicheng
    Zandavi, Seid Miad
    Chung, Yuk Ying
    NEURAL INFORMATION PROCESSING (ICONIP 2017), PT II, 2017, 10635 : 460 - 469
  • [39] A novel learning algorithm of single-hidden-layer feedforward neural networks
    Dong-Mei Pu
    Da-Qi Gao
    Tong Ruan
    Yu-Bo Yuan
    Neural Computing and Applications, 2017, 28 : 719 - 726
  • [40] A fast constructive learning algorithm for single-hidden-layer neural networks
    Zhu, QY
    Huang, GB
    Siew, CK
    2004 8TH INTERNATIONAL CONFERENCE ON CONTROL, AUTOMATION, ROBOTICS AND VISION, VOLS 1-3, 2004, : 1907 - 1911