Accelerating Active Learning with Transfer Learning

Cited by: 18
Authors: Kale, David [1]; Liu, Yan [1]
Affiliation: [1] Univ So Calif, Dept Comp Sci, Los Angeles, CA 90007 USA
DOI: 10.1109/ICDM.2013.160
CLC Number: TP18 [Artificial Intelligence Theory]
Subject Classification Codes: 081104; 0812; 0835; 1405
Abstract
Active learning, transfer learning, and related techniques are unified by a core theme: efficient and effective use of available data. Active learning offers scalable solutions for building effective supervised learning models while minimizing annotation effort. Transfer learning utilizes existing labeled data from one task to help learn related tasks for which limited labeled data are available. There has been limited research, however, on how to combine these two techniques. In this paper, we present a simple and principled transfer active learning framework that leverages pre-existing labeled data from related tasks to improve the performance of an active learner. We derive an intuitive bound on the generalization error for the classifiers learned by this algorithm that provides insight into the algorithm's behavior and the problem in general. We provide experimental results using several well-known transfer learning data sets that confirm our theoretical analysis. What is more, our results suggest that this approach represents a promising solution to a specific weakness of active learning algorithms: cold starts with zero labeled data.
Pages: 1085-1090
Page count: 6
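As a rough illustration of the idea described in the abstract, the sketch below warm-starts an active learner with labeled data from a related source task and then spends its annotation budget on the target task via uncertainty sampling. This is only a generic sketch, not the algorithm or the generalization bound from the paper: the function name transfer_active_learn, the oracle callback, and the budget parameter are hypothetical, scikit-learn's LogisticRegression stands in for whatever base learner the framework would actually wrap, and binary labels are assumed.

    # Minimal transfer-active-learning sketch (assumed setup, not the paper's method).
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    def transfer_active_learn(X_src, y_src, X_tgt_pool, oracle, budget=20):
        # Seed the labeled set with source-task examples, so the very first
        # model is fit before any target label is requested (no cold start).
        X_lab = np.asarray(X_src)
        y_lab = np.asarray(y_src)
        X_pool = np.asarray(X_tgt_pool)
        pool_idx = list(range(len(X_pool)))
        clf = LogisticRegression(max_iter=1000)
        for _ in range(budget):
            clf.fit(X_lab, y_lab)
            # Uncertainty sampling: query the pool point whose predicted
            # positive-class probability is closest to 0.5.
            proba = clf.predict_proba(X_pool[pool_idx])[:, 1]
            pick = pool_idx[int(np.argmin(np.abs(proba - 0.5)))]
            X_lab = np.vstack([X_lab, X_pool[pick]])
            y_lab = np.append(y_lab, oracle(pick))  # annotator supplies one label
            pool_idx.remove(pick)
        return clf.fit(X_lab, y_lab)

With y_src containing both classes and an oracle that returns the true label for a given pool index, each iteration spends one unit of the labeling budget on the most ambiguous target point, while the source data keeps the classifier well defined from the very first query.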
Related papers
50 records in total
  • [11] Batch active learning for accelerating the development of interatomic potentials
    Wilson, Nathan
    Willhelm, Daniel
    Qian, Xiaoning
    Arroyave, Raymundo
    Qian, Xiaofeng
    COMPUTATIONAL MATERIALS SCIENCE, 2022, 208
  • [12] Active Transfer Learning and Selective Instance Transfer with Active Learning for Motor Imagery based BCI
    Hossain, Ibrahim
    Khosravi, Abbas
    Nahavandi, Saeid
    2016 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2016, : 4048 - 4055
  • [13] Accelerating Hydraulic Fracture Imaging by Deep Transfer Learning
    Zhang, Runren
    Sun, Qingtao
    Mao, Yiqian
    Cui, Liangze
    Jia, Yongze
    Huang, Wei-Feng
    Ahmadian, Mohsen
    Liu, Qing Huo
    IEEE TRANSACTIONS ON ANTENNAS AND PROPAGATION, 2022, 70 (07) : 6117 - 6121
  • [14] Accelerating crystal structure prediction by machine-learning interatomic potentials with active learning
    Podryabinkin, Evgeny V.
    Tikhonov, Evgeny V.
    Shapeev, Alexander V.
    Oganov, Artem R.
    PHYSICAL REVIEW B, 2019, 99 (06)
  • [15] Accelerating deep learning inference via layer truncation and transfer learning for fingerprint classification
    Mukoya, Esther
    Rimiru, Richard
    Kimwele, Michael
    Gakii, Consolata
    Mugambi, Grace
    CONCURRENCY AND COMPUTATION-PRACTICE & EXPERIENCE, 2023, 35 (08)
  • [16] Efficient Argument Structure Extraction with Transfer Learning and Active Learning
    Hua, Xinyu
    Wang, Lu
    FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2022), 2022, : 423 - 437
  • [17] Active Learning Based on Transfer Learning Techniques for Text Classification
    Onita, Daniela
    IEEE ACCESS, 2023, 11 : 28751 - 28761
  • [18] Transfer active learning by querying committee
    Shao, Hao
    Tao, Feng
    Xu, Rui
    JOURNAL OF ZHEJIANG UNIVERSITY-SCIENCE C (COMPUTERS & ELECTRONICS), 2014, 15 (02) : 107 - 118
  • [19] Active Selection Transfer Learning Algorithm
    Wu, Weifei
    Zhang, Yanhui
    Xing, Fuyijin
    Neural Processing Letters, 2023, 55 : 10093 - 10116