Accelerating Active Learning with Transfer Learning

Cited by: 18
Authors
Kale, David [1 ]
Liu, Yan [1 ]
Affiliations
[1] Univ So Calif, Dept Comp Sci, Los Angeles, CA 90007 USA
Keywords
DOI
10.1109/ICDM.2013.160
CLC Classification
TP18 [Theory of Artificial Intelligence]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Active learning, transfer learning, and related techniques are unified by a core theme: efficient and effective use of available data. Active learning offers scalable solutions for building effective supervised learning models while minimizing annotation effort. Transfer learning utilizes existing labeled data from one task to help learn related tasks for which limited labeled data are available. There has been limited research, however, on how to combine these two techniques. In this paper, we present a simple and principled transfer active learning framework that leverages pre-existing labeled data from related tasks to improve the performance of an active learner. We derive an intuitive bound on the generalization error for the classifiers learned by this algorithm that provides insight into the algorithm's behavior and the problem in general. We provide experimental results using several well-known transfer learning data sets that confirm our theoretical analysis. What is more, our results suggest that this approach represents a promising solution to a specific weakness of active learning algorithms: cold starts with zero labeled data.
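The abstract gives no pseudocode, so the following is only a minimal illustrative sketch (Python with NumPy and scikit-learn) of the general idea it describes: warm-start a classifier on labeled data from a related source task, then query target-task labels by uncertainty sampling, so the active learner never begins from zero labels. The synthetic data, the logistic-regression model, and the plain uncertainty-sampling query rule are assumptions made here for illustration, not the specific algorithm or generalization bound presented in the paper.

# Illustrative sketch only (assumed setup, not the paper's algorithm):
# warm-start on labeled source-task data, then run uncertainty sampling
# on the target task so the active learner avoids a cold start.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic source task (many labels) and a related, slightly shifted target task.
X_source = rng.normal(0.0, 1.0, size=(200, 2))
y_source = (X_source[:, 0] + X_source[:, 1] > 0).astype(int)
X_pool = rng.normal(0.3, 1.0, size=(300, 2))                        # unlabeled target pool
y_oracle = (X_pool[:, 0] + 0.8 * X_pool[:, 1] > 0.2).astype(int)    # labels revealed on query

labeled_idx = []          # target points whose labels have been queried
budget = 20               # annotation budget
for _ in range(budget):
    # Fit on all source labels plus the target labels gathered so far.
    if labeled_idx:
        X_train = np.vstack([X_source, X_pool[labeled_idx]])
        y_train = np.concatenate([y_source, y_oracle[labeled_idx]])
    else:
        X_train, y_train = X_source, y_source
    clf = LogisticRegression().fit(X_train, y_train)

    # Uncertainty sampling: query the unqueried pool point closest to the decision boundary.
    margin = np.abs(clf.predict_proba(X_pool)[:, 1] - 0.5)
    margin[labeled_idx] = np.inf                                     # never re-query a point
    labeled_idx.append(int(np.argmin(margin)))

print("queried target labels:", len(labeled_idx))
print("accuracy on target pool:", clf.score(X_pool, y_oracle))

In this sketch the source labels simply augment the training set at every round; the paper's framework and error bound address how such transferred data should influence the learner, which is not captured by this naive pooling.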
Pages: 1085-1090
Number of pages: 6