NFLB Dropout: Improve Generalization Ability by Dropping out the Best - A Biologically Inspired Adaptive Dropout Method for Unsupervised Learning

Cited by: 0
Authors
Yin, Peijie [1 ]
Qi, Lu [2 ]
Xi, Xuanyang [2 ]
Zhang, Bo [1 ]
Qiao, Hong [2 ,3 ]
Affiliations
[1] Chinese Acad Sci, Acad Math & Syst Sci, Inst Appl Math, Beijing 100080, Peoples R China
[2] Chinese Acad Sci, Inst Automat, State Key Lab Management & Control Complex Syst, Beijing 100190, Peoples R China
[3] CAS Ctr Excellence Brain Sci & Intelligence Techn, Shanghai 200031, Peoples R China
Keywords
ATTENTION;
DOI
Not available
CLC Classification
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104; 0812; 0835; 1405;
Abstract
Generalization ability is widely acknowledged as one of the most important criteria for evaluating the quality of unsupervised models. The objective of our research is to find a better dropout method to improve the generalization ability of the convolutional deep belief network (CDBN), an unsupervised learning model for vision tasks. In this paper, the phenomenon of low feature diversity during training is investigated. The attention mechanism of the human visual system focuses on rare events and suppresses well-known facts. Inspired by this mechanism, No Feature Left Behind Dropout (NFLB Dropout), an adaptive dropout method, is proposed to automatically adjust the dropout rate feature-wise. During training iterations, the algorithm drops well-trained features and keeps poorly trained ones with high probability. In addition, we apply two approximations of feature quality, inspired by saliency theory and optimization theory. Experimental results show that, compared with a model trained with standard dropout, our NFLB Dropout method improves not only accuracy but also convergence speed.
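The adaptive rule the abstract describes (drop well-trained features with high probability, keep poorly trained ones) can be sketched as a feature-wise dropout mask. This is a minimal illustration only: the mean-activation quality proxy, the linear rate mapping, and the name `nflb_dropout_mask` are assumptions for illustration, not the paper's actual formulas or saliency-based approximations.

```python
import numpy as np

def nflb_dropout_mask(activations, base_rate=0.5, rng=None):
    """Feature-wise adaptive dropout sketch.

    Features with a higher quality score (here approximated by mean
    activation over the batch, an illustrative assumption) are dropped
    with higher probability; the lowest-quality feature is always kept.
    """
    rng = np.random.default_rng() if rng is None else rng
    # activations: (batch, n_features) array of feature responses
    quality = activations.mean(axis=0)                        # per-feature quality proxy
    q = (quality - quality.min()) / (np.ptp(quality) + 1e-8)  # normalize to [0, 1]
    drop_prob = base_rate * q                                 # well-trained -> higher drop rate
    keep = rng.random(quality.shape) >= drop_prob             # sample one mask entry per feature
    return activations * keep                                 # zero out dropped feature columns
```

Contrast with standard dropout, which would use a single fixed `drop_prob` for every feature regardless of how well trained it is.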
Pages: 1180-1186
Page count: 7