Zero-Shot Text Recognition Combining Transfer Guide and Bidirectional Cycle Structure GAN

Cited by: 0
Authors:
Zhang G. [1 ]
Long B. [1 ]
Lu F. [1 ]
Institutions:
[1] Institute of Computer Vision, Nanchang Hangkong University, Nanchang
Fund:
National Natural Science Foundation of China
Keywords:
Bidirectional Cycle Structure; Text Recognition; Transfer Guide; Zero-Shot Learning;
DOI: 10.16451/j.cnki.issn1003-6059.202012003
Abstract:
To improve the recognition accuracy of zero-shot recognition methods based on generative adversarial networks (GANs), a zero-shot text recognition method combining transfer guidance and a bidirectional cycle structure GAN is proposed. The bidirectional cycle structure GAN is constructed to strengthen the generative ability of the model, so that the generated pseudo features are closer to the real features of the input. The concept of transfer-guided learning is introduced, and the model is trained on transfer text instead of seen text to improve recognition accuracy on unseen text. By adding an effective regularization term, the generator produces diverse results during training, improving the stability of the generative model. Experiments show that the proposed method improves the accuracy of zero-shot recognition tasks, generalizes well, and can easily be extended to other applications. © 2020, Science Press. All rights reserved.
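As a rough illustration of the ideas summarized in the abstract (not the authors' implementation, whose exact formulation is not given here), the sketch below pairs a forward generator mapping class semantics plus noise to pseudo visual features with a reverse mapping back to semantics, forming a cycle-consistency term, and adds a mode-seeking diversity regularizer so different noise vectors yield different features. The network names (G, R), dimensions, and loss weights are illustrative assumptions.

```python
# Minimal sketch, assuming a semantics <-> visual-feature cycle and a
# mode-seeking diversity regularizer; not the paper's actual code.
import torch
import torch.nn as nn

class MLP(nn.Module):
    def __init__(self, in_dim, out_dim, hidden=512):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, out_dim),
        )
    def forward(self, x):
        return self.net(x)

sem_dim, feat_dim, noise_dim = 300, 2048, 64   # illustrative sizes
G = MLP(sem_dim + noise_dim, feat_dim)          # semantics + noise -> pseudo feature
R = MLP(feat_dim, sem_dim)                      # reverse mapping: feature -> semantics

def cycle_and_diversity_loss(sem, lam_cyc=1.0, lam_div=0.1):
    """Cycle-consistency term plus a diversity (mode-seeking) regularizer."""
    z1 = torch.randn(sem.size(0), noise_dim)
    z2 = torch.randn(sem.size(0), noise_dim)
    f1 = G(torch.cat([sem, z1], dim=1))
    f2 = G(torch.cat([sem, z2], dim=1))

    # Cycle: semantics reconstructed from the pseudo feature should match the input.
    cyc = nn.functional.l1_loss(R(f1), sem)

    # Diversity: minimizing the noise-distance / feature-distance ratio pushes
    # distinct noise vectors toward distinct generated features.
    div = (z1 - z2).abs().mean() / ((f1 - f2).abs().mean() + 1e-8)
    return lam_cyc * cyc + lam_div * div

sem = torch.randn(8, sem_dim)   # toy batch of class semantic vectors
loss = cycle_and_diversity_loss(sem)
loss.backward()
```

In a full GAN setup these terms would be added to the usual adversarial losses; they are isolated here only to show how the cycle and diversity components fit together.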
Pages: 1083-1096
Number of pages: 13