Text Classification with Transformers and Reformers for Deep Text Data

Cited by: 0
Authors:
Soleymani, Roghayeh [1]
Farret, Jeremie [1]
Affiliations:
[1] Inmind Technol Inc, Montreal, PQ, Canada
Keywords:
Natural language processing; Text classification; Transformers; Reformers; Trax; Mind in a box
DOI: not available
CLC number: TP18 [Artificial Intelligence Theory]
Discipline codes: 081104; 0812; 0835; 1405
Abstract
In this paper, we present an experimental analysis of Transformers and Reformers for text classification applications in natural language processing. Transformers and Reformers yield state-of-the-art performance and use attention scores to capture the relationships between words in a sentence, which can be computed in parallel on GPU clusters. Reformers improve on Transformers by lowering time and memory complexity. We present our evaluation and analysis of the applicable architectures for such improved performance. The experiments in this paper are carried out in Trax on Mind in a Box with three different datasets and under different hyperparameter tunings. We observe that Transformers achieve better performance than Reformers in terms of accuracy and training speed for text classification. However, Reformers allow training bigger models that cause memory failures for Transformers.
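Since the abstract describes training Transformer classifiers in Trax, the sketch below illustrates how such a text classifier could be set up with Trax's standard data pipeline and training loop. It is a minimal illustration only, not the paper's code: the dataset (imdb_reviews from TensorFlow Datasets), the vocabulary file, and all hyperparameters are assumptions for the example, as the abstract does not name the three datasets or the settings used, and the Mind in a Box environment is not reproduced here.

    # Minimal Trax text-classification sketch (illustrative; not the paper's code).
    # Assumptions: trax is installed, 'imdb_reviews' stands in for the paper's datasets,
    # and all hyperparameters below are example values only.
    import trax
    from trax import layers as tl
    from trax.supervised import training

    # Streams of (text, label) pairs from TensorFlow Datasets.
    train_stream = trax.data.TFDS('imdb_reviews', keys=('text', 'label'), train=True)()
    eval_stream = trax.data.TFDS('imdb_reviews', keys=('text', 'label'), train=False)()

    # Tokenize, shuffle, bucket by length, and attach per-example loss weights.
    data_pipeline = trax.data.Serial(
        trax.data.Tokenize(vocab_file='en_8k.subword', keys=[0]),
        trax.data.Shuffle(),
        trax.data.FilterByLength(max_length=2048, length_keys=[0]),
        trax.data.BucketByLength(boundaries=[32, 128, 512, 2048],
                                 batch_sizes=[64, 32, 16, 8, 4],
                                 length_keys=[0]),
        trax.data.AddLossWeights(),
    )
    train_batches = data_pipeline(train_stream)
    eval_batches = data_pipeline(eval_stream)

    # A Transformer encoder that maps token ids to class scores.
    model = trax.models.TransformerEncoder(
        vocab_size=8192, n_classes=2, d_model=512, d_ff=2048,
        n_layers=6, n_heads=8, max_len=2048, mode='train')

    # Standard Trax training loop: cross-entropy loss, Adam, periodic evaluation.
    train_task = training.TrainTask(
        labeled_data=train_batches,
        loss_layer=tl.WeightedCategoryCrossEntropy(),
        optimizer=trax.optimizers.Adam(0.01),
        n_steps_per_checkpoint=500,
    )
    eval_task = training.EvalTask(
        labeled_data=eval_batches,
        metrics=[tl.WeightedCategoryCrossEntropy(), tl.WeightedCategoryAccuracy()],
        n_eval_batches=20,
    )
    loop = training.Loop(model, train_task, eval_tasks=[eval_task],
                         output_dir='./transformer_classifier')
    loop.run(2000)

A Reformer-based comparison would swap the encoder for a model built from Trax's LSH-attention and reversible layers, which is what allows the larger models mentioned in the abstract to fit in memory; that variant is omitted here because the abstract does not specify the exact configuration used.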
Pages: 239-243
Page count: 5