Automatic Text Summarization Based on Transformer and Switchable Normalization

Cited by: 1
Authors
Luo, Tao [1 ,2 ]
Guo, Kun [1 ,2 ,3 ]
Guo, Hong [1 ]
Affiliations
[1] Fuzhou Univ, Coll Math & Comp Sci, Fuzhou, Peoples R China
[2] Fuzhou Univ, Fujian Prov Key Lab Network Comp & Intelligent In, Fuzhou, Peoples R China
[3] Fuzhou Univ, Key Lab Spatial Data Min & Informat Sharing, Minist Educ, Fuzhou, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
text summarization; Encoder-Decoder model; attention; transformer; switchable normalization;
DOI
10.1109/ISPA-BDCloud-SustainCom-SocialCom48970.2019.00236
Chinese Library Classification
TP3 [Computing technology, computer technology];
Discipline code
0812 ;
Abstract
With the development of text summarization research, methods based on RNNs with the Encoder-Decoder model have gradually become the mainstream. However, an RNN tends to forget previous context information, leading to a loss of original information that reduces the accuracy of the generated summaries. The Transformer model uses the self-attention mechanism to encode and decode historical information, so it can outperform RNNs in learning context information. In this paper, a text summarization model based on the Transformer and switchable normalization is proposed. The accuracy of the model is improved by optimizing the normalization layer. Compared with other models, the new model has a clear advantage in understanding words' semantics and associations. Experimental results on the English Gigaword dataset show that the proposed model achieves high ROUGE scores and produces more readable summaries.
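The abstract's key idea is replacing the Transformer's fixed normalization layer with switchable normalization. As a minimal sketch (not the authors' code), the standard switchable-normalization formulation mixes batch, layer, and instance statistics with learned softmax weights; the function names, shapes, and NumPy implementation below are illustrative assumptions:

```python
import numpy as np

def softmax(w):
    """Numerically stable softmax over a 1-D weight vector."""
    e = np.exp(w - np.max(w))
    return e / e.sum()

def switchable_norm(x, mean_w, var_w, eps=1e-5):
    """Switchable normalization sketch for x of shape (batch, seq_len, features).

    mean_w, var_w: length-3 logits weighting [batch, layer, instance] statistics.
    """
    # Layer-norm statistics: over the feature dimension, per token.
    mu_ln, var_ln = x.mean(axis=2, keepdims=True), x.var(axis=2, keepdims=True)
    # Batch-norm statistics: over batch and sequence, per feature.
    mu_bn, var_bn = x.mean(axis=(0, 1), keepdims=True), x.var(axis=(0, 1), keepdims=True)
    # Instance-norm statistics: over the sequence dimension, per sample and feature.
    mu_in, var_in = x.mean(axis=1, keepdims=True), x.var(axis=1, keepdims=True)

    wm, wv = softmax(mean_w), softmax(var_w)
    mu = wm[0] * mu_bn + wm[1] * mu_ln + wm[2] * mu_in
    var = wv[0] * var_bn + wv[1] * var_ln + wv[2] * var_in
    return (x - mu) / np.sqrt(var + eps)
```

In training, the mixing logits would be learned jointly with the model, letting each normalization layer select whichever statistics suit its position in the network; with the layer-norm weight dominant, the layer reduces to the Transformer's usual layer normalization.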
Pages: 1606 - 1611 (6 pages)