Optimization of the Abstract Text Summarization Model Based on Multi-Task Learning

Cited by: 0
Authors
Yao, Ben [1 ]
Ding, Gejian [1 ]
Affiliations
[1] Zhejiang Normal Univ, Sch Comp Sci & Technol, Jinhua, Zhejiang, Peoples R China
Keywords
Abstract text summarization; BART; Multi-task learning; Factual consistency
DOI
10.1145/3650400.3650469
CLC number
TP [Automation technology; computer technology]
Discipline code
0812
Abstract
Automatic text summarization is a crucial technology in natural language processing; it addresses the demand to process large volumes of text by effectively extracting key information, thereby improving task efficiency. With the rise of pretrained language models, abstract text summarization has progressively become mainstream, producing fluent summaries that capture the core content. Nonetheless, abstractively generated summaries inevitably face problems of inconsistency with the original text. This paper introduces a sequence tagging task to achieve multi-task learning for abstract text summarization models. For this sequence tagging task, we carefully designed annotated datasets at both the entity and the sentence level based on an analysis of the XSum dataset, aiming to enhance the factual consistency of generated summaries. Experimental results demonstrate that the optimized BART model yields favorable performance on the ROUGE and FactCC metrics.
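To make the multi-task setup described above concrete, the Python sketch below (using PyTorch and Hugging Face Transformers) shows one plausible way to combine the two objectives: a BART summarizer shares its encoder with an auxiliary token-tagging head, and the summarization and tagging losses are summed with a weighting factor. The tag set size, the single linear tag head, and the loss weight are illustrative assumptions for this sketch, not the authors' released implementation.

import torch.nn as nn
import torch.nn.functional as F
from transformers import BartForConditionalGeneration

class MultiTaskBart(nn.Module):
    """BART summarizer with an auxiliary sequence-tagging head on the shared encoder (illustrative sketch)."""

    def __init__(self, model_name="facebook/bart-base", num_tags=3, tag_loss_weight=0.5):
        super().__init__()
        self.bart = BartForConditionalGeneration.from_pretrained(model_name)
        # Token-level classifier over encoder states; num_tags and the linear
        # layer are assumptions made for this sketch.
        self.tag_head = nn.Linear(self.bart.config.d_model, num_tags)
        self.tag_loss_weight = tag_loss_weight

    def forward(self, input_ids, attention_mask, summary_labels, tag_labels):
        # Run the shared encoder once and reuse its states for both tasks.
        encoder_out = self.bart.get_encoder()(
            input_ids=input_ids, attention_mask=attention_mask
        )
        # Task 1: standard sequence-to-sequence summarization loss.
        seq2seq = self.bart(
            attention_mask=attention_mask,
            encoder_outputs=encoder_out,
            labels=summary_labels,
        )
        # Task 2: sequence tagging over source tokens (e.g. entity- and
        # sentence-level factuality labels); positions labeled -100 are ignored.
        tag_logits = self.tag_head(encoder_out.last_hidden_state)
        tag_loss = F.cross_entropy(
            tag_logits.view(-1, tag_logits.size(-1)),
            tag_labels.view(-1),
            ignore_index=-100,
        )
        # Joint objective: summarization loss plus weighted tagging loss.
        return seq2seq.loss + self.tag_loss_weight * tag_loss

At inference time only the generation path would be used (self.bart.generate(...)); the tagging head serves purely as a training-time signal intended to push the shared encoder toward factually grounded representations.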
Pages: 424-428
Number of pages: 5