Optimization of the Abstract Text Summarization Model Based on Multi-Task Learning

Cited by: 0
Authors
Yao, Ben [1 ]
Ding, Gejian [1 ]
Affiliations
[1] Zhejiang Normal Univ, Sch Comp Sci & Technol, Jinhua, Zhejiang, Peoples R China
Keywords
Abstract text summarization; BART; Multi-task learning; Factual consistency;
DOI
10.1145/3650400.3650469
CLC Number
TP [Automation Technology, Computer Technology];
Subject Classification Code
0812;
Abstract
Automatic text summarization is a crucial component of natural language processing, used to meet the demand for processing large volumes of text by extracting key information effectively and thereby improving task efficiency. With the rise of pretrained language models, abstract text summarization has progressively become mainstream, producing fluent summaries that encapsulate the core content. Nonetheless, abstract text summarization inevitably suffers from inconsistencies with the original text. This paper introduces a sequence tagging task to achieve multi-task learning for abstract text summarization models. For this sequence tagging task, we designed annotated datasets at both the entity and sentence levels based on an analysis of the XSum dataset, aiming to enhance the factual consistency of generated summaries. Experimental results demonstrate that the optimized BART model performs favorably on the ROUGE and FactCC metrics.
Pages: 424-428
Number of pages: 5
Related Papers
50 records in total
  • [1] A Multi-Task Learning Framework for Abstractive Text Summarization
    Lu, Yao
    Liu, Linqing
    Jiang, Zhile
    Yang, Min
    Goebel, Randy
Thirty-Third AAAI Conference on Artificial Intelligence (AAAI-19), 2019: 9987-9988
  • [2] A Dialogues Summarization Algorithm Based on Multi-task Learning
    Chen, Haowei
    Li, Chen
    Liang, Jiajing
    Tian, Lihua
Neural Processing Letters, 2024, 56 (03)
  • [3] Long Text Summarization and Key Information Extraction in a Multi-Task Learning Framework
    Lu, M.
    Chen, R.
    Applied Mathematics and Nonlinear Sciences, 2024, 9 (01)
  • [4] Multi-task learning for abstractive text summarization with key information guide network
    Xu, Weiran
    Li, Chenliang
    Lee, Minghao
    Zhang, Chi
    EURASIP Journal on Advances in Signal Processing, 2020, 2020 (01)
  • [5] A Multi-task Text Classification Model Based on Label Embedding Learning
    Xu, Yuemei
    Fan, Zuwei
    Cao, Han
    Cyber Security, CNCERT 2021, 2022, 1506: 211-225
  • [6] Multi-Task Learning for Abstractive and Extractive Summarization
    Chen, Yangbin
    Ma, Yun
    Mao, Xudong
    Li, Qing
    Data Science and Engineering, 2019, 4 (01): 14-23
  • [7] Power text information extraction based on multi-task learning
    Ji, Xin
    Wu, Tongxin
    Yu, Ting
    Dong, Linxiao
    Chen, Yiting
    Mi, Na
    Zhao, Jiakui
    Beijing Hangkong Hangtian Daxue Xuebao/Journal of Beijing University of Aeronautics and Astronautics, 2024, 50 (08): 2461-2469
  • [8] Task-Aware Dynamic Model Optimization for Multi-Task Learning
    Choi, Sujin
    Jin, Hyundong
    Kim, Eunwoo
    IEEE Access, 2023, 11: 137709-137717