Optimization of the Abstract Text Summarization Model Based on Multi-Task Learning

Cited by: 0
Authors
Yao, Ben [1]
Ding, Gejian [1]
Affiliations
[1] Zhejiang Normal Univ, Sch Comp Sci & Technol, Jinhua, Zhejiang, Peoples R China
Keywords
Abstract text summarization; BART; Multi-task learning; Factual consistency
DOI
10.1145/3650400.3650469
CLC Classification
TP [Automation Technology, Computer Technology]
Discipline Code
0812
Abstract
Automatic text summarization is a crucial technology in natural language processing, used to meet the demands of processing large volumes of text by extracting key information and thereby improving task efficiency. With the rise of pretrained language models, abstract text summarization has progressively become mainstream, producing fluent summaries that capture core content. Nonetheless, abstract text summarization inevitably faces the problem of inconsistency with the original text. This paper introduces a sequence tagging task to achieve multi-task learning for abstract text summarization models. For this sequence tagging task, we designed annotated datasets at both the entity and sentence levels based on an analysis of the XSum dataset, aiming to enhance the factual consistency of generated summaries. Experimental results demonstrate that the optimized BART model performs favorably on the ROUGE and FactCC metrics.
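The record does not include the authors' implementation. As a rough illustration of the kind of multi-task setup the abstract describes, the following Python sketch couples BART's standard summarization loss with an auxiliary token-level tagging head over the shared encoder states. The binary tag set, linear head design, and fixed loss weight are assumptions made for illustration, not the paper's published method.

# A minimal sketch of joint summarization + sequence-tagging fine-tuning.
# Assumptions (not from the paper): binary token tags derived from
# entity-level annotations, a linear tagging head on the encoder, and a
# fixed weight combining the two losses.
import torch.nn as nn
from transformers import BartForConditionalGeneration

class MultiTaskBart(nn.Module):
    def __init__(self, model_name="facebook/bart-base",
                 num_tags=2, tag_loss_weight=0.5):
        super().__init__()
        self.bart = BartForConditionalGeneration.from_pretrained(model_name)
        # Auxiliary sequence-tagging head over encoder hidden states.
        self.tag_head = nn.Linear(self.bart.config.d_model, num_tags)
        self.tag_loss_weight = tag_loss_weight

    def forward(self, input_ids, attention_mask, labels, tag_labels):
        # Standard seq2seq summarization loss computed by BART.
        out = self.bart(input_ids=input_ids,
                        attention_mask=attention_mask,
                        labels=labels)
        # Token-level tagging loss on the shared encoder representation;
        # positions labelled -100 (e.g. padding) are ignored.
        tag_logits = self.tag_head(out.encoder_last_hidden_state)
        tag_loss = nn.functional.cross_entropy(
            tag_logits.view(-1, tag_logits.size(-1)),
            tag_labels.view(-1), ignore_index=-100)
        # Joint objective: summarization loss plus weighted tagging loss.
        return out.loss + self.tag_loss_weight * tag_loss

In this sketch the tagging task shares only the encoder with summarization; the tag_labels tensor would be built from the entity- and sentence-level annotations the abstract describes, so that the auxiliary signal pushes the encoder toward factually grounded representations.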
Pages: 424-428 (5 pages)