Unifying Cross-lingual Summarization and Machine Translation with Compression Rate

Cited by: 5
|
Authors
Bai, Yu [1 ,2 ]
Huang, Heyan [1 ,3 ]
Fan, Kai [4 ]
Gao, Yang [1 ]
Zhu, Yiming [1 ]
Zhan, Jiaao [1 ]
Chi, Zewen [1 ]
Chen, Boxing [4 ]
Affiliations
[1] Beijing Inst Technol, Sch Comp Sci, Beijing, Peoples R China
[2] Beijing Engn Res Ctr High Volume Language Informa, Beijing, Peoples R China
[3] Southeast Acad Informat Technol, Putian, Fujian, Peoples R China
[4] Alibaba DAMO Acad, Machine Intelligence Technol Lab, Hangzhou, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Cross-lingual Summarization; Machine Translation; Compression Rate;
DOI
10.1145/3477495.3532071
Chinese Library Classification
TP [Automation Technology, Computer Technology];
Subject Classification Code
0812;
Abstract
Cross-Lingual Summarization (CLS) is the task of extracting important information from a source document and producing a summary in another language. It is a challenging task that requires a system to understand, summarize, and translate at the same time, making it closely related to Monolingual Summarization (MS) and Machine Translation (MT). In practice, the training resources available for Machine Translation far exceed those for cross-lingual and monolingual summarization, so incorporating Machine Translation corpora into CLS training can benefit its performance. However, existing work leverages only a simple multi-task framework to bring Machine Translation in, without deeper exploration. In this paper, we propose a novel task, Cross-lingual Summarization with Compression rate (CSC), which improves Cross-Lingual Summarization by exploiting a large-scale Machine Translation corpus. By introducing the compression rate, the information ratio between the source and the target text, we regard the MT task as a special CLS task with a compression rate of 100%. Hence the two can be trained as a unified task, sharing knowledge more effectively. However, a large gap exists between the MT task and the CLS task: samples with compression rates between 30% and 90% are extremely rare. To bridge the two tasks smoothly, we propose an effective data augmentation method that produces document-summary pairs with different compression rates. The proposed method not only improves the performance of the CLS task but also provides controllability for generating summaries of desired lengths. Experiments demonstrate that our method outperforms various strong baselines on three cross-lingual summarization datasets. We release our code and data at https://github.com/ybai-nlp/CLS_CIR.
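To illustrate the compression-rate notion described in the abstract, a minimal sketch (not from the paper; it approximates the information ratio by a token-count ratio, which is an assumption — the paper's exact definition may differ):

```python
def compression_rate(source_tokens, target_tokens):
    """Ratio of target length to source length, a token-count proxy
    for the information ratio between target and source text."""
    return len(target_tokens) / len(source_tokens)

doc = "the quick brown fox jumps over the lazy dog near the bank".split()
summary = "fox jumps over dog".split()      # a CLS-style summary pair
translation = list(doc)                     # an MT pair keeps ~all content

print(f"CLS pair: {compression_rate(doc, summary):.0%}")      # 33%
print(f"MT pair:  {compression_rate(doc, translation):.0%}")  # 100%
```

Under this view, MT pairs sit at a compression rate of 100% and summarization pairs well below it, which is why the abstract describes intermediate rates (30%-90%) as the gap the data augmentation must fill.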
Pages: 1087-1097
Number of pages: 11
Related Papers
50 items total
  • [21] A Robust Abstractive System for Cross-Lingual Summarization
    Ouyang, Jessica
    Song, Boya
    McKeown, Kathleen
    2019 CONFERENCE OF THE NORTH AMERICAN CHAPTER OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS: HUMAN LANGUAGE TECHNOLOGIES (NAACL HLT 2019), VOL. 1, 2019, : 2025 - 2031
  • [22] CLIReval: Evaluating Machine Translation as a Cross-Lingual Information Retrieval Task
    Sun, Shuo
    Sia, Suzanna
    Duh, Kevin
    58TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2020): SYSTEM DEMONSTRATIONS, 2020, : 134 - 141
  • [23] Mixed-Lingual Pre-training for Cross-lingual Summarization
    Xu, Ruochen
    Zhu, Chenguang
    Shi, Yu
    Zeng, Michael
    Huang, Xuedong
    1ST CONFERENCE OF THE ASIA-PACIFIC CHAPTER OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS AND THE 10TH INTERNATIONAL JOINT CONFERENCE ON NATURAL LANGUAGE PROCESSING (AACL-IJCNLP 2020), 2020, : 536 - 541
  • [25] Cross-lingual Visual Pre-training for Multimodal Machine Translation
    Caglayan, Ozan
    Kuyu, Menekse
    Amac, Mustafa Sercan
    Madhyastha, Pranava
    Erdem, Erkut
    Erdem, Aykut
    Specia, Lucia
    16TH CONFERENCE OF THE EUROPEAN CHAPTER OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (EACL 2021), 2021, : 1317 - 1324
  • [26] Entity Projection via Machine-Translation for Cross-Lingual NER
    Jain, Alankar
    Paranjape, Bhargavi
    Lipton, Zachary C.
    2019 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING AND THE 9TH INTERNATIONAL JOINT CONFERENCE ON NATURAL LANGUAGE PROCESSING (EMNLP-IJCNLP 2019): PROCEEDINGS OF THE CONFERENCE, 2019, : 1083 - +
  • [27] A Low Cost Machine Translation Method for Cross-Lingual Information Retrieval
    Bracewell, David B.
    Ren, Fuji
    Kuroiwa, Shingo
    ENGINEERING LETTERS, 2008, 16 (01)
  • [28] Explicit Cross-lingual Pre-training for Unsupervised Machine Translation
    Ren, Shuo
    Wu, Yu
    Liu, Shujie
    Zhou, Ming
    Ma, Shuai
    2019 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING AND THE 9TH INTERNATIONAL JOINT CONFERENCE ON NATURAL LANGUAGE PROCESSING (EMNLP-IJCNLP 2019): PROCEEDINGS OF THE CONFERENCE, 2019, : 770 - 779
  • [29] Unsupervised Neural Machine Translation With Cross-Lingual Language Representation Agreement
    Sun, Haipeng
    Wang, Rui
    Chen, Kehai
    Utiyama, Masao
    Sumita, Eiichiro
    Zhao, Tiejun
    IEEE-ACM TRANSACTIONS ON AUDIO SPEECH AND LANGUAGE PROCESSING, 2020, 28 (28) : 1170 - 1182
  • [30] Reading Comprehension in Czech via Machine Translation and Cross-Lingual Transfer
    Mackova, Katerina
    Straka, Milan
    TEXT, SPEECH, AND DIALOGUE (TSD 2020), 2020, 12284 : 171 - 179