Controlling Translation Formality Using Pre-trained Multilingual Language Models

Cited: 0
Authors
Rippeth, Elijah [1]
Agrawal, Sweta [1]
Carpuat, Marine [1]
Affiliations
[1] Univ Maryland, Dept Comp Sci, College Pk, MD 20742 USA
Keywords
(none listed)
DOI
Not available
Chinese Library Classification (CLC)
TP18 [Theory of Artificial Intelligence]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
This paper describes the University of Maryland's submission to the Special Task on Formality Control for Spoken Language Translation at IWSLT, which evaluates translation from English into six languages with diverse grammatical formality markers. We investigate to what extent this problem can be addressed with a single multilingual model whose output is controlled simultaneously for target language and formality. Results show that this strategy can approach the translation quality and formality control achieved by dedicated translation models; however, the nature of the underlying pre-trained language model and of the finetuning samples greatly impacts results.
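This record includes no code, so the following is a minimal sketch of the control strategy the abstract describes: a single multilingual model steered at inference time by a target-language signal plus a formality tag prepended to the source. The choice of mBART-50, the [FORMAL]/[INFORMAL] tags, and the translate() helper are illustrative assumptions, not the authors' actual models or data; the tags would only take effect after finetuning on formality-annotated parallel text.

# Sketch: tag-based formality control on top of a pre-trained multilingual
# MT model. Assumptions (not from the paper): mBART-50 as the base model,
# and hypothetical [FORMAL]/[INFORMAL] control tokens.
from transformers import MBart50TokenizerFast, MBartForConditionalGeneration

MODEL_NAME = "facebook/mbart-large-50-many-to-many-mmt"

tokenizer = MBart50TokenizerFast.from_pretrained(MODEL_NAME)
model = MBartForConditionalGeneration.from_pretrained(MODEL_NAME)

# Register the formality tags as special tokens and grow the embedding
# table so each tag gets a trainable vector; these vectors are what a
# finetuning pass on formality-annotated bitext would learn.
tokenizer.add_special_tokens(
    {"additional_special_tokens": ["[FORMAL]", "[INFORMAL]"]}
)
model.resize_token_embeddings(len(tokenizer))

def translate(source: str, formality: str, tgt_lang: str) -> str:
    """Translate English `source`, requesting a register via a prepended
    control tag; `tgt_lang` is an mBART-50 code such as "de_DE"."""
    tokenizer.src_lang = "en_XX"
    tagged = f"[{formality.upper()}] {source}"
    inputs = tokenizer(tagged, return_tensors="pt")
    generated = model.generate(
        **inputs,
        # mBART-50 selects the output language by forcing that language's
        # code as the first decoder token.
        forced_bos_token_id=tokenizer.lang_code_to_id[tgt_lang],
        max_new_tokens=128,
    )
    return tokenizer.batch_decode(generated, skip_special_tokens=True)[0]

print(translate("How are you doing?", "formal", "de_DE"))
print(translate("How are you doing?", "informal", "de_DE"))

Under this scheme the target language is selected through the forced decoder-start language code, while formality rides on an ordinary learned token, which is how one model can serve every language-formality combination.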
Pages: 327-340
Page count: 14
Related Papers (50 in total)
  • [1] Multilingual Translation via Grafting Pre-trained Language Models
    Sun, Zewei
    Wang, Mingxuan
    Li, Lei
    Findings of the Association for Computational Linguistics: EMNLP 2021, 2021: 2735-2747
  • [2] How Linguistically Fair Are Multilingual Pre-Trained Language Models?
    Choudhury, Monojit
    Deshpande, Amit
    Proceedings of the Thirty-Fifth AAAI Conference on Artificial Intelligence (AAAI-21), 2021, 35: 12710-12718
  • [3] Recipes for Adapting Pre-trained Monolingual and Multilingual Models to Machine Translation
    Stickland, Asa Cooper
    Li, Xian
    Ghazvininejad, Marjan
    Proceedings of the 16th Conference of the European Chapter of the Association for Computational Linguistics (EACL 2021), 2021: 3440-3453
  • [4] Pre-Trained Language-Meaning Models for Multilingual Parsing and Generation
    Wang, Chunliu
    Lai, Huiyuan
    Nissim, Malvina
    Bos, Johan
    Findings of the Association for Computational Linguistics: ACL 2023, 2023: 5586-5600
  • [5] On the Language Neutrality of Pre-trained Multilingual Representations
    Libovicky, Jindrich
    Rosa, Rudolf
    Fraser, Alexander
    Findings of the Association for Computational Linguistics: EMNLP 2020, 2020: 1663-1674
  • [6] Pre-Trained Multilingual Sequence-to-Sequence Models: A Hope for Low-Resource Language Translation?
    Lee, En-Shiun Annie
    Thillainathan, Sarubi
    Nayak, Shravan
    Ranathunga, Surangika
    Adelani, David Ifeoluwa
    Su, Ruisi
    McCarthy, Arya D.
    Findings of the Association for Computational Linguistics: ACL 2022, 2022: 58-67
  • [7] On the Multilingual Ability of Decoder-based Pre-trained Language Models: Finding and Controlling Language-Specific Neurons
    Kojima, Takeshi
    Okimura, Itsuki
    Iwasawa, Yusuke
    Yanaka, Hitomi
    Matsuo, Yutaka
    arXiv preprint, 2024
  • [8] Probing language identity encoded in pre-trained multilingual models: a typological view
    Zheng, Jianyu
    Liu, Ying
    PeerJ Computer Science, 2022, 8
  • [9] Emotional Paraphrasing Using Pre-trained Language Models
    Casas, Jacky
    Torche, Samuel
    Daher, Karl
    Mugellini, Elena
    Abou Khaled, Omar
    2021 9th International Conference on Affective Computing and Intelligent Interaction Workshops and Demos (ACIIW), 2021