Rethinking Translation Memory Augmented Neural Machine Translation

Cited by: 0
Authors
Hao, Hongkun [1 ,2 ]
Huang, Guoping [2 ]
Liu, Lemao [2 ]
Zhang, Zhirui [2 ]
Shi, Shuming [2 ]
Wang, Rui [1 ]
Affiliations
[1] Shanghai Jiao Tong Univ, Shanghai, Peoples R China
[2] Tencent AI Lab, Shenzhen, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
DOI
Not available
CLC Number
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
This paper rethinks translation memory augmented neural machine translation (TM-augmented NMT) from two perspectives, namely a probabilistic view of retrieval and the variance-bias decomposition principle. The analysis shows that TM-augmented NMT is better at fitting the data (i.e., lower bias) but is more sensitive to fluctuations in the training data (i.e., higher variance). This explains a recently reported contradictory phenomenon on the same translation task: TM-augmented NMT substantially outperforms vanilla NMT in the high-resource scenario, yet underperforms it in the low-resource scenario. We then propose a simple yet effective TM-augmented NMT model that addresses the variance issue and resolves this contradiction. Extensive experiments show that the proposed model achieves consistent gains over both conventional NMT and existing TM-augmented NMT in the two scenarios where variance matters most (low-resource and plug-and-play) as well as in the high-resource scenario.
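As background for the two perspectives named in the abstract, a minimal sketch in generic notation (the symbols f_D, D, x, y, and z below are illustrative placeholders, not taken from the paper): the standard bias-variance identity for a predictor f_D trained on a random dataset D, and the probabilistic formulation commonly used for retrieval-augmented translation, in which z ranges over retrieved translation-memory sentence pairs.

% Bias-variance decomposition for a fixed target y (noise term omitted);
% stated only as standard background, not as the paper's derivation.
\[
\mathbb{E}_{D}\!\left[(f_{D}(x)-y)^{2}\right]
  = \underbrace{\left(\mathbb{E}_{D}[f_{D}(x)]-y\right)^{2}}_{\text{bias}^{2}}
  \;+\; \underbrace{\mathbb{E}_{D}\!\left[\left(f_{D}(x)-\mathbb{E}_{D}[f_{D}(x)]\right)^{2}\right]}_{\text{variance}}
\]
% Probabilistic view of retrieval as commonly written in the
% retrieval-augmented NMT literature: the translation distribution is
% marginalized over retrieved TM pairs z.
\[
p(y \mid x) \;=\; \sum_{z} p(z \mid x)\, p(y \mid x, z)
\]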
Pages: 2589-2605
Number of pages: 17