An empirical study of low-resource neural machine translation of Manipuri in multilingual settings

Cited by: 0
Authors
Singh, Salam Michael [1 ]
Singh, Thoudam Doren [1 ]
Affiliations
[1] Department of Computer Science and Engineering, National Institute of Technology Silchar, Silchar, Assam 788010, India
Keywords
Computational linguistics; Computer-aided language translation; Long short-term memory
DOI
Not available
Abstract
Machine translation requires a large amount of parallel data to reach production-level translation quality. This is one of the significant factors behind the lack of machine translation systems for most spoken/written languages. Manipuri is one such low-resource Indian language, with very little digital textual data available. In this work, we address low-resource neural machine translation between Manipuri and English using other Indian languages in a multilingual setup. We train an LSTM-based many-to-many multilingual neural machine translation system infused with cross-lingual features. Experimental results show that our method improves over the vanilla many-to-many multilingual and bilingual baselines on both the Manipuri-to-English and English-to-Manipuri translation tasks. Furthermore, our method also improves over the vanilla many-to-many multilingual system on translation between all the other Indian languages and English. We also examine the generalizability of our multilingual model by evaluating zero-shot translation among language pairs that have no direct link in the training data, comparing it against pivot-based translation. © 2022, The Author(s), under exclusive licence to Springer-Verlag London Ltd., part of Springer Nature.
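A many-to-many multilingual setup of the kind the abstract describes is commonly implemented by prepending a target-language token to each source sentence, so that a single model learns all translation directions, including zero-shot pairs never seen paired in training. The sketch below is a hypothetical illustration of that data-preparation convention, not the paper's actual code; the tag format `<2xx>` and the language codes (`en`, `mni`, `hi`) are assumptions for the example.

```python
def tag_pair(src_sentence: str, tgt_lang: str) -> str:
    """Prepend a target-language token (e.g. '<2en>') to the source side,
    the usual trick for training one model on many translation directions."""
    return f"<2{tgt_lang}> {src_sentence}"

# Illustrative directions from a Manipuri/English/other-Indian-language setup
# (placeholder sentences; 'mni' = Manipuri, 'hi' = Hindi are assumed codes):
pairs = [
    ("a Manipuri sentence here", "en"),   # Manipuri -> English
    ("I am going home", "mni"),           # English  -> Manipuri
    ("a Hindi sentence here", "en"),      # Hindi    -> English
]

tagged = [tag_pair(src, tgt) for src, tgt in pairs]
for line in tagged:
    print(line)
```

Because the model only sees the target-language tag, a direction absent from training (say Manipuri to Hindi) can still be requested at inference time, which is what the zero-shot evaluation in the abstract exercises.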
Pages: 14823-14844
Related papers (50 total)
  • [1] An empirical study of low-resource neural machine translation of Manipuri in multilingual settings
    Singh, Salam Michael; Singh, Thoudam Doren
    Neural Computing and Applications, 2022, 34(17): 14823-14844
  • [2] An Analysis of Massively Multilingual Neural Machine Translation for Low-Resource Languages
    Mueller, Aaron; Nicolai, Garrett; McCarthy, Arya D.; Lewis, Dylan; Wu, Winston; Yarowsky, David
    Proceedings of the 12th International Conference on Language Resources and Evaluation (LREC 2020), 2020: 3710-3718
  • [3] Incremental Domain Adaptation for Neural Machine Translation in Low-Resource Settings
    Kalimuthu, Marimuthu; Barz, Michael; Sonntag, Daniel
    Fourth Arabic Natural Language Processing Workshop (WANLP 2019), 2019: 1-10
  • [4] The Low-Resource Double Bind: An Empirical Study of Pruning for Low-Resource Machine Translation
    Ahia, Orevaoghene; Kreutzer, Julia; Hooker, Sara
    Findings of the Association for Computational Linguistics: EMNLP 2021, 2021: 3316-3333
  • [5] Extremely Low-resource Multilingual Neural Machine Translation for Indic Mizo Language
    Lalrempuii, C.; Soni, B.
    International Journal of Information Technology, 2023, 15(8): 4275-4282
  • [6] Multilingual neural machine translation for low-resource languages by twinning important nodes
    Qorbani, Abouzar; Ramezani, Reza; Baraani, Ahmad; Kazemi, Arefeh
    Neurocomputing, 2025, 634
  • [7] A Survey on Low-Resource Neural Machine Translation
    Wang, Rui; Tan, Xu; Luo, Renqian; Qin, Tao; Liu, Tie-Yan
    Proceedings of the Thirtieth International Joint Conference on Artificial Intelligence (IJCAI 2021), 2021: 4636-4643
  • [8] A Survey on Low-resource Neural Machine Translation
    Li, H.-Z.; Feng, C.; Huang, H.-Y.
    Science Press, (47): 1217-1231
  • [9] Transformers for Low-resource Neural Machine Translation
    Gezmu, Andargachew Mekonnen; Nuernberger, Andreas
    ICAART: Proceedings of the 14th International Conference on Agents and Artificial Intelligence, Vol 1, 2022: 459-466