The LGE-Transformer Method for Low-Resource Neural Machine Translation Based on Gated Dynamic Encoding Optimization

Citations: 0
Authors
Xu, Zhizhan [1 ]
Zhan, Siqi [2 ]
Yang, Wei [1 ]
Xie, Qianglai [1 ]
Affiliations
[1] Jiangxi Univ Technol, Collaborat Innovat Ctr, Big Data Lab, Nanchang, Peoples R China
[2] Jiangxi Univ Technol, Coll Informat Engn, Nanchang, Peoples R China
Source
IEEE ACCESS | 2024, Vol. 12
Keywords
Transformers; Logic gates; Encoding; Decoding; Neural machine translation; Context modeling; Feature extraction; Vectors; Semantics; Linguistics; Natural language processing; Pretraining language model; GRU gating; Chinese-Malay machine translation; low resource; deep learning
DOI
10.1109/ACCESS.2024.3488186
Chinese Library Classification: TP [Automation Technology, Computer Technology]
Discipline Code: 0812
Abstract
In current Neural Machine Translation (NMT) research, translating low-resource language pairs remains a significant challenge. This work proposes an LGE-Transformer method for Chinese-Malay neural machine translation based on Gated Dynamic Encoding Optimization. By introducing the linguistically enhanced LERT pre-trained model, the Transformer encoder is reconstructed and combined with a gated dynamic encoding module, which effectively fuses features from different encoder layers and enhances the model's representation capability for low-resource language pairs. On the encoder side, the proposed method adaptively fuses the outputs of all encoder layers through the gated dynamic encoding module, allowing the model to fully exploit feature information from every layer and thereby improving translation accuracy and fluency. On the decoder side, we introduce a hybrid cross-attention module that further strengthens the model's attention to contextual information, improving the semantic accuracy of the translation results. Experimental results on the Chinese-Malay low-resource translation task demonstrate that the proposed LGE-Transformer significantly outperforms the baseline and other comparison models in terms of BLEU score, validating the effectiveness and superiority of the gated dynamic encoding optimization-based neural machine translation method for low-resource language pairs.
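The abstract describes the gated dynamic encoding module as an adaptive, gated fusion of the outputs of all encoder layers. The NumPy sketch below illustrates one plausible form of such position-wise gated fusion; the softmax-gate parameterization, function names, and shapes are illustrative assumptions, not the paper's exact formulation:

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax along the given axis."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def gated_dynamic_fusion(layer_outputs, W_g, b_g):
    """Fuse per-layer encoder outputs with position-wise gates (illustrative).

    layer_outputs: list of L arrays, each (seq_len, d_model),
                   one per Transformer encoder layer.
    W_g: (L * d_model, L) hypothetical gate projection matrix.
    b_g: (L,) gate bias.
    Returns the fused representation, shape (seq_len, d_model).
    """
    H = np.stack(layer_outputs, axis=0)                # (L, seq, d)
    L, seq, d = H.shape
    # Concatenate all layers' features at each position, then
    # project to one gate weight per layer and normalize.
    concat = H.transpose(1, 0, 2).reshape(seq, L * d)  # (seq, L*d)
    gates = softmax(concat @ W_g + b_g, axis=-1)       # (seq, L)
    # Convex combination of the layer outputs at each position.
    return np.einsum('sl,lsd->sd', gates, H)           # (seq, d)
```

Because the gates are a softmax, the fused vector at each position is a convex combination of that position's per-layer features, so every element stays within the per-layer min/max range; the gate weights themselves vary by position, which is what makes the fusion "dynamic".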
Pages: 162861-162869
Page count: 9