50 records in total
- [23] Syntax-Directed Attention for Neural Machine Translation. Proceedings of the Thirty-Second AAAI Conference on Artificial Intelligence (AAAI-18), 2018: 4792-4799.
- [24] Dynamic Attention Aggregation with BERT for Neural Machine Translation. 2020 International Joint Conference on Neural Networks (IJCNN), 2020.
- [25] Synchronous Syntactic Attention for Transformer Neural Machine Translation. Proceedings of the ACL-IJCNLP 2021 Student Research Workshop, 2021: 348-355.
- [28] Measuring and Improving Faithfulness of Attention in Neural Machine Translation. Proceedings of the 16th Conference of the European Chapter of the Association for Computational Linguistics (EACL 2021), 2021: 2791-2802.
- [29] Distill, Adapt, Distill: Training Small, In-Domain Models for Neural Machine Translation. Neural Generation and Translation, 2020: 110-118.
- [30] Do Multilingual Neural Machine Translation Models Contain Language Pair Specific Attention Heads? Findings of the Association for Computational Linguistics: ACL-IJCNLP 2021, 2021: 2832-2841.