50 items in total
- [1] Sparsing and Smoothing for the seq2seq Models. IEEE Transactions on Artificial Intelligence, 2023, 4(3): 464-472
- [2] Learning the Dyck Language with Attention-based Seq2Seq Models. BlackboxNLP Workshop on Analyzing and Interpreting Neural Networks for NLP at ACL 2019, 2019: 138-146
- [5] Recipes for Sequential Pre-training of Multilingual Encoder and Seq2Seq Models. Findings of the Association for Computational Linguistics (ACL 2023), 2023: 9380-9394
- [6] Profanity-Avoiding Training Framework for Seq2seq Models with Certified Robustness. 2021 Conference on Empirical Methods in Natural Language Processing (EMNLP 2021), 2021: 5151-5161
- [8] Seq2Seq Deep Learning Models for Microtext Normalization. 2019 International Joint Conference on Neural Networks (IJCNN), 2019
- [9] Learning Transductions and Alignments with RNN Seq2seq Models. International Conference on Grammatical Inference, Vol. 217, 2023: 223-249
- [10] Seq2Seq Surrogates of Epidemic Models to Facilitate Bayesian Inference. Thirty-Seventh AAAI Conference on Artificial Intelligence, Vol. 37, No. 12, 2023: 14170-14177