50 entries in total
- [1] Self-training Improves Pre-training for Natural Language Understanding. Proceedings of NAACL-HLT 2021, pp. 5408-5418.
- [2] Self-Training and Pre-Training Are Complementary for Speech Recognition. Proceedings of ICASSP 2021, pp. 3030-3034.
- [3] Task-adaptive Pre-training of Language Models with Word Embedding Regularization. Findings of ACL-IJCNLP 2021, pp. 4546-4553.
- [4] Rethinking Pre-training and Self-training. Advances in Neural Information Processing Systems 33 (NeurIPS 2020), vol. 33.
- [5] Code Question Answering via Task-Adaptive Sequence-to-Sequence Pre-training. Proceedings of the 29th Asia-Pacific Software Engineering Conference (APSEC 2022), pp. 229-238.
- [6] Unified Language Model Pre-training for Natural Language Understanding and Generation. Advances in Neural Information Processing Systems 32 (NeurIPS 2019), vol. 32.
- [7] A pre-training and self-training approach for biomedical named entity recognition. PLOS ONE, 2021, 16(2).
- [8] Self-training Improves Pre-training for Few-shot Learning in Task-oriented Dialog Systems. Proceedings of EMNLP 2021, pp. 1887-1898.
- [9] Unsupervised Video Domain Adaptation with Masked Pre-Training and Collaborative Self-Training. Proceedings of CVPR 2024, pp. 18919-18929.
- [10] MVP: Multi-task Supervised Pre-training for Natural Language Generation. Findings of ACL 2023, pp. 8758-8794.