50 records in total
- [1] Continual Learning with Pre-Trained Models: A Survey. Proceedings of the Thirty-Third International Joint Conference on Artificial Intelligence (IJCAI 2024), 2024: 8363-8371.
- [3] Detecting Backdoors in Pre-trained Encoders. 2023 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 2023: 16352-16362.
- [4] RanPAC: Random Projections and Pre-trained Models for Continual Learning. Advances in Neural Information Processing Systems 36 (NeurIPS 2023), 2023.
- [5] Do Pre-trained Models Benefit Equally in Continual Learning? 2023 IEEE/CVF Winter Conference on Applications of Computer Vision (WACV), 2023: 6474-6482.
- [7] Recyclable Tuning for Continual Pre-training. Findings of the Association for Computational Linguistics (ACL 2023), 2023: 11403-11426.
- [8] Preserving Cross-Linguality of Pre-trained Models via Continual Learning. RepL4NLP 2021: Proceedings of the 6th Workshop on Representation Learning for NLP, 2021: 64-71.
- [9] Fine-tuning Pre-trained Language Models for Few-shot Intent Detection: Supervised Pre-training and Isotropization. NAACL 2022: The 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, 2022: 532-542.
- [10] The Impact of Training Methods on the Development of Pre-Trained Language Models. Computación y Sistemas, 2024, 28(01): 109-124.