50 entries in total
- [41] Efficient Fine-Tuning for Low-Resource Tibetan Pre-trained Language Models. Artificial Neural Networks and Machine Learning - ICANN 2024, Pt. VII, 2024, 15022: 410-422.
- [42] Fine-Tuning BERT-Based Pre-Trained Models for Arabic Dependency Parsing. Applied Sciences-Basel, 2023, 13(7).
- [43] Fine-Tuning Pre-Trained Model for Consumer Fraud Detection from Consumer Reviews. Database and Expert Systems Applications, DEXA 2023, Pt. II, 2023, 14147: 451-456.
- [44] Fine-tuning Pre-Trained Transformer Language Models to Distantly Supervised Relation Extraction. 57th Annual Meeting of the Association for Computational Linguistics (ACL 2019), 2019: 1388-1398.
- [45] Towards Anytime Fine-tuning: Continually Pre-trained Language Models with Hypernetwork Prompts. Findings of the Association for Computational Linguistics (EMNLP 2023), 2023: 12081-12095.
- [46] Virtual Data Augmentation: A Robust and General Framework for Fine-tuning Pre-trained Models. 2021 Conference on Empirical Methods in Natural Language Processing (EMNLP 2021), 2021: 3875-3887.
- [48] Disfluencies and Fine-Tuning Pre-trained Language Models for Detection of Alzheimer's Disease. Interspeech 2020, 2020: 2162-2166.
- [49] Make Pre-trained Model Reversible: From Parameter to Memory Efficient Fine-Tuning. Advances in Neural Information Processing Systems 36 (NeurIPS 2023), 2023.
- [50] Towards Efficient Fine-Tuning of Pre-trained Code Models: An Experimental Study and Beyond. Proceedings of the 32nd ACM SIGSOFT International Symposium on Software Testing and Analysis (ISSTA 2023), 2023: 39-51.