共 50 条
- [42] Large-scale Multi-modal Pre-trained Models: A Comprehensive Survey Machine Intelligence Research, 2023, 20 : 447 - 482
- [43] Give Me the Facts! A Survey on Factual Knowledge Probing in Pre-trained Language Models FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (EMNLP 2023), 2023, : 15588 - 15605
- [44] Probing Pre-Trained Language Models for Disease Knowledge FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, ACL-IJCNLP 2021, 2021, : 3023 - 3033
- [45] Analyzing Individual Neurons in Pre-trained Language Models PROCEEDINGS OF THE 2020 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING (EMNLP), 2020, : 4865 - 4880
- [46] Emotional Paraphrasing Using Pre-trained Language Models 2021 9TH INTERNATIONAL CONFERENCE ON AFFECTIVE COMPUTING AND INTELLIGENT INTERACTION WORKSHOPS AND DEMOS (ACIIW), 2021,
- [49] Dynamic Knowledge Distillation for Pre-trained Language Models 2021 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING (EMNLP 2021), 2021, : 379 - 389
- [50] Backdoor Pre-trained Models Can Transfer to All CCS '21: PROCEEDINGS OF THE 2021 ACM SIGSAC CONFERENCE ON COMPUTER AND COMMUNICATIONS SECURITY, 2021, : 3141 - 3158