共 50 条
- [41] DistillingWord Meaning in Context from Pre-trained Language Models FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, EMNLP 2021, 2021, : 534 - 546
- [42] An Investigation of Suitability of Pre-Trained Language Models for Dialogue Generation - Avoiding Discrepancies FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, ACL-IJCNLP 2021, 2021, : 4481 - 4494
- [44] On the generation of multi-label prototypes INTELLIGENT DATA ANALYSIS, 2020, 24 (S1) : S167 - S183
- [47] Annotating Columns with Pre-trained Language Models PROCEEDINGS OF THE 2022 INTERNATIONAL CONFERENCE ON MANAGEMENT OF DATA (SIGMOD '22), 2022, : 1493 - 1503
- [50] Interpreting Art by Leveraging Pre-Trained Models 2023 18TH INTERNATIONAL CONFERENCE ON MACHINE VISION AND APPLICATIONS, MVA, 2023,