32 records in total
- [1] Improving News Recommendation via Bottlenecked Multi-task Pre-training. Proceedings of the 46th International ACM SIGIR Conference on Research and Development in Information Retrieval (SIGIR 2023), 2023: 2082-2086.
- [2] CLMSM: A Multi-Task Learning Framework for Pre-training on Procedural Text. Findings of the Association for Computational Linguistics (EMNLP 2023), 2023: 8793-8806.
- [3] Improving Transformer-based Speech Recognition with Unsupervised Pre-training and Multi-task Semantic Knowledge Learning. Interspeech 2020, 2020: 5006-5010.
- [5] Pre-training Multi-task Contrastive Learning Models for Scientific Literature Understanding. Findings of the Association for Computational Linguistics (EMNLP 2023), 2023: 12259-12275.
- [10] Multi-task Pre-training with Soft Biometrics for Transfer-learning Palmprint Recognition. Neural Processing Letters, 2023, 55: 2341-2358.