共 50 条
- [31] JiuZhang 2.0: A Unified Chinese Pre-trained Language Model for Multi-task Mathematical Problem Solving PROCEEDINGS OF THE 29TH ACM SIGKDD CONFERENCE ON KNOWLEDGE DISCOVERY AND DATA MINING, KDD 2023, 2023, : 5660 - 5672
- [33] Transformer-based transfer learning and multi-task learning for improving the performance of speech emotion recognition JOURNAL OF THE ACOUSTICAL SOCIETY OF KOREA, 2021, 40 (05): : 515 - 522
- [34] Multi-Task Conformer with Multi-Feature Combination for Speech Emotion Recognition SYMMETRY-BASEL, 2022, 14 (07):
- [35] When to Use Multi-Task Learning vs Intermediate Fine-Tuning for Pre-Trained Encoder Transfer Learning PROCEEDINGS OF THE 60TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2022): (SHORT PAPERS), VOL 2, 2022, : 272 - 282
- [36] Automatic Speech Recognition Dataset Augmentation with Pre-Trained Model and Script 2019 IEEE INTERNATIONAL CONFERENCE ON BIG DATA AND SMART COMPUTING (BIGCOMP), 2019, : 649 - 651
- [38] Towards Speech Emotion Recognition "in the wild" using Aggregated Corpora and Deep Multi-Task Learning 18TH ANNUAL CONFERENCE OF THE INTERNATIONAL SPEECH COMMUNICATION ASSOCIATION (INTERSPEECH 2017), VOLS 1-6: SITUATED INTERACTION, 2017, : 1113 - 1117
- [39] MASTER: Multi-task Pre-trained Bottlenecked Masked Autoencoders Are Better Dense Retrievers MACHINE LEARNING AND KNOWLEDGE DISCOVERY IN DATABASES: RESEARCH TRACK, ECML PKDD 2023, PT II, 2023, 14170 : 630 - 647
- [40] Multi-task Recurrent Model for Speech and Speaker Recognition 2016 ASIA-PACIFIC SIGNAL AND INFORMATION PROCESSING ASSOCIATION ANNUAL SUMMIT AND CONFERENCE (APSIPA), 2016,