50 entries in total
- [2] Probing Multi-modal Machine Translation with Pre-trained Language Model. Findings of the Association for Computational Linguistics: ACL-IJCNLP 2021, 2021: 3689-3699.
- [3] Large-scale Multi-modal Pre-trained Models: A Comprehensive Survey. Machine Intelligence Research, 2023, 20: 447-482.
- [5] Are Pre-trained Convolutions Better than Pre-trained Transformers? Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (ACL-IJCNLP 2021), Vol. 1, 2021: 4349-4359.
- [6] MultiFusion: Fusing Pre-Trained Models for Multi-Lingual, Multi-Modal Image Generation. Advances in Neural Information Processing Systems 36 (NeurIPS 2023), 2023.
- [7] Multi-modal Segmentation with Missing MR Sequences Using Pre-trained Fusion Networks. Domain Adaptation and Representation Transfer and Medical Image Learning with Less Labels and Imperfect Data (DART 2019, MIL3ID 2019), 2019, 11795: 165-172.
- [8] Calibration of Pre-trained Transformers. Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP), 2020: 295-302.
- [9] Cyberbullying detection on multi-modal data using pre-trained deep learning architectures. Ingenieria Solidaria, 2021, 17 (03).
- [10] Fast multi-modal reuse: Co-occurrence pre-trained deep learning models. Proceedings of SPIE - The International Society for Optical Engineering, 2019, 10996.