共 50 条
- [21] Tempo: Accelerating Transformer-Based Model Training through Memory Footprint Reduction ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 35 (NEURIPS 2022), 2022,
- [22] In-Context Learning for MIMO Equalization Using Transformer-Based Sequence Models 2024 IEEE INTERNATIONAL CONFERENCE ON COMMUNICATIONS WORKSHOPS, ICC WORKSHOPS 2024, 2024, : 1573 - 1578
- [23] Leveraging Unlabeled Speech for Sequence Discriminative Training of Acoustic Models INTERSPEECH 2020, 2020, : 3585 - 3589
- [24] High performance binding affinity prediction with a Transformer-based surrogate model 2024 IEEE INTERNATIONAL PARALLEL AND DISTRIBUTED PROCESSING SYMPOSIUM WORKSHOPS, IPDPSW 2024, 2024, : 571 - 580
- [28] Transformer-based temporal sequence learners for arrhythmia classification Medical & Biological Engineering & Computing, 2023, 61 : 1993 - 2000
- [29] Ouroboros: On Accelerating Training of Transformer-Based Language Models ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 32 (NIPS 2019), 2019, 32
- [30] Streaming Transformer-based Acoustic Models Using Self-attention with Augmented Memory INTERSPEECH 2020, 2020, : 2132 - 2136