50 items in total
- [11] Transformer-Based Dual-Channel Self-Attention for UUV Autonomous Collision Avoidance. IEEE Transactions on Intelligent Vehicles, 2023, 8(3): 2319-2331.
- [12] In-Memory Transformer Self-Attention Mechanism Using Passive Memristor Crossbar. 2024 IEEE International Symposium on Circuits and Systems (ISCAS 2024), 2024.
- [13] Transformer-Based End-to-End Speech Recognition with Residual Gaussian-Based Self-Attention. Proceedings of the Annual Conference of the International Speech Communication Association (INTERSPEECH 2021), 2021, 2: 1495-1499.
- [14] Regularizing Transformer-Based Acoustic Models by Penalizing Attention Weights for Robust Speech Recognition. INTERSPEECH 2022, 2022: 56-60.
- [15] Self-and-Mixed Attention Decoder with Deep Acoustic Structure for Transformer-Based LVCSR. INTERSPEECH 2020, 2020: 5016-5020.
- [17] E.T.: Re-Thinking Self-Attention for Transformer Models on GPUs. SC21: International Conference for High Performance Computing, Networking, Storage and Analysis, 2021.
- [19] Vision Transformer Based on Reconfigurable Gaussian Self-Attention. Zidonghua Xuebao/Acta Automatica Sinica, 2023, 49(9): 1976-1988.
- [20] Self-Distillation into Self-Attention Heads for Improving Transformer-Based End-to-End Neural Speaker Diarization. INTERSPEECH 2023, 2023: 3197-3201.