50 items total
- [42] "Towards Efficiently Learning Monotonic Alignments for Attention-Based End-to-End Speech Recognition," INTERSPEECH 2022, 2022, pp. 1051-1055
- [43] "Unsupervised Domain Adaptation via Bidirectional Cross-Attention Transformer," Machine Learning and Knowledge Discovery in Databases: Research Track (ECML PKDD 2023), Part V, vol. 14173, 2023, pp. 309-325
- [45] "A Dual Cross Attention Transformer Network for Infrared and Visible Image Fusion," 2024 7th International Conference on Artificial Intelligence and Big Data (ICAIBD 2024), 2024, pp. 494-499
- [48] "CrossFormer: A Versatile Vision Transformer Hinging on Cross-Scale Attention," ICLR 2022 - 10th International Conference on Learning Representations, 2022
- [50] "Optimization-Inspired Cross-Attention Transformer for Compressive Sensing," 2023 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 2023, pp. 6174-6184