Total: 50 records
- [22] Self-Reported Attention Control Skills Moderate the Effect of Self-Focused Attention on Depression. SAGE Open, 2021, 11(2).
- [23] Self-attention in vision transformers performs perceptual grouping, not attention. Frontiers in Computer Science, 2023, 5.
- [24] Enhancing Self-Attention with Knowledge-Assisted Attention Maps. NAACL 2022: The 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, 2022: 107-115.
- [25] SGSAFormer: Spike Gated Self-Attention Transformer and Temporal Attention. Electronics, 2025, 14(1).
- [26] On the Computational Complexity of Self-Attention. International Conference on Algorithmic Learning Theory, Vol. 201, 2023: 597-619.
- [27] The Lipschitz Constant of Self-Attention. International Conference on Machine Learning, Vol. 139, 2021.
- [30] Convolutional Self-Attention Networks. 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (NAACL-HLT 2019), Vol. 1, 2019: 4040-4045.