共 50 条
- [44] Self-Reported Attention Control Skills Moderate the Effect of Self-Focused Attention on Depression SAGE OPEN, 2021, 11 (02):
- [45] CONNECTING ROD CURVES GENERATED BY THE R-PPR-PPR-PRP MECHANISM ACTA TECHNICA NAPOCENSIS SERIES-APPLIED MATHEMATICS MECHANICS AND ENGINEERING, 2022, 65 : 331 - 336
- [46] Self-attention in vision transformers performs perceptual grouping, not attention FRONTIERS IN COMPUTER SCIENCE, 2023, 5
- [47] Enhancing Self-Attention with Knowledge-Assisted Attention Maps NAACL 2022: THE 2022 CONFERENCE OF THE NORTH AMERICAN CHAPTER OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS: HUMAN LANGUAGE TECHNOLOGIES, 2022, : 107 - 115
- [48] SGSAFormer: Spike Gated Self-Attention Transformer and Temporal Attention ELECTRONICS, 2025, 14 (01):
- [49] On The Computational Complexity of Self-Attention INTERNATIONAL CONFERENCE ON ALGORITHMIC LEARNING THEORY, VOL 201, 2023, 201 : 597 - 619
- [50] The Lipschitz Constant of Self-Attention INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 139, 2021, 139