Attentional control and the self: The Self-Attention Network (SAN)

Cited by: 218
Authors: Humphreys, Glyn W. [1]; Sui, Jie [1]
Affiliation: [1] Univ Oxford, Dept Expt Psychol, Oxford OX2 6JD, England
Funding: Economic and Social Research Council (ESRC), UK
Keywords: Self-bias; Attention; Own-name effect; Own-face effect; REPETITION BLINDNESS; NEURAL MECHANISMS; FACE; PERCEPTION; RECOGNITION; FAMILIARITY; GUIDANCE; MEMORY; BIASES; NAMES
DOI: 10.1080/17588928.2015.1044427
CLC number: Q189 [Neuroscience]
Subject classification code: 071006
Abstract
Although there is strong evidence that human decision-making is frequently self-biased, it remains unclear whether self-biases mediate attention. Here we review evidence on the relations between self-bias effects in decision-making and attention. We ask: Does self-related information capture attention? Do self-biases modulate pre-attentive processes, or do they depend on attentional resources being available? We review work on (1) own-name effects, (2) own-face effects, and (3) self-biases in associative matching. We argue that self-related information does have a differential impact on the allocation of attention and that it can alter the saliency of a stimulus in a manner that mimics the effects of perceptual saliency. However, there is also evidence that self-biases depend on the availability of attentional resources and on attentional expectancies for upcoming stimuli. We propose a new processing framework, the Self-Attention Network (SAN), in which neural circuits responding to self-related stimuli interact with circuits supporting attentional control to determine our emergent behavior. We also discuss how these self-bias effects may extend beyond the self to be modulated by the broader social context: for example, by cultural experience, by whether a stimulus belongs to an in-group or an out-group, and by whether we are engaged in joint actions. Self-biases on attention are thus modulated by social context.
Pages: 5-17 (13 pages)