Emotion-Aware Human Attention Prediction

Cited by: 17
Authors
Cordel, Macario O., II [1 ,2 ]
Fan, Shaojing [2 ]
Shen, Zhiqi [2 ]
Kankanhalli, Mohan S. [2 ]
Affiliations
[1] De La Salle Univ, Manila, Philippines
[2] Natl Univ Singapore, Singapore, Singapore
Funding
National Research Foundation, Singapore;
Keywords
DOI
10.1109/CVPR.2019.00415
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Despite recent successes in face recognition and object classification, computational models of human gaze prediction still struggle to accurately mimic human attention. One main reason is that visual attention is a complex human behavior influenced by multiple factors, ranging from low-level features (e.g., color, contrast) to high-level human perception (e.g., object interactions, object sentiment), making it difficult to model computationally. In this work, we investigate the relation between object sentiment and human attention. We first introduce an improved evaluation metric (AttI) for measuring human attention that focuses on human fixation consensus. A series of empirical data analyses with AttI indicates that emotion-evoking objects receive favored attention, especially when they co-occur with emotionally neutral objects, and that this favor varies with image complexity. Based on these empirical analyses, we design a deep neural network for human attention prediction that allows the attention bias toward emotion-evoking objects to be encoded in its feature space. Experiments on two benchmark datasets demonstrate its superior performance, especially on metrics that evaluate the relative importance of salient regions. This research provides the clearest picture to date of how object sentiment influences human attention, and it makes one of the first attempts to model this phenomenon computationally.
Pages: 4021 - 4030
Number of pages: 10
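The abstract above refers to an Attention Index (AttI) computed from human fixation data, but the record does not spell out its formula. The snippet below is only a minimal sketch of how an object-level attention score could be derived from a fixation density map and a binary object mask; the area normalization, the function name attention_index, and the toy arrays are illustrative assumptions and are not the paper's actual AttI definition.

```python
# Illustrative sketch only: an object-level attention score computed from a
# fixation density map and a binary object mask. The paper's AttI metric
# (which emphasizes human fixation consensus) may be defined differently.
import numpy as np

def attention_index(fixation_map: np.ndarray, object_mask: np.ndarray) -> float:
    """Share of total fixation density inside the object, normalized by the
    object's relative area; values > 1 suggest the object attracts more
    attention than its size alone would predict (hypothetical definition)."""
    fixation_map = fixation_map.astype(np.float64)
    total = fixation_map.sum()
    if total == 0 or object_mask.sum() == 0:
        return 0.0
    density_share = fixation_map[object_mask > 0].sum() / total
    area_share = object_mask.sum() / object_mask.size
    return float(density_share / area_share)

# Toy example: a 4x4 fixation map where all fixations land on a small object
# covering 25% of the image, giving a score of 4.0.
fix = np.array([[0, 0, 0, 0],
                [0, 5, 4, 0],
                [0, 3, 2, 0],
                [0, 0, 0, 0]], dtype=float)
mask = np.zeros((4, 4), dtype=int)
mask[1:3, 1:3] = 1
print(attention_index(fix, mask))  # 4.0
```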
Related Papers
50 records
  • [21] Emotion-Aware Speaker Identification With Transfer Learning
    Noh, Kyoungju
    Jeong, Hyuntae
    IEEE ACCESS, 2023, 11 : 77292 - 77306
  • [22] EmoMTB: Emotion-aware Music Tower Blocks
    Melchiorre, Alessandro B.
    Penz, David
    Ganhoer, Christian
    Lesota, Oleg
    Fragoso, Vasco
    Fritzl, Florian
    Parada-Cabaleiro, Emilia
    Schubert, Franz
    Schedl, Markus
    PROCEEDINGS OF THE 2022 INTERNATIONAL CONFERENCE ON MULTIMEDIA RETRIEVAL, ICMR 2022, 2022, : 206 - 210
  • [23] Emotion-Aware Music Driven Movie Montage
    Liu, Wu-Qin
    Lin, Min-Xuan
    Huang, Hai-Bin
    Ma, Chong-Yang
    Song, Yu
    Dong, Wei-Ming
    Xu, Chang-Sheng
    JOURNAL OF COMPUTER SCIENCE AND TECHNOLOGY, 2023, 38 (03) : 540 - 553
  • [24] Physiological mouse: toward an emotion-aware mouse
    Yujun Fu
    Hong Va Leong
    Grace Ngai
    Michael Xuelin Huang
    Stephen C. F. Chan
    Universal Access in the Information Society, 2017, 16 : 365 - 379
  • [25] Towards Emotion-Aware Agents For Negotiation Dialogues
    Chawla, Kushal
    Clever, Rene
    Ramirez, Jaysa
    Lucas, Gale
    Gratch, Jonathan
    2021 9TH INTERNATIONAL CONFERENCE ON AFFECTIVE COMPUTING AND INTELLIGENT INTERACTION (ACII), 2021,
  • [26] Modeling Protagonist Emotions for Emotion-Aware Storytelling
    Brahman, Faeze
    Chaturvedi, Snigdha
    PROCEEDINGS OF THE 2020 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING (EMNLP), 2020, : 5277 - 5294
  • [27] Emotion-Aware System for Upper Extremity Rehabilitation
    Mihelj, Matjaz
    Novak, Domen
    Munih, Marko
    2009 VIRTUAL REHABILITATION INTERNATIONAL CONFERENCE, 2009, : 160 - 165
  • [28] Emotion-aware system design for the battlefield environment
    Lin, Kai
    Xia, Fuzhen
    Li, Chensi
    Wang, Di
    Humar, Iztok
    INFORMATION FUSION, 2019, 47 : 102 - 110
  • [29] Emotion-aware intelligent environments: A user perspective
    Montalban, Iraitz
    Garzo, Ainara
    Leon, Enrique
    INTELLIGENT ENVIRONMENTS 2009, 2009, 2 : 421 - +
  • [30] Emotion-Aware Chatbots: Understanding, Reacting and Adapting to Human Emotions in Text Conversations
    Kossack, Philip
    Unger, Herwig
    ADVANCES IN REAL-TIME AND AUTONOMOUS SYSTEMS, 2023, 2024, 1009 : 158 - 175