Attention-aware semantic relevance predicting Chinese sentence reading

Times Cited: 0
Authors
Sun, Kun [1 ,2 ,4 ]
Liu, Haitao [3 ]
Affiliations
[1] Univ Tubingen, Dept Linguist, Tubingen, Germany
[2] Tongji Univ, Coll Foreign Languages, Shanghai, Peoples R China
[3] Fudan Univ, Coll Foreign Languages & Literature, Shanghai, Peoples R China
[4] Dept Linguist, Wilhelmstr 19, Tubingen, Germany
Keywords
Attention mechanism; Contextual information; Interpretability; Reading duration; Preview benefits; EYE-MOVEMENTS; PREVIEW BENEFIT; PARAFOVEAL PREVIEW; BOTTOM-UP; TOP-DOWN; PREDICTABILITY; CHARACTERS; LANGUAGE; MODELS; WORDS
DOI
10.1016/j.cognition.2024.105991
Chinese Library Classification (CLC)
B84 [Psychology]
Subject Classification Codes
04; 0402
Abstract
In recent years, several influential computational models and metrics have been proposed to predict how humans comprehend and process sentences. One particularly promising approach is contextual semantic similarity. Inspired by the attention algorithm in Transformers and human memory mechanisms, this study proposes an "attention-aware" approach for computing contextual semantic relevance. This new approach takes into account the different contributions of contextual parts and the expectation effect, allowing it to incorporate contextual information fully. The attention-aware approach also facilitates the simulation of existing reading models and their evaluation. The resulting "attention-aware" metrics of semantic relevance predict fixation durations in Chinese reading tasks, recorded in an eye-tracking corpus, more accurately than metrics calculated by existing approaches. The study's findings further provide strong support for the presence of semantic preview benefits in Chinese naturalistic reading. Furthermore, the attention-aware metrics of semantic relevance, being memory-based, are highly interpretable from both linguistic and cognitive standpoints, making them a valuable computational tool for modeling eye movements in reading and for gaining further insight into the process of language comprehension. Our approach highlights the potential of these metrics to advance our understanding of how humans comprehend and process language.
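The abstract describes weighting the contributions of different context words, together with an expectation (preview) effect, when computing contextual semantic relevance. The paper defines the exact formulation; the snippet below is only a minimal illustrative sketch in Python, assuming a cosine-similarity relevance measure, a hypothetical exponential-decay attention weighting, and an invented helper name attention_aware_relevance.

```python
# Minimal illustrative sketch (not the paper's implementation): scores a
# fixated word by its weighted similarity to preceding context words and,
# optionally, to an anticipated upcoming word (expectation / preview effect).
# The decay weighting and the 0.5/0.5 mixing are hypothetical choices.
import numpy as np

def cosine(a, b):
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-8))

def attention_aware_relevance(target_vec, context_vecs, expected_vec=None, decay=0.9):
    """Attention-weighted contextual semantic relevance of a target word.

    target_vec   : embedding of the currently fixated word
    context_vecs : embeddings of preceding context words, oldest first
    expected_vec : embedding of a predicted upcoming word (optional)
    decay        : memory-decay factor; more recent words weigh more
    """
    if not context_vecs:
        context_part = 0.0
    else:
        sims = np.array([cosine(target_vec, c) for c in context_vecs])
        n = len(context_vecs)
        # Attention-like weights: exponential decay with distance from the target.
        weights = np.array([decay ** (n - 1 - i) for i in range(n)])
        weights /= weights.sum()
        context_part = float(np.dot(weights, sims))
    if expected_vec is None:
        return context_part
    # Fold in the expectation (preview) term with an arbitrary equal mix.
    return 0.5 * context_part + 0.5 * cosine(target_vec, expected_vec)

# Usage with random vectors standing in for word embeddings:
rng = np.random.default_rng(0)
ctx = [rng.normal(size=300) for _ in range(5)]
target, upcoming = rng.normal(size=300), rng.normal(size=300)
print(attention_aware_relevance(target, ctx, expected_vec=upcoming))
```

In a real setting the random vectors would be replaced by word embeddings for the Chinese sentence, and the resulting scores would serve as predictors of fixation durations in the eye-tracking corpus.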
Pages: 14