Entropy-based guidance and predictive modelling of pedestrians' visual attention in urban environment

Cited: 1
Authors
Xie, Qixu [1 ,2 ]
Zhang, Li [1 ,2 ]
Affiliations
[1] Tsinghua Univ, Sch Architecture, Dept Architecture, Beijing 100084, Peoples R China
[2] Tsinghua Univ, Sch Architecture, Urban Ergon Lab, Beijing 100084, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
visual attention; pedestrian; eye-tracking; local entropy; deep learning; urban ergonomics; GAZE; RESPONSES;
DOI
10.1007/s12273-024-1165-y
Chinese Library Classification
O414.1 [Thermodynamics];
Subject Classification
Abstract
Selective visual attention determines what pedestrians notice and ignore in the urban environment. If different individuals' visual attention is consistent, designers can draw on the underlying mechanisms to modify designs to better meet user needs. However, the mechanisms of pedestrians' visual attention remain poorly understood, and it is challenging to forecast which locations in the urban environment will attract more pedestrian attention. To address this gap, we employed 360-degree video and immersive virtual reality to simulate walking scenarios and recorded the eye movements of 138 participants. Our findings reveal a remarkable consistency in fixation distribution across individuals, exceeding both chance and orientation bias. One driver of this consistency is a strategy of information maximization: participants tended to fixate on areas of higher local entropy. Additionally, we built the first eye-movement dataset for panoramic videos of diverse urban walking scenes, and developed a predictive model that forecasts pedestrians' visual attention via supervised deep learning. The predictive model helps designers understand, during the design phase, how pedestrians will visually interact with the urban environment. The dataset and the code of the predictive model are available at https://github.com/LiamXie/UrbanVisualAttention
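The abstract's finding that fixations cluster in areas of higher local entropy can be illustrated with a generic image statistic: the Shannon entropy of the grayscale histogram inside a sliding window. This is a minimal sketch of that idea, not the paper's implementation; the window size, grayscale input, and histogram binning are assumptions.

```python
import numpy as np

def local_entropy(gray, win=9):
    """Shannon entropy (bits) of the intensity histogram in a
    win x win sliding window around each pixel.

    gray: 2-D uint8 array; win: odd window size (assumed here).
    """
    pad = win // 2
    padded = np.pad(gray, pad, mode="reflect")
    out = np.zeros(gray.shape, dtype=float)
    for i in range(gray.shape[0]):
        for j in range(gray.shape[1]):
            patch = padded[i:i + win, j:j + win]
            counts = np.bincount(patch.ravel(), minlength=256)
            p = counts[counts > 0] / counts.sum()  # nonzero bin probabilities
            out[i, j] = -np.sum(p * np.log2(p))
    return out

# A uniform region carries no information (entropy 0); a textured/noisy
# region carries more, so under an information-maximization strategy it
# would be a stronger candidate for fixation.
flat = np.full((20, 20), 128, dtype=np.uint8)
noisy = np.random.default_rng(0).integers(0, 256, (20, 20), dtype=np.uint8)
print(local_entropy(flat).mean())   # → 0.0
print(local_entropy(noisy).mean())  # substantially higher
```

In practice a saliency map like `out` would be compared against the recorded fixation density; the paper's predictive model instead learns this mapping end to end with supervised deep learning.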
Pages: 1659-1674
Page count: 16
Related Papers
(50 entries)
  • [1] Entropy-based Visual Homing
    Kim, Piljae
    Szenher, Matthew D.
    Webb, Barbara
    2009 IEEE INTERNATIONAL CONFERENCE ON MECHATRONICS AND AUTOMATION, VOLS 1-7, CONFERENCE PROCEEDINGS, 2009, : 3601 - +
  • [2] Entropy-Based Visual Servoing
    Dame, Amaury
    Marchand, Eric
    ICRA: 2009 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION, VOLS 1-7, 2009, : 1970 - +
  • [3] Entropy-based Visual Tree Evaluation on Block Extraction
    Cho, Wei-Ting
    Lin, Yu-Min
    Kao, Hung-Yu
    2009 IEEE/WIC/ACM INTERNATIONAL JOINT CONFERENCES ON WEB INTELLIGENCE (WI) AND INTELLIGENT AGENT TECHNOLOGIES (IAT), VOL 1, 2009, : 580 - 583
  • [4] Entropy-based environment exploration and stochastic optimal control
    Baglietto, M
    Paolucci, M
    Scardovi, L
    Zoppoli, R
    42ND IEEE CONFERENCE ON DECISION AND CONTROL, VOLS 1-6, PROCEEDINGS, 2003, : 2938 - 2941
  • [5] A computational model of visual attention based on visual entropy
    Dou, Yan
    Kong, Lingfu
    Wang, Liufeng
    Guangxue Xuebao/Acta Optica Sinica, 2009, 29 (09): : 2511 - 2515
  • [6] Testing entropy-based search strategies for a visual classification task
    Liliya Avdiyenko
    Nils Bertschinger
    Juergen Jost
    BMC Neuroscience, 13 (Suppl 1)
  • [7] Entropy-based analysis of urban residential district sustainable development
    Jiang-hong, Chen
    Qi-ming, Li
    PROCEEDINGS OF CRIOCM 2007 INTERNATIONAL RESEARCH SYMPOSIUM ON ADVANCEMENT OF CONSTRUCTION MANAGEMENT AND REAL ESTATE, VOLS 1 AND 2, 2007, : 355 - 362
  • [8] The Socialization of Visual Attention: Training Effects of Verbal Attention Guidance in Urban German Children
    Jurkat, Solveig
    Gutknecht-Stoehr, Amelie Charlotte
    Kaertner, Joscha
    DEVELOPMENTAL PSYCHOLOGY, 2024,
  • [9] Modelling honeybee visual guidance in a 3-D environment
    Portelli, G.
    Serres, J.
    Ruffier, F.
    Franceschini, N.
    JOURNAL OF PHYSIOLOGY-PARIS, 2010, 104 (1-2) : 27 - 39
  • [10] Modelling visual attention: Putting a saliency model of eye guidance to a test
    Koesling, H.
    Friesen, R.
    Hammer, S.
    Lier, F.
    Preuss, T.
    PERCEPTION, 2009, 38 : 182 - 182