A new gaze-based interface for environmental control

Cited by: 0
Authors
Shi, Fangmin [1 ]
Gale, Alastair [1 ]
Purdy, Kevin [1 ]
Affiliation
[1] Univ Loughborough, Appl Vis Res Ctr, Loughborough LE11 3TU, Leics, England
Funding
UK Engineering and Physical Sciences Research Council (EPSRC);
Keywords
DOI
Not available
Chinese Library Classification (CLC)
TP18 [Artificial intelligence theory];
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
This paper describes a new control system interface which utilises the user's eye gaze to enable severely disabled individuals to control electronic devices easily. The system is based upon a novel human-computer interface which facilitates simple control of electronic devices by predicting and responding to the user's likely intentions, inferred intuitively from their point of gaze. The interface responds by automatically pre-selecting and offering, in a simple and accessible manner, only those controls appropriate to the specific device the user looks at. It therefore gives the user a conscious choice from the appropriate range of control actions, which can be executed by simple means and without the need to navigate manually through potentially complex control menus. Two systems, one using a head-mounted eye tracker and one using a remote eye tracker, are introduced, compared and evaluated in this paper.
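The abstract outlines the core interaction loop: map the user's point of gaze to a device in the environment, then offer only that device's controls. The following minimal Python sketch illustrates that idea only; it is not taken from the paper, and all names (Device, select_device, offer_controls) as well as the rectangular-region gaze mapping are illustrative assumptions.

# Illustrative sketch of gaze-driven control pre-selection; not the authors' implementation.
from dataclasses import dataclass

@dataclass
class Device:
    name: str
    region: tuple     # (x_min, y_min, x_max, y_max) area occupied by the device in the scene
    controls: tuple   # controls relevant to this device only

def contains(region, x, y):
    """Return True if the gaze point (x, y) falls inside the device region."""
    x_min, y_min, x_max, y_max = region
    return x_min <= x <= x_max and y_min <= y <= y_max

def select_device(devices, gaze_x, gaze_y):
    """Pre-select the device the user is currently looking at, if any."""
    for device in devices:
        if contains(device.region, gaze_x, gaze_y):
            return device
    return None

def offer_controls(device):
    """Offer only the controls appropriate to the fixated device."""
    return list(device.controls) if device is not None else []

if __name__ == "__main__":
    devices = [
        Device("lamp", (0, 0, 100, 100), ("on", "off", "dim")),
        Device("television", (200, 0, 400, 150), ("on", "off", "channel+", "channel-", "volume+", "volume-")),
    ]
    looked_at = select_device(devices, gaze_x=250, gaze_y=80)  # coordinates are illustrative
    print(offer_controls(looked_at))  # only the television's controls are offered

In practice the gaze point would come from the head-mounted or remote eye tracker mentioned in the abstract, and a fixation filter would typically precede device selection, but those details are not specified in this record.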
Pages: 996+
Page count: 2
Related papers
50 items total
  • [11] Gaze-based Object Detection in the Wild
    Weber, Daniel
    Fuhl, Wolfgang
    Zell, Andreas
    Kasneci, Enkelejda
    2022 SIXTH IEEE INTERNATIONAL CONFERENCE ON ROBOTIC COMPUTING, IRC, 2022, : 62 - 66
  • [12] Gaze-based Interaction for Virtual Environments
    Jimenez, Jorge
    Gutierrez, Diego
    Latorre, Pedro
    JOURNAL OF UNIVERSAL COMPUTER SCIENCE, 2008, 14 (19) : 3085 - 3098
  • [13] Gaze-based Assessment of Expertise in Chess
    Fuhl, Wolfgang
    Hyseni, Gazmend
    PROCEEDINGS OF THE 2024 ACM SYMPOSIUM ON EYE TRACKING RESEARCH & APPLICATIONS, ETRA 2024, 2024,
  • [14] Gaze-Based Interaction for VR Environments
    Piotrowski, Patryk
    Nowosielski, Adam
    IMAGE PROCESSING AND COMMUNICATIONS: TECHNIQUES, ALGORITHMS AND APPLICATIONS, 2020, 1062 : 41 - 48
  • [15] EyeNav: Gaze-Based Code Navigation
    Radevski, Stevche
    Hata, Hideaki
    Matsumoto, Kenichi
    PROCEEDINGS OF THE NORDICHI '16: THE 9TH NORDIC CONFERENCE ON HUMAN-COMPUTER INTERACTION - GAME CHANGING DESIGN, 2016,
  • [16] Towards Gaze-based Video Annotation
    Soliman, Mohamed
    R-Tavakoli, Hamed
    Laaksonen, Jorma
    2016 SIXTH INTERNATIONAL CONFERENCE ON IMAGE PROCESSING THEORY, TOOLS AND APPLICATIONS (IPTA), 2016,
  • [17] Gaze-Based Annotations for Reading Comprehension
    Cheng, Shiwei
    Sun, Zhiqiang
    Sun, Lingyun
    Yee, Kirsten
    Dey, Anind K.
    CHI 2015: PROCEEDINGS OF THE 33RD ANNUAL CHI CONFERENCE ON HUMAN FACTORS IN COMPUTING SYSTEMS, 2015, : 1569 - 1572
  • [18] An Adaptive Model of Gaze-based Selection
    Chen, Xiuli
    Acharya, Aditya
    Oulasvirta, Antti
    Howes, Andrew
    CHI '21: PROCEEDINGS OF THE 2021 CHI CONFERENCE ON HUMAN FACTORS IN COMPUTING SYSTEMS, 2021,
  • [19] A Scrolling Approach for Gaze-Based Interaction
    Schniederjann, Florian
    Korthing, Lars
    Broesterhaus, Jonas
    Mertens, Robert
    2019 IEEE INTERNATIONAL SYMPOSIUM ON MULTIMEDIA (ISM 2019), 2019, : 233 - 234
  • [20] Gaze-based interactions in the cockpit of the future: a survey
    Rudi, David
    Kiefer, Peter
    Giannopoulos, Ioannis
    Raubal, Martin
    JOURNAL ON MULTIMODAL USER INTERFACES, 2020, 14 (01) : 25 - 48