Pilot Study on Interaction with Wide Area Motion Imagery Comparing Gaze Input and Mouse Input

Cited by: 0
Authors
Hild, Jutta [1 ]
Krueger, Wolfgang [1 ]
Holzbach, Gerrit [1 ]
Voit, Michael [1 ]
Peinsipp-Byma, Elisabeth [1 ]
Affiliations
[1] Fraunhofer Inst Optron Syst Technol & Image Explo, D-76131 Karlsruhe, Germany
Keywords
Aerial image analysis; Wide Area Motion Imagery; multi-object tracking; automated image analysis; user interface; multimodal gaze input; gaze pointing; pilot study;
DOI
10.1007/978-3-031-35132-7_26
Chinese Library Classification (CLC)
TP18 [Theory of Artificial Intelligence]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
Recent sensor developments allow capturing Wide Area Motion Imagery (WAMI) covering several square kilometers and containing a vast number of tiny moving vehicles and persons. In this situation, purely interactive image exploitation by humans is exhausting and requires support by automated image exploitation such as multi-object tracking (MOT). MOT provides object detections, which support finding small moving objects, and object tracks, which support identifying an object by its movement behavior. As WAMI and MOT are current research topics, we aim to gain first insights into interaction with both. We introduce an experimental system comprising typical system functions for image exploitation and for interaction with object detections and object tracks. The system provides two input concepts. One utilizes a computer mouse and a keyboard for system input. The other utilizes a remote eye tracker and a keyboard, as in prior work gaze-based selection of moving objects in Full Motion Video (FMV) appeared to be an efficient and manually less stressful input alternative to mouse input. We introduce five task types that might occur in practical visual WAMI exploitation. In a pilot study (N = 12; all non-expert image analysts), we compare gaze input and mouse input for these five task types. The results show that both input concepts allow similar user performance in terms of error rates, completion time, and perceived workload (NASA-TLX). Most aspects of user satisfaction (ISO 9241-411 questionnaire) were also rated similarly, except that general comfort was rated better for gaze input and eye fatigue was rated better for mouse input.
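To illustrate the gaze-pointing concept mentioned in the abstract, the following is a minimal sketch, not taken from the paper, of how a gaze sample might be mapped to the nearest MOT object detection within a tolerance radius. The function and field names, the screen-coordinate convention, and the 40-pixel radius are illustrative assumptions, not the authors' implementation.

    # Minimal sketch (assumptions only): select a tracked object by gaze pointing.
    from dataclasses import dataclass
    from math import hypot
    from typing import Optional, Sequence

    @dataclass
    class Detection:
        track_id: int   # ID assigned by the multi-object tracker (hypothetical field)
        x: float        # object center in screen coordinates, pixels (assumed convention)
        y: float

    def select_by_gaze(gaze_x: float, gaze_y: float,
                       detections: Sequence[Detection],
                       radius_px: float = 40.0) -> Optional[Detection]:
        """Return the detection nearest to the gaze point, or None if no detection
        lies within radius_px (a tolerance for eye-tracker accuracy and tiny WAMI objects)."""
        best, best_dist = None, radius_px
        for det in detections:
            d = hypot(det.x - gaze_x, det.y - gaze_y)
            if d <= best_dist:
                best, best_dist = det, d
        return best

    if __name__ == "__main__":
        dets = [Detection(1, 410.0, 302.0), Detection(2, 650.0, 120.0)]
        print(select_by_gaze(420.0, 310.0, dets))  # selects track_id=1, the nearest detection

In such a scheme, a keyboard key press would typically confirm the selection, so that gaze provides pointing while the confirmation remains an explicit manual action.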
Pages: 352-369
Number of pages: 18
Related Papers (50 records)
  • [1] Comparing Pointing Performance of Mouse and Eye-Gaze Input System
    Guo, Wenbin
    Kim, Jung Hyup
    UNIVERSAL ACCESS IN HUMAN-COMPUTER INTERACTION: DESIGNING NOVEL INTERACTIONS, PT II, 2017, 10278 : 417 - 429
  • [2] Comparing Input Error for Mouse and Touch Input
    Hooten, Eli R.
    Adams, Julie A.
    2011 IEEE INTERNATIONAL CONFERENCE ON SYSTEMS, MAN, AND CYBERNETICS (SMC), 2011, : 2853 - 2858
  • [3] Improving the Accuracy of Gaze Input for Interaction
    Kumar, Manu
    Klingner, Jeff
    Puranik, Rohan
    Winograd, Terry
    Paepcke, Andreas
    PROCEEDINGS OF THE EYE TRACKING RESEARCH AND APPLICATIONS SYMPOSIUM (ETRA 2008), 2008, : 65 - 68
  • [4] Evaluation of the Potential of Gaze Input for Game Interaction
    San Agustin, Javier
    Mateo, Julio C.
    Hansen, John Paulin
    Villanueva, Arantxa
    PSYCHNOLOGY JOURNAL, 2009, 7 (02): : 213 - 235
  • [5] Wide Area Motion Imagery Tracking
    Vasquez, Juan R.
    Fogle, Ryan
    Salva, Karl
    EVOLUTIONARY AND BIO-INSPIRED COMPUTATION: THEORY AND APPLICATIONS VI, 2012, 8402
  • [6] Study on Character Input Methods using Eye-gaze Input Interface
    Murata, Atsuo
    Hayashi, Kazuya
    Moriwaka, Makoto
    Hayami, Takehito
    2012 PROCEEDINGS OF SICE ANNUAL CONFERENCE (SICE), 2012, : 1402 - 1407
  • [7] Study on Spatiotemporal Characteristics of Gaze Gesture Input
    Hou, Wen-jun
    Wu, Si-qi
    Chen, Xiao-lin
    Chen, Kai-xiang
    HUMAN-COMPUTER INTERACTION. RECOGNITION AND INTERACTION TECHNOLOGIES, HCI 2019, PT II, 2019, 11567 : 283 - 302
  • [8] Proposal of eye-gaze character input interface with guide area
    Ban, H
    Sugamata, I
    Itakura, N
    Sakamoto, K
    Kitamoto, T
    ELECTRONICS AND COMMUNICATIONS IN JAPAN PART III-FUNDAMENTAL ELECTRONIC SCIENCE, 2003, 86 (10): : 36 - 42
  • [9] Comparing Input Modalities for Peripheral Interaction: A Case Study on Peripheral Music Control
    Hausen, Doris
    Richter, Hendrik
    Hemme, Adalie
    Butz, Andreas
    HUMAN-COMPUTER INTERACTION - INTERACT 2013, PT III, 2013, 8119 : 162 - 179
  • [10] A support system for mouse operations using eye-gaze input
    Kiyohiko, Abe
    Yasuhiro, Nakayama
    Shoichi, Ohi
    Minoru, Ohyama
    IEEJ Transactions on Electronics, Information and Systems, 2009, 129 (09) : 1705 - 1713