Looking in Depth: Targeting by Eye and Controller Input for Multi-Depth Target Placement

Cited: 0
Authors
Fernandes, Ajoy S. [1 ]
Murdison, T. Scott [2 ]
Proulx, Michael J. [1 ]
Affiliations
[1] Meta Reality Labs Research, Redmond, WA 98052 USA
[2] Meta Reality Labs, Redmond, WA USA
Keywords
Eye tracking; user experience; input devices; 3D user interaction; human factors and ergonomics; gaze targeting; HAND COORDINATION; GAZE-SHIFTS; INFORMATION; VERGENCE; TRACKING; SYSTEM; TRANSFORMATIONS; DISTANCE; ANOVA;
DOI
10.1080/10447318.2024.2401657
Chinese Library Classification
TP3 [computing technology, computer technology]
Subject Classification Code
0812
Abstract
We explored how interaction performance is affected by multi-depth VR targeting and button selection using two targeting methods: eye tracking with no UX modifications or feedback, or a controller with a visible cursor. Selections were made with a controller button press in both targeting modalities. Targets had diameters of 3, 4, or 5 degrees and were placed at depths between 0.3 m and 5 m. When comparing a single-depth (1 m) environment with a multi-depth environment, the eyes were less affected by depth than the controller. Controller performance decreased in multi-depth scenarios for both targeting and selection, as measured by Throughput (22% decrease), Movement Time (31% increase), and Misses (66% increase). Depth also affected eye tracking significantly, but to a lesser degree, for Throughput (4% decrease) and Movement Time (6% increase) but not Misses (5% increase). The eyes outperformed the controller in multi-depth scenarios as measured by Throughput (2.86 bits/s vs. 2.56 bits/s), were similar in Movement Time (1.10 s vs. 1.10 s), but had more Misses (21% vs. 9%). Our study also shows that selecting consecutive targets that approach the user is more difficult than selecting targets that recede from the user, and that targets separated by larger depth distances take longer to select. Overall, this study provides further supporting evidence that eye tracking can play an important role in 3D interactions.
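The Throughput values above (in bits/s) suggest the standard Fitts's-law formulation used in ISO 9241-9-style target-selection studies, where throughput is the index of difficulty divided by movement time. A minimal sketch, assuming that formulation; the distance, width, and time values below are illustrative and not taken from the study:

```python
import math

def effective_throughput(distance_m: float, width_m: float, movement_time_s: float) -> float:
    """Fitts's-law throughput in bits/s: index of difficulty ID = log2(D/W + 1),
    divided by movement time (ISO 9241-9-style formulation)."""
    index_of_difficulty = math.log2(distance_m / width_m + 1)  # bits
    return index_of_difficulty / movement_time_s

# Illustrative values only (not from the paper): a 0.5 m movement to a
# 0.04 m wide target completed in 1.10 s.
tp = effective_throughput(distance_m=0.5, width_m=0.04, movement_time_s=1.10)
```

Under this formulation, larger or nearer targets lower the index of difficulty, so throughput differences across the 3-5 degree target sizes and 0.3-5 m depths reported above fold both speed and accuracy into a single comparable metric.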
Pages: 16