Looking in Depth: Targeting by Eye and Controller Input for Multi-Depth Target Placement

Cited by: 0
Authors
Fernandes, Ajoy S. [1 ]
Murdison, T. Scott [2 ]
Proulx, Michael J. [1 ]
Affiliations
[1] Meta Reality Labs Research, Redmond, WA 98052 USA
[2] Meta Reality Labs, Redmond, WA USA
Keywords
Eye tracking; user experience; input devices; 3D user interaction; human factors and ergonomics; gaze targeting; HAND COORDINATION; GAZE-SHIFTS; INFORMATION; VERGENCE; TRACKING; SYSTEM; TRANSFORMATIONS; DISTANCE; ANOVA;
DOI
10.1080/10447318.2024.2401657
Chinese Library Classification (CLC)
TP3 [Computing technology; computer technology];
Discipline Classification Code
0812 ;
Abstract
We explored how interaction performance is affected by multi-depth VR targeting and button selection using two targeting methods: eye tracking with no UX modifications or feedback, or a controller with a visible cursor. Selections happened on a controller button press for both targeting modalities. Targets had a diameter of 3, 4, or 5 degrees and were placed at depths between 0.3 m and 5 m. When comparing a single-depth (1 m) condition with a multi-depth environment, the eyes were less affected by depth than the controller. Controller targeting and selection performance decreased in multi-depth scenarios, as measured by Throughput (22% decrease), Movement Time (31% increase), and Misses (66% increase). Depth also affected eye tracking significantly, but to a lesser degree, for Throughput (4% decrease) and Movement Time (6% increase), though not for Misses (5% increase). The eyes outperformed the controller in multi-depth scenarios in Throughput (2.86 bits/s vs. 2.56 bits/s), were similar in Movement Time (1.10 s vs. 1.10 s), but had more Misses (21% vs. 9%). Our study also shows that selecting consecutive targets that move closer to the user is more difficult than selecting targets that recede from the user, and that targets separated by larger depth distances take longer to select. Overall, this study provides further supporting evidence that eye tracking can play an important role in 3D interactions.
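The Throughput figures quoted above (bits/s) are a Fitts' law style efficiency measure. As a rough illustration only, the sketch below shows one common ISO 9241-9 style way such a throughput value can be computed (effective width from the spread of selection endpoints, an effective index of difficulty per trial, and mean difficulty over mean movement time). The paper's exact formulation, units, and per-depth handling are not given in this record, so the function name and all example values here are assumptions.

```python
# Minimal sketch of an ISO 9241-9 style throughput computation, assuming
# angular distances and endpoint errors in degrees of visual angle.
# This is illustrative only and not the paper's confirmed method.
import math
import statistics

def effective_throughput(distances, movement_times, endpoint_errors):
    """Estimate Fitts' law throughput (bits/s) for one participant/condition.

    distances       -- nominal target distances (e.g., degrees of visual angle)
    movement_times  -- per-trial movement times in seconds
    endpoint_errors -- signed deviations of selection endpoints from target centers
    """
    # Effective width: 4.133 * standard deviation of endpoint deviations.
    w_e = 4.133 * statistics.stdev(endpoint_errors)
    # Effective index of difficulty per trial (Shannon formulation).
    ids = [math.log2(d / w_e + 1.0) for d in distances]
    # Throughput as mean index of difficulty over mean movement time.
    return statistics.mean(ids) / statistics.mean(movement_times)

# Hypothetical example values, purely for illustration:
print(effective_throughput(
    distances=[10.0, 10.0, 10.0, 10.0],       # degrees
    movement_times=[1.05, 1.12, 0.98, 1.20],  # seconds
    endpoint_errors=[0.4, -0.6, 0.9, -0.2],   # degrees from target center
))
```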
Pages: 16