User-Defined Gestures with Physical Props in Virtual Reality

Cited by: 4
Authors
Moran-Ledesma M. [1 ]
Schneider O. [2 ]
Hancock M. [2 ]
Affiliations
[1] Systems Design Engineering, University of Waterloo, Waterloo, ON
[2] Management Sciences, University of Waterloo, Waterloo, ON
Funding
Natural Sciences and Engineering Research Council of Canada
Keywords
3d physical props; agreement score; elicitation technique; gestural input; similarity measures; immersive interaction; virtual reality;
DOI
10.1145/3486954
Abstract
When interacting with virtual reality (VR) applications like CAD and open-world games, people may want to use gestures as a means of leveraging their knowledge from the physical world. However, people may prefer physical props over handheld controllers to input gestures in VR. We present an elicitation study where 21 participants chose from 95 props to perform manipulative gestures for 20 CAD-like and open-world game-like referents. When analyzing this data, we found existing methods for elicitation studies were insufficient to describe gestures with props, or to measure agreement with prop selection (i.e., agreement between sets of items). We proceeded by describing gestures as context-free grammars, capturing how different props were used in similar roles in a given gesture. We present gesture and prop agreement scores using a generalized agreement score that we developed to compare multiple selections rather than a single selection. We found that props were selected based on their resemblance to virtual objects and the actions they afforded; that gesture and prop agreement depended on the referent, with some referents leading to similar gesture choices, while others led to similar prop choices; and that a small set of carefully chosen props can support multiple gestures. © 2021 ACM.
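The abstract describes an agreement score generalized from single selections to sets of items (here, sets of props). A minimal sketch of that idea, assuming a pairwise set-similarity measure (Jaccard) averaged over all participant pairs; the function names, the choice of Jaccard, and the example selections are illustrative assumptions, not the paper's published formulation:
```python
# Hedged sketch: a set-based agreement score for one referent.
# Each participant's selection is a *set* of props; agreement is the mean
# pairwise set similarity (Jaccard here) across all participant pairs.
from itertools import combinations

def jaccard(a: set, b: set) -> float:
    """Similarity between two prop sets: |A ∩ B| / |A ∪ B|."""
    if not a and not b:
        return 1.0
    return len(a & b) / len(a | b)

def generalized_agreement(selections: list[set]) -> float:
    """Average pairwise similarity over all participant pairs for one referent."""
    pairs = list(combinations(selections, 2))
    if not pairs:
        return 0.0
    return sum(jaccard(a, b) for a, b in pairs) / len(pairs)

# Hypothetical example: prop selections by four participants for one referent.
selections = [
    {"foam cube", "ruler"},
    {"foam cube"},
    {"ruler"},
    {"foam cube", "ruler"},
]
print(f"Prop agreement: {generalized_agreement(selections):.2f}")
```
If the similarity measure is replaced by an exact-match indicator, this average over pairs reduces to the classic agreement rate used in gesture elicitation studies, which is the sense in which a set-based score generalizes single-selection agreement.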
Related Papers
50 records in total
  • [41] User-Defined Foot Gestures for Eyes-Free Interaction in Smart Shower Rooms
    Chen, Zhanming
    Tu, Huawei
    Wu, Huiyue
    INTERNATIONAL JOURNAL OF HUMAN-COMPUTER INTERACTION, 2023, 39 (20) : 4139 - 4161
  • [42] Interactive Auditory Mediated Reality: Towards User-defined Personal Soundscapes
    Haas, Gabriel
    Stemasov, Evgeny
    Rietzler, Michael
    Rukzio, Enrico
    PROCEEDINGS OF THE 2020 ACM DESIGNING INTERACTIVE SYSTEMS CONFERENCE (DIS 2020), 2020, : 2035 - 2050
  • [43] USER-DEFINED BENCHMARKS HELP EVALUATE IC PHYSICAL LIBRARIES
    HARVEYHORN, H
    ELECTRONIC DESIGN, 1993, 41 (21) : 80 - &
  • [44] User-Defined Gestures for Gestural Interaction: Extending from Hands to Other Body Parts
    Chen, Zhen
    Ma, Xiaochi
    Peng, Zeya
    Zhou, Ying
    Yao, Mengge
    Ma, Zheng
    Wang, Ci
    Gao, Zaifeng
    Shen, Mowei
    INTERNATIONAL JOURNAL OF HUMAN-COMPUTER INTERACTION, 2018, 34 (03) : 238 - 250
  • [45] User-defined telecooperation services
    Gruhn, V
    Herrmann, P
    Krumm, H
    1998 INTERNATIONAL CONFERENCE ON PARALLEL AND DISTRIBUTED SYSTEMS, PROCEEDINGS, 1998, : 590 - 598
  • [46] AXIOMS FOR USER-DEFINED OPERATORS
    PYLE, IC
    SOFTWARE-PRACTICE & EXPERIENCE, 1980, 10 (04): : 307 - 318
  • [47] Stable User-Defined Priorities
    Vargaftik, Shay
    Keslassy, Isaac
    Orda, Ariel
    IEEE INFOCOM 2017 - IEEE CONFERENCE ON COMPUTER COMMUNICATIONS, 2017,
  • [48] Toward Highly Flexible Inter-User Calibration of Myoelectric Control Models With User-Defined Hand Gestures
    Yuan, Yangyang
    Chen, Zihao
    Liu, Jionghui
    Chou, Chihhong
    Dai, Chenyun
    Jiang, Xinyu
    IEEE TRANSACTIONS ON MEDICAL ROBOTICS AND BIONICS, 2025, 7 (01): : 359 - 367
  • [49] Analysis of User-Defined Radar-Based Hand Gestures Sensed Through Multiple Materials
    Sluyters, Arthur
    Lambot, Sebastien
    Vanderdonckt, Jean
    Villarreal-Narvaez, Santiago
    IEEE ACCESS, 2024, 12 : 27895 - 27917
  • [50] A comparative study of user-defined handheld vs. freehand gestures for home entertainment environments
    Vatavu, Radu-Daniel
    JOURNAL OF AMBIENT INTELLIGENCE AND SMART ENVIRONMENTS, 2013, 5 (02) : 187 - 211