Eliciting Multimodal Gesture+Speech Interactions in a Multi-Object Augmented Reality Environment

Cited by: 4
Authors
Zhou, Xiaoyan [1 ]
Williams, Adam S. [1 ]
Ortega, Francisco R. [1 ]
Affiliations
[1] Colorado State University, Fort Collins, CO, USA
Funding
U.S. National Science Foundation
Keywords
elicitation; multimodal interaction; augmented reality; gesture and speech interaction; multi-object AR environment;
DOI
10.1145/3562939.3565637
Chinese Library Classification (CLC) code
TP3 [Computing technology; computer technology]
Discipline classification code
0812
Abstract
As augmented reality (AR) technology and hardware become more mature and affordable, researchers have been exploring more intuitive and discoverable interaction techniques for immersive environments. This paper investigates multimodal interaction for 3D object manipulation in a multi-object AR environment. To identify user-defined gestures, we conducted an elicitation study with 24 participants and 22 referents using an augmented reality headset. The study yielded 528 proposals and, after binning and ranking all gesture proposals, produced a winning gesture set of 25 gestures. We found that for the same task, the same gesture was preferred for both one- and two-object manipulation, although both hands were used in the two-object scenario. We present the gesture and speech results, along with the differences from similar studies conducted in a single-object AR environment. The study also explored the association between speech expressions and gesture strokes during object manipulation, which could improve recognizer efficiency in augmented reality headsets.
Pages: 10
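The abstract describes the elicitation pipeline only at a high level (24 participants x 22 referents = 528 proposals, binned and ranked into a 25-gesture winning set). The minimal Python sketch below illustrates one common way such binning and ranking can be done: group proposals per referent and pick the most frequent gesture bin. The proposal records, referent and gesture-bin names, and the implicit tie-breaking behavior are illustrative assumptions, not the paper's actual procedure.

from collections import Counter, defaultdict

# Hypothetical proposal records: one gesture proposal per participant per referent.
# The abstract reports 24 participants x 22 referents = 528 proposals in total.
proposals = [
    # (participant_id, referent, gesture_bin) -- names here are illustrative only
    (1, "move closer", "grab-and-drag"),
    (2, "move closer", "grab-and-drag"),
    (1, "rotate", "wrist-twist"),
    (2, "rotate", "grab-and-drag"),
    # ... remaining proposals omitted
]

def winning_gesture_set(records):
    """Bin proposals by referent, then rank bins by frequency per referent."""
    bins = defaultdict(Counter)
    for _participant, referent, gesture_bin in records:
        bins[referent][gesture_bin] += 1
    # The top-ranked bin per referent forms the winning gesture set; ties would
    # need an explicit tie-breaking rule, which is not modeled in this sketch.
    return {referent: counts.most_common(1)[0][0] for referent, counts in bins.items()}

print(winning_gesture_set(proposals))
# e.g. {'move closer': 'grab-and-drag', 'rotate': 'wrist-twist'}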