Exploring User-Defined Gestures as Input for Hearables and Recognizing Ear-Level Gestures with IMUs

Cited by: 0
Authors
Sato, Yukina [1 ]
Amesaka, Takashi [1 ]
Yamamoto, Takumi [1 ]
Watanabe, Hiroki [2 ]
Sugiura, Yuta [1 ]
Institutions
[1] Keio University, Yokohama City, Japan
[2] Future University Hakodate, Hakodate, Japan
DOI: 10.1145/3676503
Abstract
Hearables are highly functional earphone-type wearables; however, existing input methods using stand-alone hearables support only a limited number of commands, and there is a need to extend device operation through hand gestures. Previous research on hand input for hearables has addressed both user understanding and gesture recognition systems. However, the investigation of user understanding of hand input with hearables remains incomplete, and existing recognition systems have not demonstrated the ability to recognize user-defined gestures. In this study, we conducted a gesture elicitation study (GES) assuming hand input with hearables under six conditions (three interaction areas × two device shapes). We then extracted, from the user-defined gestures, ear-level gestures that the device's built-in IMU sensor could recognize, and investigated their recognition performance. Results of seated experiments showed gesture recognition rates of 91.0% for in-ear devices and 74.7% for ear-hook devices. © 2024 Owner/Author.
Related Papers
47 items in total
  • [1] Exploring user-defined gestures for lingual and palatal interaction
    Villarreal-Narvaez, Santiago
    Perez-Medina, Jorge Luis
    Vanderdonckt, Jean
    JOURNAL ON MULTIMODAL USER INTERFACES, 2023, 17 (03) : 167 - 185
  • [3] Exploring User-Defined Gestures to Control a Group of Four UAVs
    Peshkova, Ekaterina
    Hitz, Martin
    2017 26TH IEEE INTERNATIONAL SYMPOSIUM ON ROBOT AND HUMAN INTERACTIVE COMMUNICATION (RO-MAN), 2017, : 169 - 174
  • [4] User-Defined Gestures for Augmented Reality
    Piumsomboon, Thammathip
    Clark, Adrian
    Billinghurst, Mark
    Cockburn, Andy
    HUMAN-COMPUTER INTERACTION - INTERACT 2013, PT II, 2013, 8118 : 282 - 299
  • [5] User-Defined Gestures for Surface Computing
    Wobbrock, Jacob O.
    Morris, Meredith Ringel
    Wilson, Andrew D.
    CHI2009: PROCEEDINGS OF THE 27TH ANNUAL CHI CONFERENCE ON HUMAN FACTORS IN COMPUTING SYSTEMS, VOLS 1-4, 2009, : 1083 - 1092
  • [6] Exploring User Defined Gestures for Ear-Based Interactions
    Chen, Y.-C.
    Liao, C.-Y.
    Hsu, S.-W.
    Huang, D.-Y.
    Chen, B.-Y.
    Proceedings of the ACM on Human-Computer Interaction, 2020, 4 (ISS)
  • [7] Replicating User-defined Gestures for Text Editing
    Sukumar, Poorna Talkad
    Liu, Anqing
    Metoyer, Ronald
    PROCEEDINGS OF THE 2018 ACM INTERNATIONAL CONFERENCE ON INTERACTIVE SURFACES AND SPACES (ISS'18), 2018, : 97 - 106
  • [8] User-Defined Gestures for Elastic, Deformable Displays
    Troiano, Giovanni Maria
    Pedersen, Esben Warming
    Hornbaek, Kasper
    PROCEEDINGS OF THE 2014 INTERNATIONAL WORKING CONFERENCE ON ADVANCED VISUAL INTERFACES, AVI 2014, 2014, : 1 - 8
  • [9] User-Defined Motion Gestures for Mobile Interaction
    Ruiz, Jaime
    Li, Yang
    Lank, Edward
    29TH ANNUAL CHI CONFERENCE ON HUMAN FACTORS IN COMPUTING SYSTEMS, 2011, : 197 - 206
  • [10] Exploring User-Defined Gestures and Voice Commands to Control an Unmanned Aerial Vehicle
    Peshkova, Ekaterina
    Hitz, Martin
    Ahlstrom, David
    INTELLIGENT TECHNOLOGIES FOR INTERACTIVE ENTERTAINMENT, INTETAIN 2016, 2017, 178 : 47 - 62