Exploring User-Defined Gestures as Input for Hearables and Recognizing Ear-Level Gestures with IMUs

Cited by: 0
Authors
Sato, Yukina [1 ]
Amesaka, Takashi [1 ]
Yamamoto, Takumi [1 ]
Watanabe, Hiroki [2 ]
Sugiura, Yuta [1 ]
Affiliations
[1] Keio University, Yokohama City, Japan
[2] Future University Hakodate, Hakodate, Japan
DOI
10.1145/3676503
Abstract
Hearables are highly functional earphone-type wearables; however, input methods on stand-alone hearables support only a limited number of commands, creating a need to extend device operation through hand gestures. Prior work on hand input for hearables has addressed both user understanding and gesture recognition systems. However, user understanding of hand input with hearables remains incomplete, and existing recognition systems have not been shown to recognize user-defined gestures. In this study, we conducted a gesture elicitation study (GES) for hand input with hearables under six conditions (three interaction areas × two device shapes). We then extracted, from the user-defined gestures, ear-level gestures that the device's built-in IMU sensor can recognize and evaluated recognition performance. In seated experiments, the gesture recognition rate was 91.0% for in-ear devices and 74.7% for ear-hook devices. © 2024 Owner/Author.
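The abstract describes recognizing ear-level gestures from a hearable's built-in IMU. As a minimal illustrative sketch only (the paper's actual pipeline is not given here), windowed IMU data can be reduced to simple statistical features and classified with a nearest-centroid rule; the window size, feature set, and synthetic "tap"/"swipe" data below are all assumptions for demonstration:

```python
import numpy as np

def extract_features(window):
    """Reduce an IMU window of shape (n_samples, 6) — 3-axis accel +
    3-axis gyro — to per-axis mean and standard deviation features."""
    return np.concatenate([window.mean(axis=0), window.std(axis=0)])

class NearestCentroidGestureClassifier:
    """Toy classifier: one feature centroid per gesture label."""

    def fit(self, windows, labels):
        feats = np.array([extract_features(w) for w in windows])
        self.labels_ = sorted(set(labels))
        self.centroids_ = np.array(
            [feats[[l == c for l in labels]].mean(axis=0) for c in self.labels_]
        )
        return self

    def predict(self, window):
        # Assign the label whose centroid is closest in feature space.
        d = np.linalg.norm(self.centroids_ - extract_features(window), axis=1)
        return self.labels_[int(np.argmin(d))]

# Synthetic demo data (hypothetical, not from the paper):
# "tap" windows contain a sharp acceleration spike, "swipe" a smooth ramp.
rng = np.random.default_rng(0)

def make(kind):
    w = rng.normal(0.0, 0.05, size=(50, 6))
    if kind == "tap":
        w[25, :3] += 3.0
    else:
        w[:, :3] += np.linspace(0.0, 1.0, 50)[:, None]
    return w

train_w = [make("tap") for _ in range(5)] + [make("swipe") for _ in range(5)]
train_y = ["tap"] * 5 + ["swipe"] * 5
clf = NearestCentroidGestureClassifier().fit(train_w, train_y)
```

A real system would replace the hand-crafted features and centroid rule with whatever classifier the authors evaluated; this sketch only shows the general window-features-classify structure common to IMU gesture recognition.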