TeethFa: Real-Time, Hand-Free Teeth Gestures Interaction Using Fabric Sensors

Cited: 0
Authors
Wu, Yuan [1 ,2 ]
Bai, Shoudu [1 ,2 ]
Fu, Meiqin [1 ,2 ]
Hu, Xinrong [1 ,2 ]
Zhong, Weibing [3 ]
Ding, Lei [1 ,2 ]
Chen, Yanjiao [4 ]
Affiliations
[1] Wuhan Text Univ, Sch Comp Sci & Artificial Intelligence, Wuhan 430200, Peoples R China
[2] Wuhan Text Univ, Engn Res Ctr Hubei Prov Clothing Informat, Wuhan 430200, Peoples R China
[3] Wuhan Text Univ, Key Lab Text Fiber & Prod, Minist Educ, Wuhan 430200, Peoples R China
[4] Zhejiang Univ, Coll Elect Engn, Hangzhou 310027, Peoples R China
Source
IEEE INTERNET OF THINGS JOURNAL | 2024, Vol. 11, No. 21
Keywords
Human-machine interaction; teeth gestures recognition; wearable fabric sensors
D O I
10.1109/JIOT.2024.3434657
CLC classification number
TP [Automation technology, computer technology]
Discipline classification code
0812
Abstract
The interaction mode of smart eyewear has garnered significant research attention. Most smart eyewear relies on touchpads for user interaction. This article identifies a drawback arising from the use of touchpads, which can be obtrusive and unfriendly to users. In this article, we propose TeethFa, a novel fabric sensor-based system for recognizing teeth gestures. TeethFa serves as a hands-free interaction method for smart eyewear. TeethFa utilizes fabric sensors embedded in the glasses frame to capture pressure changes induced by facial muscle movements linked to teeth movements. This enables the identification of subtle teeth gestures. To detect teeth gestures, TeethFa designs a novel template-based signal segmentation method to determine the boundary of teeth gestures from fabric sensors, even in the presence of motion interference. To improve TeethFa's generalization, we employ a meta-learning technique based on generalization adjustment to extend the model to new users. We conduct extensive experiments to assess TeethFa's performance on 30 volunteers. The results demonstrate that our system accurately identifies five different teeth gestures with an average accuracy of 93.57%, and even for new users, the accuracy can reach 89.58%. TeethFa shows promise in offering a new interaction paradigm for smart eyewear in the future.
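The abstract does not detail the template-based segmentation method. Purely as an illustration of the general idea, the sketch below locates candidate gesture boundaries in a 1-D pressure signal by sliding a gesture template and thresholding the normalized (Pearson) correlation. All function names, the threshold value, and the synthetic data are hypothetical assumptions, not taken from the paper.

```python
import numpy as np

def segment_gestures(signal, template, threshold=0.8):
    """Slide `template` over a 1-D pressure signal and return candidate
    gesture windows (start, end) where the normalized correlation of the
    window with the template exceeds `threshold`."""
    t = (template - template.mean()) / (template.std() + 1e-8)
    w = len(template)
    bounds = []
    i = 0
    while i + w <= len(signal):
        win = signal[i:i + w]
        s = (win - win.mean()) / (win.std() + 1e-8)
        score = float(np.dot(s, t)) / w  # Pearson correlation, in [-1, 1]
        if score > threshold:
            bounds.append((i, i + w))
            i += w  # skip past the detected gesture
        else:
            i += 1
    return bounds

# Synthetic example: one bump-shaped "gesture" embedded in sensor noise.
rng = np.random.default_rng(0)
template = np.sin(np.linspace(0, np.pi, 50))
signal = rng.normal(0, 0.05, 300)
signal[100:150] += template
bounds = segment_gestures(signal, template)
print(bounds)  # one detected window near samples 100-150
```

The paper's actual method additionally handles motion interference, which a plain correlation threshold like this does not address.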
Pages: 35223-35237 (15 pages)