FEEL TECH Wear: Enhancing Mixed Reality Experience with Wrist to Finger Haptic

Cited: 0
Authors
Umehara, Rodan [1]
Taguchi, Harunobu [1]
Horie, Arata [1,2]
Kamiyama, Yusuke [3]
Sakamoto, Shin [3]
Ishikawa, Hironori [4]
Minamizawa, Kouta [1]
Affiliations
[1] Keio Univ, Grad Sch Media Design, Yokohama, Kanagawa, Japan
[2] Commissure Inc, Tokyo, Japan
[3] SPLINE DESIGN HUB Corp, Tokyo, Japan
[4] NTT DOCOMO INC, Yokosuka, Kanagawa, Japan
Keywords
mixed reality; haptics; skin-stretch; vibration; haptic attribution; palm free;
DOI
10.1145/3641517.3664395
CLC Number
TP3 [Computing Technology, Computer Technology];
Subject Classification Code
0812;
Abstract
FEEL TECH Wear is a system that facilitates haptic interactions while keeping most of the palm free, by presenting directional force through rotational skin-stretch distribution feedback to the wrist and providing texture sensation through vibration feedback to the fingertips. With advancements in hand tracking and passthrough technologies, hand interactions in Mixed Reality (MR) environments have become more accessible, necessitating palm-free haptic feedback methods that do not hinder interactions with real objects or impair vision-based hand tracking. The hardware of FEEL TECH Wear primarily consists of two components: a hand-mounted device for each hand and a control unit located at the back of the head. The hand-mounted device is equipped with four channels of rotational skin-stretch tactors at the wrist and vibration tactors at the thumb and index finger. Using FEEL TECH Wear, three applications have been realized: haptic feedback for virtual objects, haptic augmentation for real objects, and haptic guidance towards objects.
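The abstract does not state how a directional force is distributed across the four wrist channels. As a rough illustration only, the following Python sketch assumes four skin-stretch tactors evenly spaced around the wrist (0, 90, 180, 270 degrees) and maps a planar force vector to per-channel intensities, with a separate clamped amplitude for fingertip vibration; the channel placement, function names, and mapping are hypothetical and are not taken from the paper.

    import math

    # Hypothetical sketch (not from the paper): distribute a planar force
    # direction across four wrist-mounted skin-stretch channels assumed to
    # sit at 0, 90, 180, and 270 degrees around the wrist, and scale a
    # fingertip vibration amplitude from a texture-roughness value.

    TACTOR_ANGLES_DEG = (0.0, 90.0, 180.0, 270.0)  # assumed channel placement

    def skin_stretch_channels(force_x: float, force_y: float) -> list[float]:
        """Map a 2D force vector (arbitrary units) to per-channel intensities in [0, 1]."""
        magnitude = math.hypot(force_x, force_y)
        if magnitude == 0.0:
            return [0.0] * len(TACTOR_ANGLES_DEG)
        force_angle = math.atan2(force_y, force_x)
        intensities = []
        for angle_deg in TACTOR_ANGLES_DEG:
            # A channel is driven in proportion to how well it is aligned
            # with the force direction; opposing channels stay silent.
            alignment = math.cos(force_angle - math.radians(angle_deg))
            intensities.append(max(0.0, alignment) * min(1.0, magnitude))
        return intensities

    def fingertip_vibration(roughness: float) -> float:
        """Clamp a texture-roughness estimate to a vibration amplitude in [0, 1]."""
        return max(0.0, min(1.0, roughness))

    if __name__ == "__main__":
        # Example: a force pointing up and to the right mostly drives the
        # 0-degree and 90-degree channels.
        print(skin_stretch_channels(0.6, 0.6))
        print(fingertip_vibration(0.4))

A cosine-weighted split of this kind is one simple way to render a continuous direction with a small number of discrete tactors; the paper itself may use a different distribution scheme.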
Pages: 2