Right mix of speech and non-speech: hybrid auditory feedback in mobility assistance of the visually impaired

Cited by: 0
Authors
Ibrar Hussain
Ling Chen
Hamid Turab Mirza
Gencai Chen
Saeed-Ul Hassan
Institutions
[1] Zhejiang University, College of Computer Science
[2] COMSATS Institute of Information Technology, Department of Computer Science
[3] Information Technology University
Source
Universal Access in the Information Society | 2015, Vol. 14
Keywords
Mobility; Auditory feedback; Speech; Non-speech; Hybrid; Visually impaired;
DOI
Not available
Abstract
Despite growing awareness of the mobility issues surrounding auditory interfaces used by visually impaired people, designers still face challenges when creating sound for such interfaces. This paper presents a new hybrid auditory feedback approach that converts frequently used speech instructions into non-speech sounds (i.e., spearcons), based on how often users travel a route and how often a sound is repeated. Using a within-subject design, twelve blind participants carried out a task with a mobility assistant application in an indoor environment. The study results show that the hybrid auditory feedback approach is more effective than non-speech-only feedback and more pleasant than repetitive speech-only feedback, and that it can substantially improve user experience. These findings may help researchers and practitioners adopt hybrid auditory feedback, rather than speech-only or non-speech-only feedback, when designing accessibility/assistive products and systems.
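The abstract describes switching an instruction from speech to its spearcon once the user has heard it often enough. A minimal sketch of that idea is shown below; the repetition threshold, class name, and counter logic are assumptions for illustration, not details taken from the paper.

```python
# Hypothetical sketch of hybrid auditory feedback: play full speech for
# unfamiliar instructions, then switch to the spearcon (a compressed
# speech cue) once the instruction has been repeated enough times.
# The threshold value is an assumption, not from the paper.

REPETITION_THRESHOLD = 3  # assumed number of speech plays before switching


class HybridAuditoryFeedback:
    def __init__(self, threshold=REPETITION_THRESHOLD):
        self.threshold = threshold
        self.play_counts = {}  # instruction text -> times already played

    def select_cue(self, instruction):
        """Return 'speech' while the instruction is unfamiliar,
        'spearcon' once it has been repeated past the threshold."""
        count = self.play_counts.get(instruction, 0)
        self.play_counts[instruction] = count + 1
        return "speech" if count < self.threshold else "spearcon"


feedback = HybridAuditoryFeedback()
cues = [feedback.select_cue("Turn left at the corridor") for _ in range(5)]
print(cues)  # early plays are speech; later repetitions become spearcons
```

Per-instruction counting means a newly encountered instruction still gets full speech even after other instructions have already switched to spearcons, matching the paper's emphasis on converting only frequently used instructions.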
Pages: 527-536 (9 pages)
Related papers
50 items in total
  • [31] Development of the auditory N1 from age three to adulthood: Comparing speech and non-speech stimuli.
    Pang, EW
    Taylor, MJ
    JOURNAL OF COGNITIVE NEUROSCIENCE, 1999, : 98 - 98
  • [32] Electrical brain imaging evidences left auditory cortex involvement in speech and non-speech discrimination based on temporal features
    Tino Zaehle
    Lutz Jancke
    Martin Meyer
    Behavioral and Brain Functions, 3
  • [33] Right-shift for non-speech motor processing in adults who stutter
    Neef, Nicole E.
    Jung, Kristina
    Rothkegel, Holger
    Pollok, Bettina
    von Gudenberg, Alexander Wolff
    Paulus, Walter
    Sommer, Martin
    CORTEX, 2011, 47 (08) : 945 - 954
  • [34] Auditory Processing of Non-speech Stimuli by Children in Dual-Language Immersion Programs
    Jones, Chloe
    Collin, Elizabeth
    Kepinska, Olga
    Hancock, Roeland
    Caballero, Jocelyn
    Zekelman, Leo
    Vandermosten, Maaike
    Hoeft, Fumiko
    FRONTIERS IN PSYCHOLOGY, 2021, 12
  • [35] Impaired non-speech auditory processing at a pre-reading age is a risk-factor for dyslexia but not a predictor: An ERP study
    Plakas, Anna
    van Zuijen, Titia
    van Leeuwen, Theo
    Thomson, Jennifer M.
    van der Leij, Aryan
    CORTEX, 2013, 49 (04) : 1034 - 1045
  • [36] Auditory Displays in Human–Machine Interfaces of Mobile Robots for Non-Speech Communication with Humans
    Gunnar Johannsen
    Journal of Intelligent and Robotic Systems, 2001, 32 : 161 - 169
  • [37] Census-based Vision for Auditory Depth Images and Speech Navigation of Visually Impaired Users
    Pei, Soo-Chang
    Wang, Yu-Ying
    IEEE TRANSACTIONS ON CONSUMER ELECTRONICS, 2011, 57 (04) : 1883 - 1890
  • [38] Children with amblyaudia show less flexibility in auditory cortical entrainment to periodic non-speech sounds
    Momtaz, Sara
    Moncrieff, Deborah
    Ray, Meredith A.
    Bidelman, Gavin M.
    INTERNATIONAL JOURNAL OF AUDIOLOGY, 2023, 62 (10) : 920 - 926
  • [39] Auditory displays in human-machine interfaces of mobile robots for non-speech communication with humans
    Johannsen, G
    JOURNAL OF INTELLIGENT & ROBOTIC SYSTEMS, 2001, 32 (02) : 161 - 169
  • [40] Takeover and Handover Requests using Non-Speech Auditory Displays in Semi-Automated Vehicles
    Kutchek, Kyle
    Jeon, Myounghoon
    CHI EA '19 EXTENDED ABSTRACTS: EXTENDED ABSTRACTS OF THE 2019 CHI CONFERENCE ON HUMAN FACTORS IN COMPUTING SYSTEMS, 2019,