Mobile Crowd Assisted Navigation for the Visually Impaired

Cited by: 3
Authors
Olmschenk, Greg [1 ,2 ]
Yang, Christopher [2 ]
Zhu, Zhigang [1 ,2 ]
Tong, Hanghang [3 ]
Seiple, William H. [4 ]
Affiliations
[1] CUNY, Grad Ctr, New York, NY 10016 USA
[2] CUNY City Coll, New York, NY USA
[3] Arizona State Univ, Tempe, AZ 85287 USA
[4] Lighthouse Guild, Arlene R Gordon Res Inst, New York, NY USA
Funding
US National Science Foundation;
Keywords
crowd sourcing; mobile; visually impaired; assistive technology; navigation;
DOI
10.1109/UIC-ATC-ScalCom-CBDCom-IoP.2015.69
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
The World Health Organization estimates that 285 million people are visually impaired worldwide: 39 million are blind and 246 million have low vision. To assist these users without encumbering them, our Crowd Assisted Navigation app is designed for smartphones (both iPhone and Android), by far the most commonly used mobile devices among people with low vision. Many individuals would sooner forget their wallets at home than their phones. A smartphone is readily accessible, and its use does not draw undue attention to the user's need for assistance. The app's primary objective is to help a visually impaired or blind user navigate from point A to point B through reliable directions given by an online community. The phone streams live video to a crowd of sighted users through our website. Crowd members then give directions from the website with the push of one of four arrow keys, indicating left, right, forward, or stop. The aggregation of these directions is relayed back to the user as audio.
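The abstract does not specify how the crowd's arrow-key inputs are combined before being relayed as audio. A minimal sketch of one plausible scheme, a majority vote over the keypresses received in a short time window with "stop" as the safe default, could look like the following (the function name and tie-breaking policy are illustrative assumptions, not the authors' method):

```python
from collections import Counter

# The four commands the crowd can issue, per the abstract.
DIRECTIONS = {"left", "right", "forward", "stop"}

def aggregate_directions(votes):
    """Combine crowd keypresses from one time window into a single command.

    votes: list of direction strings submitted by crowd members.
    Returns the majority direction; falls back to "stop" when there is
    no valid input or the top two directions tie, erring toward safety.
    """
    valid = [v for v in votes if v in DIRECTIONS]
    if not valid:
        return "stop"  # no crowd input: do not guess a movement
    ranked = Counter(valid).most_common()
    # A tie between the two most popular directions is ambiguous,
    # so halt the user rather than pick arbitrarily.
    if len(ranked) > 1 and ranked[0][1] == ranked[1][1]:
        return "stop"
    return ranked[0][0]
```

For example, `aggregate_directions(["left", "left", "forward"])` yields `"left"`, while conflicting input such as `["left", "right"]` yields `"stop"`. Defaulting to "stop" on ambiguity reflects the safety-critical nature of guiding a blind pedestrian.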
Pages: 324-327
Page count: 4
Related Papers
50 items in total
  • [41] Robot-Assisted Navigation for Visually Impaired through Adaptive Impedance and Path Planning
    Balatti, Pietro
    Ozdamar, Idil
    Sirintuna, Doganay
    Fortini, Luca
    Leonori, Mania
    Gandarias, Juan M.
    Ajoudani, Arash
    2024 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION, ICRA 2024, 2024, : 2310 - 2316
  • [42] A Review on Path Selection and Navigation Approaches Towards an Assisted Mobility of Visually Impaired People
    Nawaz, Waqas
    Khan, Kifayat Ullah
    Bashir, Khalid
    KSII TRANSACTIONS ON INTERNET AND INFORMATION SYSTEMS, 2020, 14 (08) : 3270 - 3294
  • [43] Mobile navigation guide for the visually disabled
    Huang, B
    Liu, N
TRANSPORTATION MANAGEMENT AND PUBLIC POLICY 2004, 2004, (1885): 28 - 34
  • [44] Developing a navigation aid for the frail and visually impaired
    Pontus Engelbrektsson
    I. C. Marianne Karlsson
    Blaithin Gallagher
    Heather Hunter
    Helen Petrie
    Ann-Marie O’Neill
    Universal Access in the Information Society, 2004, 3 (3-4) : 194 - 201
  • [45] Screen navigation system for visually impaired people
    Ohene-Djan, James Francis
    Fernando, Sandra A.
    JOURNAL OF ENABLING TECHNOLOGIES, 2018, 12 (03) : 114 - 128
  • [46] Navigation for Visually Impaired Using Haptic Feedback
    Fagernes, Siri
    Gronli, Tor-Morten
    HUMAN-COMPUTER INTERACTION: INTERACTION IN CONTEXT, HCI INTERNATIONAL 2018, PT II, 2018, 10902 : 347 - 356
  • [47] A Smart Wearable Navigation System for Visually Impaired
    Trent, Michael
    Abdelgawad, Ahmed
    Yelamarthi, Kumar
    SMART OBJECTS AND TECHNOLOGIES FOR SOCIAL GOOD, 2017, 195 : 333 - 341
  • [48] The development of the navigation system for visually impaired persons
    Hashimoto, H
    Magatani, K
    Yanashima, K
    PROCEEDINGS OF THE 23RD ANNUAL INTERNATIONAL CONFERENCE OF THE IEEE ENGINEERING IN MEDICINE AND BIOLOGY SOCIETY, VOLS 1-4: BUILDING NEW BRIDGES AT THE FRONTIERS OF ENGINEERING AND MEDICINE, 2001, 23 : 1481 - 1483
  • [49] An intelligent assistant for navigation of visually impaired people
    Bourbakis, NG
    Kavraki, D
    2ND ANNUAL IEEE INTERNATIONAL SYMPOSIUM ON BIOINFORMATICS AND BIOENGINEERING, PROCEEDINGS, 2001, : 230 - 235
  • [50] Event Venue Navigation for Visually Impaired People
    Watanabe, Chiemi
    Minagawa, Jun
    2019 IEEE INTERNATIONAL CONFERENCE ON BIG DATA AND SMART COMPUTING (BIGCOMP), 2019, : 599 - 604