A Multi-Sensor Fusion System for Improving Indoor Mobility of the Visually Impaired

Cited by: 0
Authors
Zhao, Yu [1 ]
Huang, Ran [1 ]
Hu, Biao [1 ]
Affiliations
[1] Beijing Univ Chem Technol, Coll Informat Sci & Technol, Beijing 100029, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Assistive navigation; semantic SLAM; visually impaired; BLIND PEOPLE; NAVIGATION; FRAMEWORK;
DOI
10.1109/cac48633.2019.8996578
CLC Classification
TP [Automation technology; computer technology]
Discipline Code
0812
Abstract
Independent movement in an unknown indoor environment is a challenging task for the visually impaired. Considering the connectivity of the corridor (room doors and stairs are all connected by the corridor), we propose in this paper an assistive navigation system that helps visually impaired users navigate corridor environments. Based on semantic simultaneous localization and mapping (SLAM), the corridor area is determined and mapped into a semantic map. Semantic path planning is then performed according to the lowest-energy-cost principle while taking safety into consideration. The YOLO neural network is employed to detect and identify common indoor landmarks such as Toilet, EXIT, and Staircase, and the system gives voice feedback about objects along its line of sight during movement. This interaction enhances the user's perception of objects and places and thereby improves travel decisions. A TurtleBot2 robot equipped with a laptop, an RPLIDAR A2, and a Microsoft Kinect V1 is used to validate the localization, mapping, and navigation modules, while the perception module uses a ZED stereo camera to capture objects and landmarks along its line of sight. The software modules of this system are implemented in the Robot Operating System (ROS) and tested in our lab building.
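The abstract does not give the exact cost function behind the lowest-energy-cost planning, so the following is a minimal sketch, assuming a corridor-connectivity graph whose edge cost combines walking distance with a weighted safety penalty (e.g., for staircases). The node names, the safety_weight parameter, and the edge_cost / lowest_cost_path helpers are hypothetical illustrations, not the paper's implementation; Dijkstra's algorithm then selects the route with the lowest accumulated cost.

    import heapq

    # Hypothetical semantic corridor graph: nodes are landmarks (rooms, corridors,
    # stairs, EXIT); each edge carries (distance in meters, safety penalty).
    # All values here are made up for illustration.
    GRAPH = {
        "Room_301":   [("Corridor_A", (3.0, 0.0))],
        "Corridor_A": [("Room_301", (3.0, 0.0)),
                       ("Corridor_B", (12.0, 0.0)),
                       ("Staircase", (6.0, 2.0))],   # stairs carry a safety penalty
        "Corridor_B": [("Corridor_A", (12.0, 0.0)),
                       ("Toilet", (4.0, 0.0)),
                       ("EXIT", (8.0, 0.0))],
        "Staircase":  [("Corridor_A", (6.0, 2.0))],
        "Toilet":     [("Corridor_B", (4.0, 0.0))],
        "EXIT":       [("Corridor_B", (8.0, 0.0))],
    }

    def edge_cost(distance_m, safety_penalty, safety_weight=1.5):
        """Energy-style cost: walking distance plus a weighted safety term."""
        return distance_m + safety_weight * safety_penalty

    def lowest_cost_path(graph, start, goal):
        """Dijkstra over the semantic graph; returns (total cost, node sequence)."""
        frontier = [(0.0, start, [start])]
        visited = set()
        while frontier:
            cost, node, path = heapq.heappop(frontier)
            if node == goal:
                return cost, path
            if node in visited:
                continue
            visited.add(node)
            for nxt, (dist, penalty) in graph[node]:
                if nxt not in visited:
                    heapq.heappush(frontier,
                                   (cost + edge_cost(dist, penalty), nxt, path + [nxt]))
        return float("inf"), []

    if __name__ == "__main__":
        cost, route = lowest_cost_path(GRAPH, "Room_301", "EXIT")
        print(f"Planned route: {' -> '.join(route)} (cost {cost:.1f})")

On this example graph the planner returns Room_301 -> Corridor_A -> Corridor_B -> EXIT; raising safety_weight would bias routes further away from penalized edges such as staircases, which is one simple way to fold safety into the lowest-cost criterion.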
Pages: 2950-2955
Number of pages: 6
Related Papers
50 records in total
  • [41] Research on the role of multi-sensor system information fusion in improving hardware control accuracy of intelligent system
    Li, Xin
    Li, Yuesong
    NONLINEAR ENGINEERING - MODELING AND APPLICATION, 2024, 13 (01):
  • [42] Mobility Recognition System For The Visually Impaired
    Abdullah, Shapina
    Noor, Noorhayati Mohamed
    Ghazali, Mohd Zaki
    2014 IEEE 2ND INTERNATIONAL SYMPOSIUM ON TELECOMMUNICATION TECHNOLOGIES (ISTT), 2014, : 362 - 367
  • [43] Internet of Things Based Multi-Sensor Fusion For Assistive Mobility Devices
    Daniel, Oladele Ayo
    Markus, Elisha Didam
    Abu-Mahfouz, Adnan M.
    2021 CONFERENCE ON INFORMATION COMMUNICATIONS TECHNOLOGY AND SOCIETY (ICTAS), 2021, : 115 - 120
  • [44] Embedded System Vehicle Based on Multi-Sensor Fusion
    Tong, Rui
    Jiang, Quan
    Zou, Zuqi
    Hu, Tao
    Li, Tianhao
    IEEE ACCESS, 2023, 11 : 50334 - 50349
  • [45] A System for Activity Recognition Using Multi-Sensor Fusion
    Gao, Lei
    Bourke, Alan K.
    Nelson, John
    2011 ANNUAL INTERNATIONAL CONFERENCE OF THE IEEE ENGINEERING IN MEDICINE AND BIOLOGY SOCIETY (EMBC), 2011, : 7869 - 7872
  • [46] Calibration of multi-sensor fusion for autonomous vehicle system
    Lu, Yongkang
    Zhong, Wenjian
    Li, Yanzhou
    INTERNATIONAL JOURNAL OF VEHICLE DESIGN, 2023, 91 (1-3) : 248 - 262
  • [47] Target Tracking System for Multi-sensor Data Fusion
    Ma, Ke
    Zhang, Hanguang
    Wang, Rentao
    Zhang, Zhimin
    PROCEEDINGS OF 2017 IEEE 2ND INFORMATION TECHNOLOGY, NETWORKING, ELECTRONIC AND AUTOMATION CONTROL CONFERENCE (ITNEC), 2017, : 1768 - 1772
  • [48] Multi-sensor fusion UAV dynamic landing system
    Zhang Z.-H.
    Sun W.
    Zhao C.-Y.
    Guangxue Jingmi Gongcheng/Optics and Precision Engineering, 2017, 25 : 151 - 159
  • [49] Fuzzy Reliability Analysis of A Multi-sensor Fusion System
    Jiang, MingHua
    Hu, Ming
    Peng, Tao
    Ding, YiXiang
    FIFTH INTERNATIONAL CONFERENCE ON FUZZY SYSTEMS AND KNOWLEDGE DISCOVERY, VOL 3, PROCEEDINGS, 2008, : 81 - 84
  • [50] Observable Degree Analysis for Multi-Sensor Fusion System
    Hu, Zhentao
    Chen, Tianxiang
    Ge, Quanbo
    Wang, Hebin
    SENSORS, 2018, 18 (12)