A Multi-Sensor Fusion System for Improving Indoor Mobility of the Visually Impaired

Cited: 0
Authors
Zhao, Yu [1]
Huang, Ran [1]
Hu, Biao [1]
Affiliations
[1] Beijing Univ Chem Technol, Coll Informat Sci & Technol, Beijing 100029, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Assistive navigation; semantic SLAM; visually impaired; BLIND PEOPLE; NAVIGATION; FRAMEWORK
DOI
10.1109/cac48633.2019.8996578
Chinese Library Classification (CLC)
TP [Automation technology; computer technology]
Discipline code
0812
Abstract
Independent movement in an unknown indoor environment is a challenging task for the visually impaired. Considering that corridors connect the key features of a building (room doors and stairs all open onto the corridor), this paper proposes an assistive navigation system to help visually impaired users travel through corridor environments. Based on semantic simultaneous localization and mapping (SLAM), the corridor area is determined and projected onto a semantic map. Semantic path planning is then performed according to the lowest-energy-cost principle, with safety taken into consideration. The YOLO neural network is employed to detect and identify common indoor landmarks such as toilets, exits, and staircases, and the system gives voice feedback about objects along the line of sight during movement. This interaction enhances the user's perception of objects and places and thereby improves travel decisions. A TurtleBot2 robot equipped with a laptop, an RPLIDAR A2, and a Microsoft Kinect V1 is used to validate the localization, mapping, and navigation modules, while the perception module uses a ZED stereo camera to capture objects and landmarks along the line of sight. The software modules of this system are implemented in the Robot Operating System (ROS) and tested in our lab building.
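The semantic path planning described in the abstract (lowest energy cost with safety taken into account) can be sketched as a shortest-path search over a small semantic graph. This is a minimal illustration only: the corridor layout, node names, and cost values below are assumptions for demonstration, not taken from the paper.

```python
import heapq

# Hypothetical semantic graph of a corridor environment: nodes are semantic
# places, edges carry an energy cost (e.g. travel distance) and a safety
# penalty (e.g. staircases are riskier for a visually impaired user).
# All names and numbers here are illustrative.
GRAPH = {
    "entrance":   [("corridor_a", 5.0, 0.0)],
    "corridor_a": [("entrance", 5.0, 0.0), ("toilet", 3.0, 0.0),
                   ("staircase", 2.0, 4.0), ("corridor_b", 6.0, 0.0)],
    "toilet":     [("corridor_a", 3.0, 0.0)],
    "staircase":  [("corridor_a", 2.0, 4.0), ("exit", 1.0, 4.0)],
    "corridor_b": [("corridor_a", 6.0, 0.0), ("exit", 4.0, 0.0)],
    "exit":       [("staircase", 1.0, 4.0), ("corridor_b", 4.0, 0.0)],
}

def plan(start, goal):
    """Dijkstra over the semantic graph, minimizing energy + safety cost."""
    pq = [(0.0, start, [start])]
    seen = set()
    while pq:
        cost, node, path = heapq.heappop(pq)
        if node == goal:
            return cost, path
        if node in seen:
            continue
        seen.add(node)
        for nbr, energy, safety in GRAPH[node]:
            if nbr not in seen:
                heapq.heappush(pq, (cost + energy + safety, nbr, path + [nbr]))
    return float("inf"), []

cost, path = plan("entrance", "exit")
print(cost, path)  # the safety penalty steers the route away from the stairs
```

Because each edge weight combines energy with a safety penalty, the search prefers the longer corridor route over the shorter but riskier staircase route, mirroring the paper's stated principle.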
Pages: 2950-2955 (6 pages)