Real-time Vision-aiding for Reliable 3D Indoor Location

Times Cited: 0
Authors:
Kazemipur, Bashir [1 ]
Syed, Zainab [2 ]
Georgy, Jacques [3 ]
El-Sheimy, Naser [4 ]
Affiliations:
[1] Univ Calgary, Dept Geomat Engn, InvenSense Canada, Calgary, AB T2N 1N4, Canada
[2] InvenSense Canada, Nav Engn, Boston, MA USA
[3] InvenSense Canada, Ebisu, Japan
[4] Univ Calgary, MMSS Res Grp, Calgary, AB T2N 1N4, Canada
Keywords:
DOI: not available
Chinese Library Classification: TP7 [Remote Sensing Technology]
Subject Classification Codes: 081102; 0816; 081602; 083002; 1404
Abstract:
The problem of obtaining a long-term accurate positioning solution in indoor environments has become a prominent topic in both industry and academia in recent years. In the absence of information from the Global Positioning System (GPS), the inertial sensors within a smartphone can be used to provide a relative navigation solution. However, these onboard Micro-Electro-Mechanical Systems (MEMS) based sensors suffer from various sensor errors that cause the inertial-only solution to deteriorate rapidly. As such, the inertial positioning solution must be constrained when long-term navigation is needed. GPS positions and velocities, together with WiFi positions, are the most important forms of updates available for the inertial solution. However, updates from these two sources depend on external signals and infrastructure that may not always be available. Another problem specific to smartphone-based navigation stems from the fact that a user does not hold the device in a single, unchanging orientation; the device is free to change orientation with respect to both the user and the direction of motion. To overcome these limitations, researchers are looking at other means of constraining the inertial-only solution. One commercial product that attempts to fill the niche of a truly portable, low-cost, long-term accurate indoor positioning solution is the InvenSense Positioning App (IPA) developed by InvenSense Canada. The IPA is an inertial-based system running on smartphones that provides a 3D navigation solution even in the absence of GPS information. The IPA uses proprietary and patented algorithms to estimate the misalignment of the device with respect to the moving platform, making it agnostic to any specific device orientation. A rich source of information about the outside world can be obtained from the device's camera. Nearly all devices have at least one camera, which has thus far been largely neglected as a navigation aid. Parameters extracted from the stream of images are used to aid several modules within the IPA. The vision-aiding module performs context classification to provide the device angle with respect to the platform, height-change information in different scenarios, as well as static-period and fidgeting detection. This information is used by the IPA in the form of misalignment, external height, and zero-velocity updates. The results of integrating the vision-aiding module with the IPA show significant improvement in many cases. This work is patent pending.
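For illustration only, the sketch below shows one way a camera stream could supply zero-velocity updates to a drifting inertial velocity estimate: consecutive frames that barely change indicate a static period, which is then used to damp the velocity. The frame-differencing test, the thresholds, and the simple complementary correction are assumptions for this sketch; the abstract states that the IPA's actual algorithms are proprietary and patented, and they are not reproduced here.

```python
# Minimal sketch (NOT the IPA's algorithm): vision-detected static periods
# used as zero-velocity updates (ZUPTs) for an inertial velocity estimate.
# Thresholds, gain, and the frame-differencing test are illustrative assumptions.
import numpy as np

FRAME_DIFF_THRESHOLD = 2.0   # mean absolute intensity change below which frames are "identical" (assumed)
ZUPT_GAIN = 0.5              # fraction of the velocity removed per detected static frame (assumed)


def is_static(prev_frame: np.ndarray, curr_frame: np.ndarray) -> bool:
    """Declare the device static if consecutive grayscale frames barely change."""
    diff = np.mean(np.abs(curr_frame.astype(np.float32) - prev_frame.astype(np.float32)))
    return diff < FRAME_DIFF_THRESHOLD


def apply_zupt(velocity: np.ndarray, static: bool, gain: float = ZUPT_GAIN) -> np.ndarray:
    """Pull the inertial velocity estimate toward zero during static periods."""
    return (1.0 - gain) * velocity if static else velocity


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    prev = rng.integers(0, 256, size=(120, 160)).astype(np.uint8)
    curr = prev.copy()                    # identical frames -> static period detected
    v = np.array([0.3, -0.1, 0.05])       # drifting inertial velocity estimate (m/s)
    v = apply_zupt(v, is_static(prev, curr))
    print("corrected velocity:", v)
```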
Pages: 2118-2131
Page count: 14