3D Modeling of Indoor Environments Using KINECT Sensor

Cited: 0
Authors
Majdi, Arafa [1 ]
Bakkay, Mohamed Chafik [1 ]
Zagrouba, Ezzeddine [1 ]
Affiliations
[1] Univ Tunis El Manar, Inst Super Informat, Lab Riadi, Equipe SIIVA, Ariana 2080, Tunisia
Keywords
3D modeling; RGB; Kinect; indoor environment; transparent objects;
DOI
Not available
Chinese Library Classification
TP [Automation Technology, Computer Technology]
Discipline Classification Code
0812
Abstract
3D scene modeling of indoor environments has attracted significant interest in recent years. The resulting photo-realistic renderings of internal structures are used in a wide variety of civilian and military applications, such as training, simulation, heritage conservation, and localization and mapping. However, building such complex maps poses significant challenges for both the computer vision and robotics communities (low lighting and textureless structures, transparent and specular surfaces, registration and fusion problems, coverage of all details, real-time constraints, etc.). Recently, the Microsoft Kinect sensor, originally developed as a gaming interface, has received a great deal of attention for its ability to produce high-quality depth maps in real time. However, this active sensor fails completely on transparent and specular surfaces for several technical reasons. Since these objects should be included in the 3D model, we have investigated methods to inspect them without any modification of the hardware. In particular, the passive Structure from Motion (SfM) technique can be efficiently integrated into the reconstruction process to improve the detection of these surfaces. Specifically, we propose to fill the holes in the depth map provided by the Kinect's infrared (IR) sensor with new values passively retrieved by SfM. This provides a large amount of additional depth information in a relatively short time from two consecutive RGB frames. To preserve the real-time character of our approach, we select key-RGB-images instead of using all available frames. The experiments show a strong improvement in indoor reconstruction as well as in transparent object inspection.
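Since the abstract describes the SfM-based hole-filling step only at a high level, the following minimal Python/OpenCV sketch illustrates one way such a step could look: triangulate sparse depth from two consecutive RGB frames, rescale it against the Kinect depth values that are valid, and write the result into hole pixels. The function name sfm_fill_depth_holes, the intrinsic matrix K, and the ORB/RANSAC parameters are illustrative assumptions, not the authors' implementation.

```python
# Illustrative sketch (assumed, not the paper's code): fill Kinect depth-map
# holes with depth triangulated by SfM from two consecutive RGB frames.
import cv2
import numpy as np


def sfm_fill_depth_holes(rgb_prev, rgb_curr, depth_curr, K):
    """Return depth_curr with hole pixels (value 0) filled at sparse feature
    locations whose depth could be triangulated from the two RGB frames.
    K is the 3x3 RGB camera intrinsic matrix; all thresholds are assumptions."""
    # 1. Sparse feature matching between the two consecutive RGB frames.
    orb = cv2.ORB_create(nfeatures=2000)
    kp1, des1 = orb.detectAndCompute(rgb_prev, None)
    kp2, des2 = orb.detectAndCompute(rgb_curr, None)
    if des1 is None or des2 is None:
        return depth_curr.copy()
    matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(des1, des2)
    if len(matches) < 8:
        return depth_curr.copy()
    pts1 = np.float32([kp1[m.queryIdx].pt for m in matches])
    pts2 = np.float32([kp2[m.trainIdx].pt for m in matches])

    # 2. Relative camera motion from the essential matrix (RANSAC inliers only).
    E, mask = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC,
                                   prob=0.999, threshold=1.0)
    if E is None:
        return depth_curr.copy()
    inl = mask.ravel() > 0
    pts1, pts2 = pts1[inl], pts2[inl]
    if len(pts1) < 5:
        return depth_curr.copy()
    _, R, t, pose_mask = cv2.recoverPose(E, pts1, pts2, K)
    ok = pose_mask.ravel() > 0
    pts1, pts2 = pts1[ok], pts2[ok]
    if len(pts1) < 2:
        return depth_curr.copy()

    # 3. Triangulate matched points; SfM depth is known only up to scale.
    P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
    P2 = K @ np.hstack([R, t])
    X_h = cv2.triangulatePoints(P1, P2, pts1.T, pts2.T)   # 4 x N, frame-1 coords
    good = np.abs(X_h[3]) > 1e-8
    X = X_h[:3, good] / X_h[3, good]
    pts2 = pts2[good]
    z_sfm = (R @ X + t)[2]                                # depth in current frame

    # 4. Resolve the global SfM scale against valid Kinect depths.
    u = np.round(pts2[:, 0]).astype(int)
    v = np.round(pts2[:, 1]).astype(int)
    h, w = depth_curr.shape[:2]
    keep = (u >= 0) & (u < w) & (v >= 0) & (v < h) & (z_sfm > 0)
    u, v, z_sfm = u[keep], v[keep], z_sfm[keep]
    d_kin = depth_curr[v, u].astype(np.float32)
    known = d_kin > 0
    if known.sum() < 5:
        return depth_curr.copy()
    scale = np.median(d_kin[known] / z_sfm[known])

    # 5. Write scaled SfM depths into hole pixels (e.g. glass, mirrors).
    filled = depth_curr.astype(np.float32).copy()
    filled[v[~known], u[~known]] = scale * z_sfm[~known]
    return filled
```

In the pipeline described by the abstract, such a step would only be run on selected key-RGB-images (for instance, when feature displacement since the last key frame becomes large enough), which is what keeps the approach close to real time; the sparse filled samples would then presumably be densified over the detected transparent or specular regions before fusion into the model. The key-frame criterion and the densification step mentioned here are assumptions, not details taken from the paper.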
Pages: 67-72
Number of pages: 6
Related Papers
50 items in total
  • [41] Recognition of 3D object using Kinect
    Aigerim, Sagandykova
    Askhat, Askarbekuly
    Yedilkhan, Amirgaliyev
    2015 9TH INTERNATIONAL CONFERENCE ON APPLICATION OF INFORMATION AND COMMUNICATION TECHNOLOGIES (AICT), 2015, : 341 - 346
  • [42] 3D with Kinect
    Smisek, Jan
    Jancosek, Michal
    Pajdla, Tomas
    2011 IEEE INTERNATIONAL CONFERENCE ON COMPUTER VISION WORKSHOPS (ICCV WORKSHOPS), 2011,
  • [43] 3D scene reconstruction using Kinect
    Morana, M., 1600, Springer Verlag (260)
  • [44] Improved 3D Human Motion Capture Using Kinect Skeleton and Depth Sensor
    Bilesan, Alireza
    Komizunai, Shunsuke
    Tsujita, Teppei
    Konno, Atsushi
    JOURNAL OF ROBOTICS AND MECHATRONICS, 2021, 33 (06) : 1407 - 1421
  • [45] Multi Sensor 3D Indoor Localisation
    Ebner, Frank
    Fetzer, Toni
    Deinzer, Frank
    Koeping, Lukas
    Grzegorzek, Marcin
    2015 INTERNATIONAL CONFERENCE ON INDOOR POSITIONING AND INDOOR NAVIGATION (IPIN), 2015,
  • [46] 3D Semantic Modeling of Indoor Environments based on Point Clouds and Contextual Relationships
    Quijano, Angie
    Prieto, Flavio
    INGENIERIA, 2016, 21 (03): : 305 - 323
  • [47] Spatial subdivision of complex indoor environments for 3D indoor navigation
    Diakite, Abdoulaye A.
    Zlatanova, Sisi
    INTERNATIONAL JOURNAL OF GEOGRAPHICAL INFORMATION SCIENCE, 2018, 32 (02) : 213 - 235
  • [48] 3D modeling of complex environments
    El-Hakim, SF
    VIDEOMETRICS AND OPTICAL METHODS FOR 3D SHAPE MEASUREMENT, 2001, 4309 : 162 - 173
  • [49] 3D MAPPING OF INDOOR AND OUTDOOR ENVIRONMENTS USING APPLE SMART DEVICES
    Diaz-Vilarino, L.
    Tran, H.
    Frias, E.
    Balado, J.
    Khoshelham, K.
    XXIV ISPRS CONGRESS IMAGING TODAY, FORESEEING TOMORROW, COMMISSION IV, 2022, 43-B4 : 303 - 308
  • [50] Building 2D Maps with Integrated 3D and Visual Information using Kinect Sensor
    Brahmanage, Gayan
    Leung, Henry
    2019 IEEE INTERNATIONAL CONFERENCE ON INDUSTRIAL CYBER PHYSICAL SYSTEMS (ICPS 2019), 2019, : 218 - 223