FloW Vision: Depth Image Enhancement by Combining Stereo RGB-Depth Sensor

Cited by: 0
Authors
Waskitho, Suryo Aji [1 ]
Alfarouq, Ardiansyah [1 ]
Sukaridhoto, Sritrusta [2 ]
Pramadihanto, Dadet [1 ]
Affiliations
[1] Elect Engn Polytech Inst Surabaya, Dept Informat & Comp Engn, ER2C, Surabaya, Indonesia
[2] Elect Engn Polytech Inst Surabaya, Dept Multimedia Broadcasting Engn, Surabaya, Indonesia
Keywords
Humanoid Robot; FloW; Robot Vision; RGB-D Sensor; Depth Calibration; Computer Vision;
DOI
Not available
CLC Number
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Humans can recognize an object just by looking at their environment; this capability is a useful reference for designing a humanoid robot that can adapt to its surroundings. By knowing the field conditions that exist in such environments, the robot can understand which obstacles, or anything else in its path, can be passed. To do that, the robot's vision system needs the knowledge to understand the obstacles that exist around it. We investigate the improvements in depth estimation that can be achieved by merging coded apertures and stereo cameras. The results of this analysis are encouraging in the sense that coded apertures can, in some cases, provide valuable complementary information to stereo-vision-based depth estimation. We show that with this system it is possible to extract depth information robustly, by exploiting the inherent relation between the disparity and defocus cues, even for scene regions that are problematic for stereo matching.
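The record does not include the authors' implementation. As a minimal sketch of the general idea of combining a stereo disparity map with a defocus cue, the following Python/OpenCV snippet fills regions where stereo matching fails with a sharpness-based estimate. The function names (stereo_disparity, defocus_cue, fuse_depth), the Laplacian-based sharpness proxy, and the example file names are assumptions for illustration only, not the authors' actual pipeline.

import cv2
import numpy as np

def stereo_disparity(left_gray, right_gray, num_disparities=64, block_size=9):
    # Semi-global block matching (OpenCV); returns disparity in float pixels.
    matcher = cv2.StereoSGBM_create(
        minDisparity=0,
        numDisparities=num_disparities,  # must be a multiple of 16
        blockSize=block_size,
        P1=8 * block_size * block_size,
        P2=32 * block_size * block_size,
        uniquenessRatio=10,
        speckleWindowSize=100,
        speckleRange=2,
    )
    # SGBM outputs fixed-point disparities scaled by 16
    return matcher.compute(left_gray, right_gray).astype(np.float32) / 16.0

def defocus_cue(gray, ksize=9):
    # Crude defocus measure: smoothed magnitude of the Laplacian response.
    # In-focus regions respond strongly, blurred regions weakly.
    lap = cv2.Laplacian(gray.astype(np.float32), cv2.CV_32F)
    sharpness = cv2.GaussianBlur(np.abs(lap), (ksize, ksize), 0)
    return sharpness / (sharpness.max() + 1e-6)  # normalize to [0, 1]

def fuse_depth(left_gray, right_gray):
    # Keep the stereo estimate where matching succeeded; where it failed
    # (disparity <= 0), substitute a value ranked by the defocus cue and
    # rescaled to the observed disparity range so both cues share units.
    disp = stereo_disparity(left_gray, right_gray)
    sharp = defocus_cue(left_gray)
    valid = disp > 0
    fused = disp.copy()
    if valid.any():
        lo, hi = disp[valid].min(), disp[valid].max()
        fused[~valid] = lo + sharp[~valid] * (hi - lo)
    return fused

# Example usage (assumed file names):
# left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)
# right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)
# depth_like = fuse_depth(left, right)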
Pages: 182 - 187
Number of pages: 6
Related Papers
50 records in total
  • [21] Multiple human tracking in RGB-depth data: a survey
    Camplani, Massimo
    Paiement, Adeline
    Mirmehdi, Majid
    Damen, Dima
    Hannuna, Sion
    Burghardt, Tilo
    Tao, Lili
    IET COMPUTER VISION, 2017, 11 (04) : 265 - 285
  • [22] Study on stairs detection using RGB-depth images
    Japan Society for Fuzzy Theory and Intelligent Informatics (SOFT) / IEEE
  • [23] Radar and RGB-Depth Sensors for Fall Detection: A Review
    Cippitelli, Enea
    Fioranelli, Francesco
    Gambi, Ennio
    Spinsante, Susanna
    IEEE SENSORS JOURNAL, 2017, 17 (12) : 3585 - 3604
  • [24] Automatic Hand Detection in RGB-Depth Data Sequences
    Konovalov, Vitaliy
    Clapes, Albert
    Escalera, Sergio
    ARTIFICIAL INTELLIGENCE RESEARCH AND DEVELOPMENT: PROCEEDINGS OF THE 16TH INTERNATIONAL CONFERENCE OF THE CATALAN ASSOCIATION FOR ARTIFICIAL INTELLIGENCE, 2013, 256 : 91 - 100
  • [25] Detection and Utilization of Vertical Intersection in Feature-less Environment with RGB-Depth Sensor
    Choi, Hyunga
    Yeon, Suyong
    Doh, Nakju Lett
    2015 12TH INTERNATIONAL CONFERENCE ON UBIQUITOUS ROBOTS AND AMBIENT INTELLIGENCE (URAI), 2015, : 537 - 540
  • [26] Size estimation of sweet onions using consumer-grade RGB-depth sensor
    Wang, Weilin
    Li, Changying
    JOURNAL OF FOOD ENGINEERING, 2014, 142 : 153 - 162
  • [27] Human Activities Recognition with RGB-Depth Camera using HMM
    Dubois, Amandine
    Charpillet, Francois
    2013 35TH ANNUAL INTERNATIONAL CONFERENCE OF THE IEEE ENGINEERING IN MEDICINE AND BIOLOGY SOCIETY (EMBC), 2013, : 4666 - 4669
  • [28] Modeling of mechanical models using an RGB-Depth camera
    Lin, Shuai
    Cheng, Zhiquan
    JOURNAL OF SYSTEM SIMULATION, 2013, 25 (09) : 2044 - 2049
  • [29] SASE: RGB-Depth Database for Human Head Pose Estimation
    Lusi, Iiris
    Escalera, Sergio
    Anbarjafari, Gholamreza
    COMPUTER VISION - ECCV 2016 WORKSHOPS, PT III, 2016, 9915 : 325 - 336
  • [30] Human activity recognition with analysis of angles between skeletal joints using a RGB-depth sensor
    Ince, Omer Faruk
    Ince, Ibrahim Furkan
    Yildirim, Mustafa Eren
    Park, Jang Sik
    Song, Jong Kwan
    Yoon, Byung Woo
    ETRI JOURNAL, 2020, 42 (01) : 78 - 89