Three-dimensional point cloud segmentation using a combination of RANSAC and clustering methods

Cited by: 2
Authors
Singh, Puyam S. [1 ]
Nongsiej, Iainehborlang M. [2 ]
Marboh, Valarie [2 ]
Chutia, Dibyajyoti [1 ]
Saikhom, Victor [1 ]
Aggarwal, S. P. [1 ]
Affiliations
[1] Govt India, North Eastern Space Applicat Ctr, Dept Space, Umiam 793103, India
[2] St Anthonys Coll, Dept Comp Sci, Shillong 793001, India
Source
CURRENT SCIENCE | 2023, Vol. 124, No. 4
Keywords
Clustering; drone images; hierarchical model; three-dimensional point cloud; segmentation;
DOI
10.18520/cs/v124/i4/434-441
CLC Classification
O [Mathematical Sciences and Chemistry]; P [Astronomy and Earth Sciences]; Q [Biological Sciences]; N [General Natural Sciences]
Discipline Codes
07; 0710; 09
Abstract
Performing 3D scene understanding on point clouds derived from drone images is challenging: the data are highly unstructured with no neighbourhood information, highly redundant, which makes processing difficult and time-consuming, and of variable density, which makes them hard to group and segment. For proper scene understanding, these point clouds must be segmented and classified into groups representing similar characteristics. The approaches for segmentation differ based on the distinctiveness of each data product. Although newer machine learning-based approaches work well, they need large amounts of standardized labelled data, which in turn require extensive resources and human intervention to obtain good results. Considering this, we have proposed a hybrid clustering-based hierarchical model for effective segmentation of dense 3D point clouds. We have applied the model to local data containing a mix of man-made structures and natural vegetation over variable topography. The combination of RANSAC, DBSCAN and the Euclidean method of cluster extraction proved useful for precise segmentation and classification of point clouds. The performance of the model has been assessed using Davies-Bouldin index-based intrinsic measures. The hybrid approach is able to segment 91% of the point clouds precisely, compared to the conventional one-step clustering approach.
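The pipeline the abstract describes, planar extraction with RANSAC followed by density-based clustering of the remaining points, can be sketched as follows. This is a minimal illustration using NumPy and scikit-learn; the function names, parameter values, and the plain-plane RANSAC implementation are assumptions for illustration, not the authors' actual code.

```python
import numpy as np
from sklearn.cluster import DBSCAN

def ransac_plane(points, n_iters=200, threshold=0.05, rng=None):
    """Fit a dominant plane with RANSAC; return a boolean inlier mask.

    NOTE: parameter values are illustrative assumptions, not from the paper.
    """
    rng = np.random.default_rng(rng)
    best_inliers = np.zeros(len(points), dtype=bool)
    for _ in range(n_iters):
        # Sample 3 points and derive the plane normal from their cross product.
        sample = points[rng.choice(len(points), 3, replace=False)]
        normal = np.cross(sample[1] - sample[0], sample[2] - sample[0])
        norm = np.linalg.norm(normal)
        if norm < 1e-12:
            continue  # degenerate (collinear) sample, skip
        normal /= norm
        d = -normal @ sample[0]
        # Points within `threshold` of the plane count as inliers.
        inliers = np.abs(points @ normal + d) < threshold
        if inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    return best_inliers

def segment(points, eps=0.3, min_samples=10):
    """Hierarchical segmentation: strip the dominant plane (e.g. ground)
    with RANSAC, then cluster the remaining points with DBSCAN.
    Returns (plane_mask, labels_for_non_plane_points); label -1 is noise."""
    plane_mask = ransac_plane(points)
    rest = points[~plane_mask]
    labels = DBSCAN(eps=eps, min_samples=min_samples).fit_predict(rest)
    return plane_mask, labels
```

For the intrinsic evaluation the abstract mentions, `sklearn.metrics.davies_bouldin_score` can be applied to the clustered (non-noise) points to obtain a Davies-Bouldin index for the segmentation.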
Pages: 434-441
Page count: 8