Active contour model based on local Kullback-Leibler divergence for fast image segmentation

Cited by: 25
Authors
Yang, Chengxin [1 ]
Weng, Guirong [1 ]
Chen, Yiyang [1 ]
Affiliations
[1] Soochow Univ, Sch Mech & Elect Engn, 178 Ganjiang Rd, Suzhou 215021, Jiangsu, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Image segmentation; Kullback-Leibler divergence; Level set method; Inhomogeneous intensity; Robustness; LEVEL SET EVOLUTION; DRIVEN; ENERGY;
DOI
10.1016/j.engappai.2023.106472
CLC classification
TP [Automation Technology, Computer Technology];
Discipline code
0812;
Abstract
Intensity inhomogeneity and noise are the main factors that degrade segmentation results. To overcome these challenges, a new active contour model is designed based on the level set method and the Kullback-Leibler divergence. First, a new regional measure of information divergence, rather than the Euclidean distance, is applied to construct the energy functional. Test results demonstrate that the Kullback-Leibler divergence achieves noticeably better segmentation. Second, a new Heaviside function is proposed that has a steeper slope at the zero crossing than the traditional function; it therefore accelerates the evolution of the level set function and allocates internal and external energy more reasonably. In addition, the activation function is improved so that it fluctuates over a smaller range than the former activation function. Experiments reveal that the local Kullback-Leibler divergence (LKLD) model produces the desired segmentation results on both real-world and medical images; it also exhibits better noise robustness and is insensitive to the position of the initial contour.
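The following is a minimal sketch of how a local Kullback-Leibler divergence data term can drive a level set evolution, assuming local Gaussian region models fitted with a Gaussian window and an arctan-type smoothed Heaviside function. The paper's exact energy functional, its steeper Heaviside function, and its modified activation function are not reproduced here; all function names and parameter values (sigma, dt, eps) are illustrative assumptions, not taken from the paper.

```python
import numpy as np
from scipy.ndimage import gaussian_filter


def heaviside(phi, eps=1.0):
    """Smoothed arctan Heaviside; the paper proposes a steeper variant."""
    return 0.5 * (1.0 + (2.0 / np.pi) * np.arctan(phi / eps))


def dirac(phi, eps=1.0):
    """Smoothed Dirac delta matching the Heaviside above."""
    return (eps / np.pi) / (eps ** 2 + phi ** 2)


def local_gaussian(img, weight, sigma):
    """Locally weighted mean and variance computed with a Gaussian window."""
    w = gaussian_filter(weight, sigma) + 1e-8
    mean = gaussian_filter(img * weight, sigma) / w
    var = gaussian_filter(img ** 2 * weight, sigma) / w - mean ** 2
    return mean, np.maximum(var, 1e-6)


def kl_gauss(m1, v1, m2, v2):
    """Pixel-wise KL divergence KL(N(m1, v1) || N(m2, v2)) between Gaussians."""
    return 0.5 * (np.log(v2 / v1) + (v1 + (m1 - m2) ** 2) / v2 - 1.0)


def segment(img, phi, iters=200, dt=1.0, sigma=3.0, eps=1.0):
    """Evolve the level set phi on img with a local KL-divergence data force."""
    img = img.astype(float)
    # Local statistics of the image itself, fixed during the evolution.
    mu, var = local_gaussian(img, np.ones_like(img), sigma)
    for _ in range(iters):
        H = heaviside(phi, eps)
        m_in, v_in = local_gaussian(img, H, sigma)          # foreground model
        m_out, v_out = local_gaussian(img, 1.0 - H, sigma)  # background model
        kl_in = kl_gauss(mu, var, m_in, v_in)
        kl_out = kl_gauss(mu, var, m_out, v_out)
        # Push phi up where the pixel's local distribution is closer to the
        # foreground model, and down where it is closer to the background model.
        phi = phi + dt * dirac(phi, eps) * (kl_out - kl_in)
        phi = gaussian_filter(phi, 1.0)  # simple level set regularization
    return phi > 0


if __name__ == "__main__":
    # Toy usage: a bright disc on a noisy background, rectangular initial contour.
    y, x = np.mgrid[:128, :128]
    img = (np.hypot(x - 64, y - 64) < 30).astype(float) + 0.2 * np.random.randn(128, 128)
    phi0 = np.where((np.abs(x - 40) < 20) & (np.abs(y - 40) < 20), 1.0, -1.0)
    mask = segment(img, phi0)
    print("foreground pixels:", int(mask.sum()))
```

The sign of the data force simply assigns each pixel to the region whose local Gaussian model it diverges from least, which mirrors the role the divergence plays in the model described above; the final Gaussian smoothing of phi stands in for the regularization terms of the full model.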
Pages: 16
Related papers
50 records in total
  • [41] An Effective Image Restoration Using Kullback-Leibler Divergence Minimization
    Hanif, Muhammad
    Seghouane, Abd-Krim
    2014 IEEE INTERNATIONAL CONFERENCE ON IMAGE PROCESSING (ICIP), 2014, : 4522 - 4526
  • [42] A Kullback-Leibler Divergence Approach for Wavelet-Based Blind Image Deconvolution
    Seghouane, Abd-Krim
    Hanif, Muhammad
    2012 IEEE INTERNATIONAL WORKSHOP ON MACHINE LEARNING FOR SIGNAL PROCESSING (MLSP), 2012
  • [43] Distributions of the Kullback-Leibler divergence with applications
    Belov, Dmitry I.
    Armstrong, Ronald D.
    BRITISH JOURNAL OF MATHEMATICAL & STATISTICAL PSYCHOLOGY, 2011, 64 (02) : 291 - 309
  • [44] Quantile-based cumulative Kullback-Leibler divergence
    Sunoj, S. M.
    Sankaran, P. G.
    Nair, N. Unnikrishnan
    STATISTICS, 2018, 52 (01) : 1 - 17
  • [45] Kullback-Leibler Divergence-Based Visual Servoing
    Li, Xiangfei
    Zhao, Huan
    Ding, Han
    2021 IEEE/ASME INTERNATIONAL CONFERENCE ON ADVANCED INTELLIGENT MECHATRONICS (AIM), 2021, : 720 - 726
  • [46] Robust parameter design based on Kullback-Leibler divergence
    Zhou, XiaoJian
    Lin, Dennis K. J.
    Hu, Xuelong
    Jiang, Ting
    COMPUTERS & INDUSTRIAL ENGINEERING, 2019, 135 : 913 - 921
  • [47] NMF Algorithm Based on Extended Kullback-Leibler Divergence
    Gao, Liuyang
    Tian, Yinghua
    Lv, Pinpin
    Dong, Peng
    PROCEEDINGS OF 2019 IEEE 3RD INFORMATION TECHNOLOGY, NETWORKING, ELECTRONIC AND AUTOMATION CONTROL CONFERENCE (ITNEC 2019), 2019, : 1804 - 1808
  • [48] Distributed Vector Quantization Based on Kullback-Leibler Divergence
    Shen, Pengcheng
    Li, Chunguang
    Luo, Yiliang
    ENTROPY, 2015, 17 (12) : 7875 - 7887
  • [49] Markov random field based on Kullback-Leibler divergence and its applications to geo-spatial image segmentation
    Nishii, R
    6TH WORLD MULTICONFERENCE ON SYSTEMICS, CYBERNETICS AND INFORMATICS, VOL XVII, PROCEEDINGS: INDUSTRIAL SYSTEMS AND ENGINEERING III, 2002, : 399 - 405
  • [50] An Application of Kullback-Leibler Divergence to Active SLAM and Exploration with Particle Filters
    Carlone, Luca
    Du, Jingjing
    Ng, Miguel Kaouk
    Bona, Basilio
    Indri, Marina
    IEEE/RSJ 2010 INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS (IROS 2010), 2010, : 287 - 293