Robust and Efficient Regularized Boosting Using Total Bregman Divergence

Cited: 0
Authors
Liu, Meizhu [1 ]
Vemuri, Baba C. [1 ]
Institutions
[1] Univ Florida, CISE, Gainesville, FL 32611 USA
Keywords
DOI
None available
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Boosting is a well-known machine learning technique used to improve the performance of weak learners, and it has been successfully applied to computer vision, medical image analysis, computational biology, and other fields. A critical step in boosting algorithms is the update of the data sample distribution; however, most existing boosting algorithms use update mechanisms that lead to overfitting and to instabilities during the evolution of the distribution, which in turn result in classification inaccuracies. Regularized boosting has been proposed in the literature as a means to overcome these difficulties. In this paper, we propose a novel total Bregman divergence (tBD) regularized LPBoost, termed tBRLPBoost. tBD is a recently proposed, statistically robust divergence, and we prove that tBRLPBoost requires only a constant number of iterations to learn a strong classifier, making it computationally more efficient than other regularized boosting algorithms in the literature. Moreover, unlike boosting methods that are effective only on a handful of datasets, tBRLPBoost works well on a wide variety of datasets. We present results of testing our algorithm on many public-domain databases, along with comparisons to several other state-of-the-art methods. The numerical results show substantial improvements in efficiency and accuracy over competing methods.
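The abstract's key ingredient, the total Bregman divergence, is the ordinary Bregman divergence normalized by the gradient magnitude of the generator at the second argument, which makes it measure an "orthogonal" rather than "vertical" gap and gives it its robustness properties. A minimal sketch of that definition is below, using the squared Euclidean norm as the convex generator `f`; the function names and the choice of generator are illustrative assumptions, not code from the paper.

```python
import numpy as np

def bregman(x, y, f, grad_f):
    # Ordinary Bregman divergence: d_f(x, y) = f(x) - f(y) - <grad f(y), x - y>
    return f(x) - f(y) - np.dot(grad_f(y), x - y)

def total_bregman(x, y, f, grad_f):
    # Total Bregman divergence: the same gap, normalized by the slope at y,
    # delta_f(x, y) = d_f(x, y) / sqrt(1 + ||grad f(y)||^2)
    g = grad_f(y)
    return bregman(x, y, f, grad_f) / np.sqrt(1.0 + np.dot(g, g))

# Illustrative generator (an assumption for this sketch): f(v) = ||v||^2,
# for which d_f(x, y) reduces to the squared Euclidean distance ||x - y||^2.
f = lambda v: float(np.dot(v, v))
grad_f = lambda v: 2.0 * v

x = np.array([1.0, 2.0])
y = np.array([1.0, 1.0])
print(bregman(x, y, f, grad_f))        # 1.0, i.e. ||x - y||^2
print(total_bregman(x, y, f, grad_f))  # 1/3: shrunk where grad f(y) is large
```

Note how the normalization damps the divergence in regions where the generator is steep; this coordinate-invariant behavior is what the paper exploits when regularizing the LPBoost distribution update.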
Pages: 6
Related Papers
50 items in total
  • [21] Total Variation Inpainting using Split Bregman
    Getreuer, Pascal
    IMAGE PROCESSING ON LINE, 2012, 2 : 147 - 157
  • [22] Total Variation Deconvolution using Split Bregman
    Getreuer, Pascal
    IMAGE PROCESSING ON LINE, 2012, 2 : 158 - 174
  • [23] Towards a median signal detector through the total Bregman divergence and its robustness analysis
    Ono, Yusuke
    Peng, Linyu
    SIGNAL PROCESSING, 2022, 201
  • [24] Robust Hyperspectral Unmixing Using Total Variation Regularized Low-rank Approximation
    Ince, Taner
    2019 9TH INTERNATIONAL CONFERENCE ON RECENT ADVANCES IN SPACE TECHNOLOGIES (RAST), 2019, : 373 - 379
  • [25] A Bregman Divergence Based Level Set Evolution for Efficient Medical Image Segmentation
    Dai, Shuanglu
    Man, Hong
    Zhan, Shu
    2016 23RD INTERNATIONAL CONFERENCE ON PATTERN RECOGNITION (ICPR), 2016, : 1113 - 1118
  • [26] VARIABLE SELECTION FOR BAYESIAN SURVIVAL MODELS USING BREGMAN DIVERGENCE MEASURE
    Shi, Daoyuan
    Kuo, Lynn
    PROBABILITY IN THE ENGINEERING AND INFORMATIONAL SCIENCES, 2020, 34 (03) : 364 - 380
  • [27] Jensen-Bregman LogDet Divergence with Application to Efficient Similarity Search for Covariance Matrices
    Cherian, Anoop
    Sra, Suvrit
    Banerjee, Arindam
    Papanikolopoulos, Nikolaos
    IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2013, 35 (09) : 2161 - 2174
  • [28] Robust and efficient estimation in ordinal response models using the density power divergence
    Pyne, Arijit
    Roy, Subhrajyoty
    Ghosh, Abhik
    Basu, Ayanendranath
    STATISTICS, 2024, 58 (03) : 481 - 520
  • [29] Deblurring by Solving a TVP-Regularized Optimization Problem Using Split Bregman Method
    Xiao, Su
    ADVANCES IN MULTIMEDIA, 2014, 2014
  • [30] Efficient Similarity Search for Covariance Matrices via the Jensen-Bregman LogDet Divergence
    Cherian, Anoop
    Sra, Suvrit
    Banerjee, Arindam
    Papanikolopoulos, Nikolaos
    2011 IEEE INTERNATIONAL CONFERENCE ON COMPUTER VISION (ICCV), 2011, : 2399 - 2406