Robust and Efficient Regularized Boosting Using Total Bregman Divergence

Cited by: 0
Authors:
Liu, Meizhu [1 ]
Vemuri, Baba C. [1 ]
Affiliations:
[1] Univ Florida, CISE, Gainesville, FL 32611 USA
DOI: not available
Chinese Library Classification (CLC): TP18 [Artificial Intelligence Theory]
Discipline codes: 081104; 0812; 0835; 1405
Abstract
Boosting is a well-known machine learning technique for improving the performance of weak learners; it has been applied successfully in computer vision, medical image analysis, computational biology, and other fields. A critical step in boosting algorithms is the update of the data sample distribution. However, most existing boosting algorithms use update mechanisms that lead to overfitting and to instabilities in the evolution of the distribution, which in turn cause classification inaccuracies. Regularized boosting has been proposed in the literature as a means of overcoming these difficulties. In this paper, we propose a novel total Bregman divergence (tBD) regularized LPBoost, termed tBRLPBoost. tBD is a recently proposed, statistically robust divergence, and we prove that tBRLPBoost requires only a constant number of iterations to learn a strong classifier; it is therefore computationally more efficient than other regularized boosting algorithms in the literature. Moreover, unlike boosting methods that are effective on only a handful of datasets, tBRLPBoost performs well on a wide variety of datasets. We present results of testing our algorithm on many public-domain databases, along with comparisons against several other state-of-the-art methods. The numerical results show marked improvements in efficiency and accuracy over the competing methods.
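The abstract does not restate the definition of the total Bregman divergence. For context, in Liu and Vemuri's earlier work the tBD between x and y, for a convex generator f, is the ordinary Bregman divergence divided by sqrt(1 + ||∇f(y)||²), i.e. measured orthogonally to the tangent at y rather than along the vertical axis. A minimal sketch of that definition follows; the function names are illustrative, not taken from the paper.

```python
import numpy as np

def bregman_divergence(f, grad_f, x, y):
    """Ordinary Bregman divergence: d_f(x, y) = f(x) - f(y) - <grad f(y), x - y>."""
    return f(x) - f(y) - np.dot(grad_f(y), x - y)

def total_bregman_divergence(f, grad_f, x, y):
    """Total Bregman divergence: the ordinary divergence divided by
    sqrt(1 + ||grad f(y)||^2), i.e. measured orthogonally to the tangent
    of f at y. This normalization underlies the robustness claimed for tBD."""
    return bregman_divergence(f, grad_f, x, y) / np.sqrt(1.0 + np.dot(grad_f(y), grad_f(y)))

# Example with the squared-norm generator f(v) = ||v||^2, for which the
# ordinary Bregman divergence reduces to the squared Euclidean distance.
f = lambda v: np.dot(v, v)
grad_f = lambda v: 2.0 * v
x, y = np.array([1.0, 2.0]), np.array([1.0, 0.0])
print(bregman_divergence(f, grad_f, x, y))        # 4.0  (= ||x - y||^2)
print(total_bregman_divergence(f, grad_f, x, y))  # 4.0 / sqrt(5) ≈ 1.7889
```

Note that when the gradient at y is small the tBD approaches the ordinary Bregman divergence, while large gradients shrink it, which damps the influence of outlying points.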
Pages: 6