A non-parametric mixture of Gaussian naive Bayes classifiers based on local independent features

Cited by: 0
Authors
Jahromi, Ali Haghpanah [1]
Taheri, Mohammad [1]
Affiliations
[1] Shiraz Univ, Sch Elect & Comp Engn, Shiraz, Iran
Keywords
ensemble naive Bayes; local PCA; multi-modal classification; Gaussian naive Bayes;
DOI
Not available
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline codes
081104; 0812; 0835; 1405;
Abstract
The naive Bayes classifier is one of the most widely used classification techniques in data mining and machine learning. Although naive Bayes learners are efficient, they suffer from the often-violated assumption of conditional independence among the attributes. Many algorithms have been proposed to improve the effectiveness of the naive Bayes classifier by inserting discriminative approaches into its generative structure; generative and discriminative viewpoints are combined, for example, through attribute weighting, instance weighting, or ensemble methods. In this paper, a new ensemble of Gaussian naive Bayes classifiers is proposed, based on a mixture of Gaussian distributions formed on less conditionally dependent features extracted by local PCA. A semi-AdaBoost approach is used for dynamic adaptation of the distributions with respect to misclassified instances. The proposed method has been evaluated and compared with related work on 12 UCI machine learning datasets, and the results show a significant improvement in performance.
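As a rough illustration of the idea described in the abstract (decorrelating features with PCA so the naive independence assumption is less violated, then boosting Gaussian naive Bayes learners on reweighted instances), the following minimal Python sketch uses scikit-learn. It is a simplified assumption, not the authors' algorithm: it applies a single global PCA instead of local PCA, uses a plain AdaBoost-style update rather than the paper's semi-AdaBoost scheme, and the dataset and all parameter choices are placeholders.

import numpy as np
from sklearn.datasets import load_iris
from sklearn.decomposition import PCA
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB

# Placeholder data; the paper evaluates on 12 UCI datasets, not specifically iris.
X, y = load_iris(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

n_rounds = 5                                   # arbitrary number of boosting rounds
n_classes = len(np.unique(y))
w = np.full(len(X_tr), 1.0 / len(X_tr))        # uniform instance weights
learners, alphas = [], []

for _ in range(n_rounds):
    pca = PCA().fit(X_tr)                      # global PCA to decorrelate features
    Z = pca.transform(X_tr)
    nb = GaussianNB().fit(Z, y_tr, sample_weight=w)   # weighted Gaussian naive Bayes
    pred = nb.predict(Z)
    miss = (pred != y_tr)
    err = np.dot(w, miss) / w.sum()
    if err == 0 or err >= 0.5:                 # stop if the learner is perfect or too weak
        learners.append((pca, nb))
        alphas.append(1.0)
        break
    alpha = 0.5 * np.log((1.0 - err) / err)    # AdaBoost-style learner weight
    w = w * np.exp(alpha * miss)               # emphasise misclassified instances
    w = w / w.sum()
    learners.append((pca, nb))
    alphas.append(alpha)

# Weighted vote of the ensemble on the test set
votes = np.zeros((len(X_te), n_classes))
for (pca, nb), alpha in zip(learners, alphas):
    votes[np.arange(len(X_te)), nb.predict(pca.transform(X_te))] += alpha
print("test accuracy:", np.mean(votes.argmax(axis=1) == y_te))

The sketch only conveys the overall structure (decorrelation step, weighted base learners, weighted vote); the paper's contribution lies in forming the mixture on locally extracted PCA features and in the semi-AdaBoost adaptation, neither of which is reproduced here.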
Pages: 209-212
Number of pages: 4