Fault diagnosis of rotating machinery based on kernel density estimation and Kullback-Leibler divergence

Cited by: 50
Authors
Zhang, Fan [1 ]
Liu, Yu [1 ]
Chen, Chujie [1 ]
Li, Yan-Feng [1 ]
Huang, Hong-Zhong [1 ]
Affiliations
[1] Univ Elect Sci & Technol China, Sch Mech Elect & Ind Engn, Chengdu 611731, Sichuan, Peoples R China
Funding
Specialized Research Fund for the Doctoral Program of Higher Education; National Natural Science Foundation of China;
Keywords
Data-driven fault diagnosis; Kernel density estimation; Kullback-Leibler divergence; Ensemble empirical mode decomposition; ARTIFICIAL NEURAL-NETWORK; SUPPORT VECTOR MACHINE; PROBABILITY DENSITY; FEATURE-EXTRACTION; CLASSIFICATION; VIBRATION; DECOMPOSITION; TRANSFORM; ALGORITHM; MODE;
DOI
10.1007/s12206-014-1012-7
CLC number
TH [Machinery and Instrument Industry];
Discipline code
0802;
Abstract
Based on kernel density estimation (KDE) and Kullback-Leibler divergence (KLID), a new data-driven fault diagnosis method is proposed from a statistical perspective. The ensemble empirical mode decomposition (EEMD) together with the Hilbert transform is employed to extract 95 time- and frequency-domain features from raw and processed signals. A distance-based evaluation approach is used to select a subset of fault-sensitive features by removing irrelevant features. By utilizing the KDE, the statistical distribution of each selected feature can be readily estimated without assuming any parametric family of distributions, whereas the KLID quantifies the discrepancy between the probability distributions of a selected feature before and after adding a testing sample. An integrated Kullback-Leibler divergence, which aggregates the KLID of all the selected features, is introduced to discriminate various fault modes and damage levels. The effectiveness of the proposed method is demonstrated via case studies of fault diagnosis for bevel gears and rolling element bearings. The observations from the case studies show that the proposed method outperforms the support vector machine (SVM)-based and neural network-based fault diagnosis methods in terms of classification accuracy. Additionally, the influences of the number of selected features and the training sample size on the classification performance are examined by a set of comparative studies.
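The core of the approach described in the abstract — fitting a KDE to each selected feature and scoring a test sample by how much its addition perturbs that distribution, measured with KL divergence — can be sketched in a few lines. The snippet below is a minimal illustration under stated assumptions, not the authors' implementation: the grid-based numerical integration, the `eps` floor on the densities, the plain sum used as the "integrated" KLID, and the helper names `kl_divergence_kde` and `integrated_klid` are all choices made here for illustration.

```python
import numpy as np
from scipy.stats import gaussian_kde


def kl_divergence_kde(p_samples, q_samples, n_grid=512, eps=1e-12):
    """Estimate KL(P || Q) for one scalar feature: fit Gaussian KDEs to the
    two sample sets and integrate p(x) * log(p(x) / q(x)) on a shared grid."""
    p_samples = np.asarray(p_samples, dtype=float)
    q_samples = np.asarray(q_samples, dtype=float)
    p_kde = gaussian_kde(p_samples)
    q_kde = gaussian_kde(q_samples)
    lo = min(p_samples.min(), q_samples.min())
    hi = max(p_samples.max(), q_samples.max())
    x = np.linspace(lo, hi, n_grid)
    dx = x[1] - x[0]
    p = np.clip(p_kde(x), eps, None)
    q = np.clip(q_kde(x), eps, None)
    p /= p.sum() * dx  # renormalise on the truncated support
    q /= q.sum() * dx
    return float(np.sum(p * np.log(p / q)) * dx)


def integrated_klid(class_features, test_sample):
    """Aggregate per-feature KLID: for each selected feature, compare the
    training distribution of one fault class with the distribution obtained
    after appending the test sample, then sum over features (one plausible
    aggregation rule; the paper's exact weighting may differ)."""
    class_features = np.asarray(class_features, dtype=float)
    total = 0.0
    for j in range(class_features.shape[1]):
        before = class_features[:, j]                  # training samples, feature j
        after = np.append(before, test_sample[j])      # same feature plus test sample
        total += kl_divergence_kde(before, after)
    return total


# One plausible decision rule (an assumption, not quoted from the paper):
# assign the test sample to the fault class whose training set is least
# perturbed, i.e. the class with the smallest integrated KLID.
# predicted = min(classes, key=lambda c: integrated_klid(features[c], x_test))
```

Under this reading, classification amounts to computing the integrated KLID of the test sample against each fault class's training features and selecting the class with the smallest value; the EEMD/Hilbert feature extraction and distance-based feature selection would run upstream of this step.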
Pages: 4441-4454
Number of pages: 14
Related papers
50 in total
  • [31] Modulation Classification Based on Kullback-Leibler Divergence
    Im, Chaewon
    Ahn, Seongjin
    Yoon, Dongweon
    15TH INTERNATIONAL CONFERENCE ON ADVANCED TRENDS IN RADIOELECTRONICS, TELECOMMUNICATIONS AND COMPUTER ENGINEERING (TCSET - 2020), 2020, : 373 - 376
  • [32] Anomaly detection based on probability density function with Kullback-Leibler divergence
    Wang, Wei
    Zhang, Baoju
    Wang, Dan
    Jiang, Yu
    Qin, Shan
    SIGNAL PROCESSING, 2016, 126 : 12 - 17
  • [33] Alternatives to maximum likelihood estimation based on spacings and the Kullback-Leibler divergence
    Ekstrom, Magnus
    JOURNAL OF STATISTICAL PLANNING AND INFERENCE, 2008, 138 (06) : 1778 - 1791
  • [34] Performance study of marginal posterior density estimation via Kullback-Leibler divergence
    Ming-Hui Chen
    Qi-Man Shao
    Test, 1997, 6 : 321 - 350
  • [35] Fault detection in dynamic systems using the Kullback-Leibler divergence
    Xie, Lei
    Zeng, Jiusun
    Kruger, Uwe
    Wang, Xun
    Geluk, Jaap
    CONTROL ENGINEERING PRACTICE, 2015, 43 : 39 - 48
  • [36] ON KULLBACK-LEIBLER LOSS AND DENSITY-ESTIMATION
    HALL, P
    ANNALS OF STATISTICS, 1987, 15 (04) : 1491 - 1519
  • [37] Model Fusion with Kullback-Leibler Divergence
    Claici, Sebastian
    Yurochkin, Mikhail
    Ghosh, Soumya
    Solomon, Justin
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 119, 2020, 119
  • [38] A Satellite Incipient Fault Detection Method Based on Decomposed Kullback-Leibler Divergence
    Zhang, Ge
    Yang, Qiong
    Li, Guotong
    Leng, Jiaxing
    Yan, Mubiao
    ENTROPY, 2021, 23 (09)
  • [39] Kullback-Leibler Divergence Metric Learning
    Ji, Shuyi
    Zhang, Zizhao
    Ying, Shihui
    Wang, Liejun
    Zhao, Xibin
    Gao, Yue
    IEEE TRANSACTIONS ON CYBERNETICS, 2022, 52 (04) : 2047 - 2058
  • [40] Use of Kullback-Leibler divergence for forgetting
    Karny, Miroslav
    Andrysek, Josef
    INTERNATIONAL JOURNAL OF ADAPTIVE CONTROL AND SIGNAL PROCESSING, 2009, 23 (10) : 961 - 975