Fault diagnosis of rotating machinery based on kernel density estimation and Kullback-Leibler divergence

Cited by: 50
Authors
Zhang, Fan [1 ]
Liu, Yu [1 ]
Chen, Chujie [1 ]
Li, Yan-Feng [1 ]
Huang, Hong-Zhong [1 ]
Affiliations
[1] Univ Elect Sci & Technol China, Sch Mech Elect & Ind Engn, Chengdu 611731, Sichuan, Peoples R China
Funding
Specialized Research Fund for the Doctoral Program of Higher Education; National Natural Science Foundation of China;
Keywords
Data-driven fault diagnosis; Kernel density estimation; Kullback-Leibler divergence; Ensemble empirical mode decomposition; ARTIFICIAL NEURAL-NETWORK; SUPPORT VECTOR MACHINE; PROBABILITY DENSITY; FEATURE-EXTRACTION; CLASSIFICATION; VIBRATION; DECOMPOSITION; TRANSFORM; ALGORITHM; MODE;
DOI
10.1007/s12206-014-1012-7
Chinese Library Classification (CLC)
TH [Machinery and Instrument Industries];
Discipline code
0802 ;
Abstract
Based on kernel density estimation (KDE) and Kullback-Leibler divergence (KLID), a new data-driven fault diagnosis method is proposed from a statistical perspective. Ensemble empirical mode decomposition (EEMD), together with the Hilbert transform, is employed to extract 95 time- and frequency-domain features from raw and processed signals. A distance-based evaluation approach is used to select a subset of fault-sensitive features by removing the irrelevant ones. Using KDE, the statistical distribution of each selected feature can be readily estimated without assuming any parametric family of distributions, while the KLID quantifies the discrepancy between the probability distributions of a selected feature before and after adding a testing sample. An integrated Kullback-Leibler divergence, which aggregates the KLID over all the selected features, is introduced to discriminate among fault modes and damage levels. The effectiveness of the proposed method is demonstrated via case studies of fault diagnosis for bevel gears and rolling element bearings. The observations from the case studies show that the proposed method outperforms support vector machine (SVM)-based and neural network-based fault diagnosis methods in terms of classification accuracy. Additionally, the influences of the number of selected features and the training sample size on classification performance are examined through a set of comparative studies.
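The core statistic described in the abstract, comparing KDE-estimated feature distributions with the KL divergence, can be sketched as follows. This is a minimal illustration of the general technique, not the authors' implementation: it assumes `scipy.stats.gaussian_kde` for the density estimates, and the sample arrays (`baseline`, `shifted`) are hypothetical stand-ins for a single selected feature before and after adding a testing sample.

```python
import numpy as np
from scipy.stats import gaussian_kde

def kl_divergence_kde(p_samples, q_samples, grid_size=512):
    """Approximate KL(P || Q) between two 1-D sample sets.

    Each distribution is estimated nonparametrically with a Gaussian
    KDE; the divergence integral is then approximated numerically on
    a shared grid spanning both sample ranges.
    """
    p_kde = gaussian_kde(p_samples)
    q_kde = gaussian_kde(q_samples)
    lo = min(p_samples.min(), q_samples.min())
    hi = max(p_samples.max(), q_samples.max())
    x = np.linspace(lo, hi, grid_size)
    dx = x[1] - x[0]
    eps = 1e-12  # guard against log(0) and division by zero
    p = np.clip(p_kde(x), eps, None)
    q = np.clip(q_kde(x), eps, None)
    # Riemann-sum approximation of the KL integral
    return float(np.sum(p * np.log(p / q)) * dx)

rng = np.random.default_rng(0)
baseline = rng.normal(0.0, 1.0, 200)  # feature values for a known condition
shifted = rng.normal(1.5, 1.0, 200)   # feature values after adding a test sample
print(kl_divergence_kde(baseline, shifted))  # larger value = larger discrepancy
```

An "integrated" divergence in the spirit of the abstract would simply sum `kl_divergence_kde` over all selected features, and the candidate fault mode yielding the smallest aggregate divergence would be taken as the diagnosis.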
Pages: 4441 - 4454 (14 pages)
Related Papers
50 results
  • [1] Fault diagnosis of rotating machinery based on kernel density estimation and Kullback-Leibler divergence
    Zhang, Fan
    Liu, Yu
    Chen, Chujie
    Li, Yan-Feng
    Huang, Hong-Zhong
    JOURNAL OF MECHANICAL SCIENCE AND TECHNOLOGY, 2014, 28 : 4441 - 4454
  • [2] Electric Motor Fault Detection and Diagnosis by Kernel Density Estimation and Kullback-Leibler Divergence Based on Stator Current Measurements
    Giantomassi, Andrea
    Ferracuti, Francesco
    Iarlori, Sabrina
    Ippoliti, Gianluca
    Longhi, Sauro
    IEEE TRANSACTIONS ON INDUSTRIAL ELECTRONICS, 2015, 62 (03) : 1770 - 1780
  • [3] Electric motor defects diagnosis based on kernel density estimation and Kullback-Leibler divergence in quality control scenario
    Ferracuti, Francesco
    Giantomassi, Andrea
    Iarlori, Sabrina
    Ippoliti, Gianluca
    Longhi, Sauro
    ENGINEERING APPLICATIONS OF ARTIFICIAL INTELLIGENCE, 2015, 44 : 25 - 32
  • [4] Nonparametric Estimation of Kullback-Leibler Divergence
    Zhang, Zhiyi
    Grabchak, Michael
    NEURAL COMPUTATION, 2014, 26 (11) : 2570 - 2593
  • [5] Statistical Estimation of the Kullback-Leibler Divergence
    Bulinski, Alexander
    Dimitrov, Denis
    MATHEMATICS, 2021, 9 (05) : 1 - 36
  • [6] PARAMETER ESTIMATION BASED ON CUMULATIVE KULLBACK-LEIBLER DIVERGENCE
    Mehrali, Yaser
    Asadi, Majid
    REVSTAT-STATISTICAL JOURNAL, 2021, 19 (01) : 111 - 130
  • [7] Fault Diagnosis in Industrial Processes by Maximizing Pairwise Kullback-Leibler Divergence
    Lu, Qiugang
    Jiang, Benben
    Harinath, Eranda
    IEEE TRANSACTIONS ON CONTROL SYSTEMS TECHNOLOGY, 2021, 29 (02) : 780 - 785
  • [8] Kullback-Leibler Divergence Estimation of Continuous Distributions
    Perez-Cruz, Fernando
    2008 IEEE INTERNATIONAL SYMPOSIUM ON INFORMATION THEORY PROCEEDINGS, VOLS 1-6, 2008, : 1666 - 1670
  • [9] Minimization of the Kullback-Leibler Divergence for Nonlinear Estimation
    Darling, Jacob E.
    DeMars, Kyle J.
    JOURNAL OF GUIDANCE CONTROL AND DYNAMICS, 2017, 40 (07) : 1739 - 1748
  • [10] MINIMIZATION OF THE KULLBACK-LEIBLER DIVERGENCE FOR NONLINEAR ESTIMATION
    Darling, Jacob E.
    DeMars, Kyle J.
    ASTRODYNAMICS 2015, 2016, 156 : 213 - 232