Fault diagnosis of rotating machinery based on kernel density estimation and Kullback-Leibler divergence

Cited by: 50
Authors
Zhang, Fan [1 ]
Liu, Yu [1 ]
Chen, Chujie [1 ]
Li, Yan-Feng [1 ]
Huang, Hong-Zhong [1 ]
Affiliation
[1] Univ Elect Sci & Technol China, Sch Mech Elect & Ind Engn, Chengdu 611731, Sichuan, Peoples R China
Funding
Specialized Research Fund for the Doctoral Program of Higher Education; National Natural Science Foundation of China;
Keywords
Data-driven fault diagnosis; Kernel density estimation; Kullback-Leibler divergence; Ensemble empirical mode decomposition; ARTIFICIAL NEURAL-NETWORK; SUPPORT VECTOR MACHINE; PROBABILITY DENSITY; FEATURE-EXTRACTION; CLASSIFICATION; VIBRATION; DECOMPOSITION; TRANSFORM; ALGORITHM; MODE;
DOI
10.1007/s12206-014-1012-7
Chinese Library Classification
TH [Machinery and Instrument Industry];
Subject Classification Code
0802;
Abstract
Based on kernel density estimation (KDE) and Kullback-Leibler divergence (KLID), a new data-driven fault diagnosis method is proposed from a statistical perspective. The ensemble empirical mode decomposition (EEMD) together with the Hilbert transform is employed to extract 95 time- and frequency-domain features from raw and processed signals. The distance-based evaluation approach is used to select a subset of fault-sensitive features by removing irrelevant features. By utilizing the KDE, the statistical distribution of selected features can be readily estimated without assuming any parametric family of distributions, whereas the KLID is able to quantify the discrepancy between two probability distributions of a selected feature before and after adding a testing sample. An integrated Kullback-Leibler divergence, which aggregates the KLID of all the selected features, is introduced to discriminate various fault modes/damage levels. The effectiveness of the proposed method is demonstrated via case studies of fault diagnosis for bevel gears and rolling element bearings. The observations from the case studies show that the proposed method outperforms the support vector machine (SVM)-based and neural network-based fault diagnosis methods in terms of classification accuracy. Additionally, the influences of the number of selected features and the training sample size on the classification performance are examined by a set of comparative studies.
Pages: 4441-4454
Number of pages: 14
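
Below is a minimal Python sketch of the core idea described in the abstract: estimating the distribution of each selected feature with a Gaussian KDE, measuring how much that distribution shifts (in the Kullback-Leibler sense) when a test sample is appended, and summing the per-feature divergences into an integrated score. The function names, the grid-based numerical integration, the plain summation used for aggregation, and the nearest-class decision rule in the final comment are illustrative assumptions, not the authors' implementation.

import numpy as np
from scipy.stats import gaussian_kde

def kl_divergence(p, q, grid):
    # Numerically integrate KL(p || q) over a shared evaluation grid.
    eps = 1e-12                      # guard against log(0) and division by zero
    p, q = p + eps, q + eps
    return np.trapz(p * np.log(p / q), grid)

def kld_of_feature(train_values, test_value, n_grid=512):
    # KL divergence of one feature's KDE before vs. after appending the test sample.
    augmented = np.append(train_values, test_value)
    span = 3.0 * train_values.std()
    lo = min(train_values.min(), test_value) - span
    hi = max(train_values.max(), test_value) + span
    grid = np.linspace(lo, hi, n_grid)
    p = gaussian_kde(train_values)(grid)   # density estimated from training data only
    q = gaussian_kde(augmented)(grid)      # density after the test sample is added
    return kl_divergence(p, q, grid)

def integrated_kld(train_features, test_sample):
    # Aggregate per-feature divergences; a plain sum is one possible choice.
    return sum(kld_of_feature(train_features[:, j], test_sample[j])
               for j in range(train_features.shape[1]))

# Illustrative decision rule (assumed): assign the test sample to the fault
# class whose training feature set changes least when the sample is added.
# predicted = min(class_data, key=lambda c: integrated_kld(class_data[c], x_test))
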
Related Papers
50 records in total
  • [21] BOUNDS FOR KULLBACK-LEIBLER DIVERGENCE
    Popescu, Pantelimon G.
    Dragomir, Sever S.
    Slusanschi, Emil I.
    Stanasila, Octavian N.
    ELECTRONIC JOURNAL OF DIFFERENTIAL EQUATIONS, 2016
  • [22] Incipient Fault Online Estimation Based on Kullback-Leibler Divergence and Fast Moving Window PCA
    Tao, Songbing
    Chai, Yi
    Ngo Quang Vi
    IECON 2017 - 43RD ANNUAL CONFERENCE OF THE IEEE INDUSTRIAL ELECTRONICS SOCIETY, 2017, : 8065 - 8069
  • [23] ONLINE INCIPIENT FAULT DIAGNOSIS BASED ON KULLBACK-LEIBLER DIVERGENCE AND RECURSIVE PRINCIPLE COMPONENT ANALYSIS
    Chai, Yi
    Tao, Songbing
    Mao, Wanbiao
    Zhang, Ke
    Zhu, Zhiqin
    CANADIAN JOURNAL OF CHEMICAL ENGINEERING, 2018, 96 (02): 426 - 433
  • [24] Performance study of marginal posterior density estimation via Kullback-Leibler divergence
    Chen, MH
    Shao, QM
    TEST, 1997, 6 (02) : 321 - 350
  • [25] Fault-tolerant relative navigation based on Kullback-Leibler divergence
    Xiong, Jun
    Cheong, Joon Wayn
    Xiong, Zhi
    Dempster, Andrew G.
    Tian, Shiwei
    Wang, Rong
    Liu, Jianye
    INTERNATIONAL JOURNAL OF ADVANCED ROBOTIC SYSTEMS, 2020, 17 (06)
  • [26] Kullback-Leibler divergence based wind turbine fault feature extraction
    Wu, Yueqi
    Ma, Xiandong
    2018 24TH IEEE INTERNATIONAL CONFERENCE ON AUTOMATION AND COMPUTING (ICAC' 18), 2018, : 483 - 488
  • [27] Induction Motor Fault Detection and Diagnosis using KDE and Kullback-Leibler Divergence
    Ferracuti, Francesco
    Giantomassi, Andrea
    Iarlori, Sabrina
    Ippoliti, Gianluca
    Longhi, Sauro
    39TH ANNUAL CONFERENCE OF THE IEEE INDUSTRIAL ELECTRONICS SOCIETY (IECON 2013), 2013, : 2923 - 2928
  • [28] On the Interventional Kullback-Leibler Divergence
    Wildberger, Jonas
    Guo, Siyuan
    Bhattacharyya, Arnab
    Schoelkopf, Bernhard
    CONFERENCE ON CAUSAL LEARNING AND REASONING, VOL 213, 2023, 213 : 328 - 349
  • [29] Kullback-Leibler Divergence Revisited
    Raiber, Fiana
    Kurland, Oren
    ICTIR'17: PROCEEDINGS OF THE 2017 ACM SIGIR INTERNATIONAL CONFERENCE ON THEORY OF INFORMATION RETRIEVAL, 2017, : 117 - 124
  • [30] Sensor and Actuator Fault Diagnosis for a Multi-Robot System Based on the Kullback-Leibler Divergence
    Abci, Boussad
    El Najjar, Maan El Badaoui
    Cocquempot, Vincent
    2019 4TH CONFERENCE ON CONTROL AND FAULT TOLERANT SYSTEMS (SYSTOL), 2019, : 68 - 73