Fault diagnosis of rotating machinery based on kernel density estimation and Kullback-Leibler divergence

Cited by: 50
Authors
Zhang, Fan [1 ]
Liu, Yu [1 ]
Chen, Chujie [1 ]
Li, Yan-Feng [1 ]
Huang, Hong-Zhong [1 ]
Affiliations
[1] Univ Elect Sci & Technol China, Sch Mech Elect & Ind Engn, Chengdu 611731, Sichuan, Peoples R China
Funding
Specialized Research Fund for the Doctoral Program of Higher Education of China; National Natural Science Foundation of China
Keywords
Data-driven fault diagnosis; Kernel density estimation; Kullback-Leibler divergence; Ensemble empirical mode decomposition; ARTIFICIAL NEURAL-NETWORK; SUPPORT VECTOR MACHINE; PROBABILITY DENSITY; FEATURE-EXTRACTION; CLASSIFICATION; VIBRATION; DECOMPOSITION; TRANSFORM; ALGORITHM; MODE;
DOI
10.1007/s12206-014-1012-7
Chinese Library Classification
TH [Machinery and Instrument Industry]
Discipline code
0802
Abstract
Based on kernel density estimation (KDE) and Kullback-Leibler divergence (KLID), a new data-driven fault diagnosis method is proposed from a statistical perspective. Ensemble empirical mode decomposition (EEMD), together with the Hilbert transform, is employed to extract 95 time- and frequency-domain features from the raw and processed signals. A distance-based evaluation approach is used to select a subset of fault-sensitive features by removing the irrelevant ones. Using KDE, the statistical distribution of each selected feature can be readily estimated without assuming any parametric family of distributions, while the KLID quantifies the discrepancy between the two probability distributions of a selected feature before and after adding a testing sample. An integrated Kullback-Leibler divergence, which aggregates the KLID over all the selected features, is introduced to discriminate among fault modes and damage levels. The effectiveness of the proposed method is demonstrated via case studies of fault diagnosis for bevel gears and rolling element bearings. The observations from the case studies show that the proposed method outperforms support vector machine (SVM)-based and neural network-based fault diagnosis methods in terms of classification accuracy. Additionally, the influences of the number of selected features and of the training sample size on classification performance are examined through a set of comparative studies.
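The KDE-plus-KLID step described in the abstract can be sketched in plain Python. This is an illustrative reconstruction, not the authors' implementation: the Gaussian kernel, the fixed bandwidth, and the numerical-integration grid are all assumptions made here for clarity.

```python
import math

def gaussian_kde(samples, bandwidth):
    """Build a 1-D kernel density estimate from samples using a Gaussian kernel.

    Returns a function p(x) estimating the probability density at x,
    with no parametric family assumed for the underlying distribution.
    """
    n = len(samples)
    norm = n * bandwidth * math.sqrt(2.0 * math.pi)

    def pdf(x):
        return sum(math.exp(-0.5 * ((x - s) / bandwidth) ** 2)
                   for s in samples) / norm

    return pdf

def kl_divergence(p, q, grid):
    """Approximate KL(p || q) = integral p(x) log(p(x)/q(x)) dx on a uniform grid."""
    dx = grid[1] - grid[0]
    eps = 1e-12  # guard against log(0) where a density vanishes numerically
    total = 0.0
    for x in grid:
        px, qx = p(x) + eps, q(x) + eps
        total += px * math.log(px / qx) * dx
    return total
```

In the paper's scheme, one such divergence is computed per selected feature (comparing the feature's density before and after the testing sample is added), and the per-feature KLID values are then aggregated into the integrated divergence used to assign the fault class.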
Pages: 4441-4454
Page count: 14
Related papers
(50 in total)
  • [41] Kullback-Leibler Divergence Based Kernel SOM for Visualization of Damage Process on Fuel Cells
    Fukui, Ken-ichi
    Sato, Kazuhisa
    Mizusaki, Junichiro
    Numao, Masayuki
    22ND INTERNATIONAL CONFERENCE ON TOOLS WITH ARTIFICIAL INTELLIGENCE (ICTAI 2010), PROCEEDINGS, VOL 1, 2010,
  • [42] Kullback-Leibler Divergence-Based Improved Particle Filter
    Mansouri, Majdi
    Nounou, Hazem
    Nounou, Mohamed
    2014 11TH INTERNATIONAL MULTI-CONFERENCE ON SYSTEMS, SIGNALS & DEVICES (SSD), 2014,
  • [43] ON INFORMATION GAIN, KULLBACK-LEIBLER DIVERGENCE, ENTROPY PRODUCTION AND THE INVOLUTION KERNEL
    Lopes, Artur O.
    Mengue, Jairo K.
    DISCRETE AND CONTINUOUS DYNAMICAL SYSTEMS, 2022, 42 (07) : 3593 - 3627
  • [44] Source Resolvability with Kullback-Leibler Divergence
    Nomura, Ryo
    2018 IEEE INTERNATIONAL SYMPOSIUM ON INFORMATION THEORY (ISIT), 2018, : 2042 - 2046
  • [45] Kullback-Leibler divergence for evaluating bioequivalence
    Dragalin, V
    Fedorov, V
    Patterson, S
    Jones, B
    STATISTICS IN MEDICINE, 2003, 22 (06) : 913 - 930
  • [46] Kullback-Leibler divergence: A quantile approach
    Sankaran, P. G.
    Sunoj, S. M.
    Nair, N. Unnikrishnan
    STATISTICS & PROBABILITY LETTERS, 2016, 111 : 72 - 79
  • [47] Estimation of discrepancy of color qualia using Kullback-Leibler divergence
    Yamada, Miku
    Matsumoto, Miu
    Arakaki, Mina
    Hebishima, Hana
    Inage, Shinichi
    BIOSYSTEMS, 2023, 232
  • [48] The Kullback-Leibler divergence and nonnegative matrices
    Boche, Holger
    Stanczak, Slawomir
    IEEE TRANSACTIONS ON INFORMATION THEORY, 2006, 52 (12) : 5539 - 5545
  • [49] A decision cognizant Kullback-Leibler divergence
    Ponti, Moacir
    Kittler, Josef
    Riva, Mateus
    de Campos, Teofilo
    Zor, Cemre
    PATTERN RECOGNITION, 2017, 61 : 470 - 478
  • [50] AN INVOLUTION INEQUALITY FOR THE KULLBACK-LEIBLER DIVERGENCE
    Pinelis, Iosif
    MATHEMATICAL INEQUALITIES & APPLICATIONS, 2017, 20 (01): : 233 - 235