Characterizing the Functional Density Power Divergence Class

Cited by: 1
Authors
Ray, Souvik [1]
Pal, Subrata [2]
Kar, Sumit Kumar [3]
Basu, Ayanendranath [4]
Affiliations
[1] Stanford Univ, Dept Stat, Stanford, CA 94305 USA
[2] Iowa State Univ, Dept Stat, Ames, IA 50011 USA
[3] Univ North Carolina Chapel Hill, Dept Stat & Operat Res, Chapel Hill, NC 27599 USA
[4] Indian Stat Inst, Interdisciplinary Stat Res Unit, Kolkata 700108, W Bengal, India
Keywords
Density power divergence; efficiency; logarithmic density power divergence; robust statistical inference; ROBUST; ENTROPY
DOI
10.1109/TIT.2022.3210436
Chinese Library Classification
TP [Automation Technology, Computer Technology]
Discipline Code
0812
Abstract
Divergence measures have a long association with statistical inference, machine learning, and information theory. The density power divergence and related measures have produced many useful (and popular) statistical procedures, which provide a good balance between model efficiency on the one hand and outlier stability or robustness on the other. The logarithmic density power divergence, a particular logarithmic transform of the density power divergence, has also been very successful in producing efficient and stable inference procedures; in addition, it has led to significant applications in information theory. The success of the minimum divergence procedures based on the density power divergence and the logarithmic density power divergence (which also go by the names beta-divergence and gamma-divergence, respectively) makes it imperative and meaningful to look for other, similar divergences that may be obtained as transforms of the density power divergence in the same spirit. With this motivation, we search for such transforms of the density power divergence, referred to herein as the functional density power divergence class. The present article characterizes this functional density power divergence class, and thus identifies the divergence measures available within this construct that may be explored further for possible applications in statistical inference, machine learning, and information theory.
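For readers unfamiliar with the two divergences named in the abstract, a minimal sketch of their commonly used forms follows (assuming standard conventions: g is the data-generating density, f the model density, and alpha > 0 the tuning parameter; the scaling of the logarithmic version varies across the literature).

% Density power divergence (beta-divergence) between the true density g
% and the model density f, with tuning parameter alpha > 0:
\[
  d_{\alpha}(g, f) = \int \left\{ f^{1+\alpha}(x)
      - \left(1 + \frac{1}{\alpha}\right) g(x)\, f^{\alpha}(x)
      + \frac{1}{\alpha}\, g^{1+\alpha}(x) \right\} dx .
\]

% Logarithmic density power divergence (gamma-divergence, up to a positive
% scaling constant): the same three coefficients, with each integral
% replaced by its logarithm:
\[
  d^{\log}_{\alpha}(g, f) = \log \int f^{1+\alpha}(x)\, dx
      - \left(1 + \frac{1}{\alpha}\right) \log \int g(x)\, f^{\alpha}(x)\, dx
      + \frac{1}{\alpha} \log \int g^{1+\alpha}(x)\, dx .
\]

Both expressions are nonnegative and vanish only when f = g; transforms of the density power divergence that preserve this property, in the spirit indicated in the abstract, are what the functional density power divergence class is meant to capture.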
Pages: 1141-1146
Number of pages: 6
Related Papers
50 records in total
  • [31] Model Selection in a Composite Likelihood Framework Based on Density Power Divergence
    Castilla, Elena
    Martin, Nirian
    Pardo, Leandro
    Zografos, Konstantinos
    ENTROPY, 2020, 22 (03)
  • [32] Robust Active Learning for Linear Regression via Density Power Divergence
    Sogawa, Yasuhiro
    Ueno, Tsuyoshi
    Kawahara, Yoshinobu
    Washio, Takashi
    NEURAL INFORMATION PROCESSING, ICONIP 2012, PT III, 2012, 7665 : 594 - 602
  • [34] Consistency of minimizing a penalized density power divergence estimator for mixing distribution
    Lee, Taewook
    Lee, Sangyeol
    STATISTICAL PAPERS, 2009, 50 (01) : 67 - 80
  • [35] Composite Likelihood Methods Based on Minimum Density Power Divergence Estimator
    Castilla, Elena
    Martin, Nirian
    Pardo, Leandro
    Zografos, Konstantinos
    ENTROPY, 2018, 20 (01)
  • [36] Robust Regression with Density Power Divergence: Theory, Comparisons, and Data Analysis
    Riani, Marco
    Atkinson, Anthony C.
    Corbellini, Aldo
    Perrotta, Domenico
    ENTROPY, 2020, 22 (04)
  • [37] Robust estimation in generalized linear models: the density power divergence approach
    Ghosh, Abhik
    Basu, Ayanendranath
    TEST, 2016, 25 (02) : 269 - 290
  • [38] Robust empirical Bayes small area estimation with density power divergence
    Sugasawa, S.
    BIOMETRIKA, 2020, 107 (02) : 467 - 480
  • [39] Density functional reactivity theory characterizing the reactivity of frustrated Lewis pairs
    Wu, Dongling
    Liu, Anjie
    Jia, Dianzeng
    COMPUTATIONAL AND THEORETICAL CHEMISTRY, 2018, 1131 : 33 - 39
  • [40] Method of characterizing high energy density capacitors for power conditioning systems
    McDonald, D.J.
    Dollinger, R.
    Sarjeant, W.J.
    IEEE Conference Record of Power Modulator Symposium, 1988: 345 - 348