Why Cohen's Kappa should be avoided as performance measure in classification

Cited: 177
Authors
Delgado, Rosario [1 ]
Tibau, Xavier-Andoni [2 ]
Affiliations
[1] Univ Autonoma Barcelona, Dept Math, Campus UAB, Cerdanyola Del Valles, Spain
[2] Univ Autonoma Barcelona, Adv Stochast Modelling Res Grp, Campus UAB, Cerdanyola Del Valles, Spain
Source
PLOS ONE, 2019, Vol. 14, Issue 9
Keywords
INTEROBSERVER AGREEMENT; RELIABILITY; ASSOCIATION; MODELS; COEFFICIENT; PREVALENCE
DOI
10.1371/journal.pone.0222916
Chinese Library Classification
O [Mathematical Sciences and Chemistry]; P [Astronomy and Earth Sciences]; Q [Biological Sciences]; N [General Natural Sciences]
Subject Classification Codes
07; 0710; 09
Abstract
We show that Cohen's Kappa and the Matthews Correlation Coefficient (MCC), both widely used measures of performance in multi-class classification, are correlated in most situations, although they can differ in others. Indeed, while the two coincide in the symmetric case, we consider several unbalanced situations in which Kappa exhibits undesirable behaviour, i.e. a worse classifier obtains a higher Kappa score, diverging qualitatively from MCC. The debate about this incoherence in Kappa's behaviour has revolved around the convenience, or not, of using a relative metric, which makes the interpretation of its values difficult. We extend these concerns by showing that its pitfalls go even further. Through experimentation, we present a novel approach to this topic: a comprehensive study that identifies a scenario in which the contradictory behaviour of MCC and Kappa emerges. Specifically, we find that as the entropy of the off-diagonal elements of a classifier's confusion matrix decreases to zero, the discrepancy between Kappa and MCC grows, pointing to anomalous behaviour of the former. We believe this finding rules out Kappa, in general, as a performance measure for comparing classifiers.
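Both metrics can be computed directly from a confusion matrix. The following minimal Python sketch (assuming NumPy; the function names and the two example matrices are hypothetical illustrations, not taken from the paper's experiments) shows how the two measures coincide when the errors are spread symmetrically and diverge once the same error mass is concentrated in a single off-diagonal cell, i.e. when the off-diagonal entropy drops to zero.

```python
import numpy as np

def kappa_from_cm(C):
    """Cohen's Kappa from a confusion matrix C (rows: true, cols: predicted)."""
    C = np.asarray(C, dtype=float)
    n = C.sum()
    p_o = np.trace(C) / n                            # observed agreement
    p_e = (C.sum(axis=1) @ C.sum(axis=0)) / n**2     # chance agreement
    return (p_o - p_e) / (1.0 - p_e)                 # assumes p_e < 1

def mcc_from_cm(C):
    """Multi-class MCC (Gorodkin's R_K) from a confusion matrix."""
    C = np.asarray(C, dtype=float)
    n = C.sum()
    t, p = C.sum(axis=1), C.sum(axis=0)              # true / predicted marginals
    num = n * np.trace(C) - t @ p
    den = np.sqrt((n**2 - t @ t) * (n**2 - p @ p))   # assumes non-degenerate marginals
    return num / den

# Hypothetical 3-class matrices: identical diagonals, same total error mass (30).
C_spread = [[40, 5, 5],                              # errors spread evenly:
            [5, 40, 5],                              # Kappa = MCC = 0.700
            [5, 5, 40]]
C_concentrated = [[40, 30, 0],                       # all errors in one cell:
                  [0, 40, 0],                        # Kappa ~ 0.706, MCC = 0.750
                  [0, 0, 40]]

for name, C in [("spread", C_spread), ("concentrated", C_concentrated)]:
    print(f"{name:>12}: Kappa = {kappa_from_cm(C):.3f}  MCC = {mcc_from_cm(C):.3f}")
```

On these toy matrices the two measures agree exactly in the symmetric case and drift apart once the errors concentrate, which is the qualitative regime the abstract describes; the paper's actual experiments should be consulted for the full analysis.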
Pages: 26
Related Papers
50 items in total
  • [1] Cohen's Kappa Coefficient as a Performance Measure for Feature Selection
    Vieira, Susana M.
    Kaymak, Uzay
    Sousa, Joao M. C.
    2010 IEEE INTERNATIONAL CONFERENCE ON FUZZY SYSTEMS (FUZZ-IEEE 2010), 2010
  • [2] Interval estimation for Cohen's kappa as a measure of agreement
    Blackman, N. J. M.
    Koval, J. J.
    STATISTICS IN MEDICINE, 2000, 19 (05) : 723 - 741
  • [3] Why representativeness should be avoided
    Rothman, Kenneth J.
    Gallacher, John E. J.
    Hatch, Elizabeth E.
    INTERNATIONAL JOURNAL OF EPIDEMIOLOGY, 2013, 42 (04) : 1012 - 1014
  • [4] Cohen's kappa: a measure of conformity between observers
    Lydersen, Stian
    TIDSSKRIFT FOR DEN NORSKE LAEGEFORENING, 2018, 138 (05) : 467 - 467
  • [5] Cohen's Kappa Coefficient as a Measure to Assess Classification Improvement following the Addition of a New Marker to a Regression Model
    Wieckowska, Barbara
    Kubiak, Katarzyna B.
    Jozwiak, Paulina
    Moryson, Waclaw
    Stawinska-Witoszynska, Barbara
    INTERNATIONAL JOURNAL OF ENVIRONMENTAL RESEARCH AND PUBLIC HEALTH, 2022, 19 (16)
  • [6] Comparison of classification accuracy using Cohen's Weighted Kappa
    Ben-David, Arie
    EXPERT SYSTEMS WITH APPLICATIONS, 2008, 34 (02) : 825 - 832
  • [7] A New Measure of Agreement to Resolve the Two Paradoxes of Cohen's Kappa
    Park, Mi-Hee
    Park, Yong-Gyu
    KOREAN JOURNAL OF APPLIED STATISTICS, 2007, 20 (01) : 117 - 132
  • [8] Why the concept "lifestyle diseases" should be avoided
    Vallgarda, Signild
    SCANDINAVIAN JOURNAL OF PUBLIC HEALTH, 2011, 39 (07) : 773 - 775
  • [9] A Simplified Cohen's Kappa for Use in Binary Classification Data Annotation Tasks
    Wang, Juan
    Yang, Yongyi
    Xia, Bin
    IEEE ACCESS, 2019, 7 : 164386 - 164397
  • [10] New Interpretations of Cohen's Kappa
    Warrens, Matthijs J.
    JOURNAL OF MATHEMATICS, 2014, 2014