A Study on Comparison of Generalized Kappa Statistics in Agreement Analysis

Cited by: 2
Authors
Kim, Min Seon [1]
Song, Ki Jun [1]
Nam, Chung Mo [1]
Jung, Inkyung [1]
Affiliations
[1] Yonsei Univ, Dept Biostat, Coll Med, 50 Yonsei Ro, Seoul 120752, South Korea
Keywords
Agreement; generalized kappa; marginal probability distribution
DOI
10.5351/KJAS.2012.25.5.719
Chinese Library Classification
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics]
Subject Classification Codes
020208; 070103; 0714
Abstract
Agreement analysis is conducted to assess the reliability of ratings made repeatedly on the same subjects by one or more raters. The kappa statistic is commonly used when the rating scale is categorical. The simple and weighted kappa statistics measure the degree of agreement between two raters, while generalized kappa statistics measure the degree of agreement among more than two raters. In this paper, we compare the performance of four generalized kappa statistics proposed by Fleiss (1971), Conger (1980), Randolph (2005), and Gwet (2008a). We also examine how sensitive each of the four statistics is to the marginal probability distribution, that is, to whether marginal balancedness and/or marginal homogeneity holds. The performance of the four methods is compared in terms of relative bias and coverage rate through simulation studies under various scenarios with different numbers of raters, subjects, and categories. A real data example is also presented to illustrate the four methods.
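For illustration, the Python sketch below computes two of the compared estimators, Fleiss' (1971) kappa and Randolph's (2005) free-marginal kappa, from a subjects-by-categories count table. It is a minimal sketch of the standard textbook formulas, not code from the paper; the function names and the toy data are invented for this example, and every subject is assumed to be rated by the same number of raters.

import numpy as np

def fleiss_kappa(counts):
    # counts[i, j] = number of raters assigning subject i to category j;
    # each subject is assumed to be rated by the same number of raters n.
    counts = np.asarray(counts, dtype=float)
    N, k = counts.shape
    n = counts.sum(axis=1)[0]
    # Per-subject observed agreement, averaged over subjects
    P_i = (np.sum(counts ** 2, axis=1) - n) / (n * (n - 1))
    P_bar = P_i.mean()
    # Chance agreement from pooled marginal category proportions
    p_j = counts.sum(axis=0) / (N * n)
    P_e = np.sum(p_j ** 2)
    return (P_bar - P_e) / (1 - P_e)

def randolph_kappa(counts):
    # Free-marginal variant: chance agreement is fixed at 1/k.
    counts = np.asarray(counts, dtype=float)
    N, k = counts.shape
    n = counts.sum(axis=1)[0]
    P_bar = ((np.sum(counts ** 2, axis=1) - n) / (n * (n - 1))).mean()
    return (P_bar - 1.0 / k) / (1 - 1.0 / k)

# Toy data: 5 subjects, 4 raters, 3 categories (rows sum to 4)
ratings = [[4, 0, 0],
           [2, 2, 0],
           [1, 1, 2],
           [0, 4, 0],
           [3, 0, 1]]
print(fleiss_kappa(ratings), randolph_kappa(ratings))

Because the two estimators differ only in how chance agreement is defined, running them on the same count table makes the sensitivity to the marginal distribution discussed above directly visible.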
Pages: 719-731
Number of pages: 13
Related papers (10 of 50 shown)
  • [1] Interpretation of Kappa and B statistics measures of agreement
    Munoz, SR
    Bangdiwala, SI
    JOURNAL OF APPLIED STATISTICS, 1997, 24 (01) : 105 - 111
  • [2] TABLES OF GENERALIZED KAPPA-STATISTICS
    ABDELATY, SH
    BIOMETRIKA, 1954, 41 (1-2) : 253 - 260
  • [3] Assessing agreement with multiple raters on correlated kappa statistics
    Cao, Hongyuan
    Sen, Pranab K.
    Peery, Anne F.
    Dellon, Evan S.
    BIOMETRICAL JOURNAL, 2016, 58 (04) : 935 - 943
  • [4] Regard to assessing agreement between two raters with kappa statistics
    Yu, Tianfei
    Ren, Bingrui
    Li, Ming
    INTERNATIONAL JOURNAL OF CARDIOLOGY, 2024, 403
  • [5] Response to: Regard to assessing agreement between two raters with kappa statistics
    Paratz, Elizabeth D.
    Stub, Dion
    Sutherland, Nigel
    Gutman, Sarah
    La Gerche, Andre
    Mariani, Justin
    Taylor, Andrew
    Ellims, Andris
    INTERNATIONAL JOURNAL OF CARDIOLOGY, 2024, 404
  • [6] Kappa statistics applied to evaluate the intra- and interexaminers agreement.
    Neves, LHM
    Dini, EL
    Brandao, IMG
    JOURNAL OF DENTAL RESEARCH, 1997, 76 (05) : 965 - 965
  • [7] Generalized Agreement Statistics over Fixed Group of Experts
    Shah, Mohak
    MACHINE LEARNING AND KNOWLEDGE DISCOVERY IN DATABASES, PT III, 2011, 6913 : 191 - 206
  • [8] Interrater Agreement Statistics With Skewed Data: Evaluation of Alternatives to Cohen's Kappa
    Xu, Shu
    Lorber, Michael F.
    JOURNAL OF CONSULTING AND CLINICAL PSYCHOLOGY, 2014, 82 (06) : 1219 - 1227
  • [9] The statistical analysis of kappa statistics in multiple samples
    Donner, A
    Klar, N
    JOURNAL OF CLINICAL EPIDEMIOLOGY, 1996, 49 (09) : 1053 - 1058
  • [10] Use of kappa statistics in the analysis of research data
    Asokan, GV
    Prabhakaran, V
    Doraiswami, J
    INDIAN VETERINARY JOURNAL, 2001, 78 (12): 51 - 53