A Study on Comparison of Generalized Kappa Statistics in Agreement Analysis

Kim Min-Seon ; Song Ki-Jun ; Nam Chung-Mo ; Jung In-Kyung
Korean Journal of Applied Statistics (응용통계연구), Vol.25(5) : 719-731, 2012
Issue Date
2012
Keywords
Agreement ; generalized kappa ; marginal probability distribution
Agreement analysis is conducted to assess reliability among rating results obtained repeatedly on the same subjects by one or more raters. The kappa statistic is commonly used when rating scales are categorical. The simple and weighted kappa statistics measure the degree of agreement between two raters, while generalized kappa statistics measure the degree of agreement among more than two raters. In this paper, we compare the performance of four generalized kappa statistics proposed by Fleiss (1971), Conger (1980), Randolph (2005), and Gwet (2008a). We also examine how sensitive each of the four generalized kappa statistics is to the marginal probability distribution, depending on whether marginal balancedness and/or homogeneity hold. The performance of the four methods is compared in terms of relative bias and coverage rate through simulation studies in various scenarios with different numbers of raters, subjects, and categories. A real data example is also presented to illustrate the four methods.
Files in This Item:
T201203721.pdf
Appears in Collections:
1. College of Medicine (의과대학) > Dept. of Biomedical Systems Informatics (의생명시스템정보학교실) > 1. Journal Papers
1. College of Medicine (의과대학) > Dept. of Preventive Medicine and Public Health (예방의학교실) > 1. Journal Papers
Yonsei Authors
Nam, Chung Mo(남정모) https://orcid.org/0000-0003-0985-0928
Song, Ki Jun(송기준) https://orcid.org/0000-0003-2505-4112
Jung, Inkyung(정인경) https://orcid.org/0000-0003-3780-3213

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.