Computing inter-rater reliability and its variance in the presence of high agreement

Bibliographic Details
Published in: British Journal of Mathematical and Statistical Psychology, Vol. 61, No. 1, pp. 29-48
Main Author: Gwet, Kilem Li
Format: Journal Article
Language: English
Published: Oxford, UK: Blackwell Publishing Ltd (British Psychological Society), 01.05.2008
ISSN: 0007-1102, 2044-8317
Description
Summary: Pi (π) and kappa (κ) statistics are widely used in the areas of psychiatry and psychological testing to compute the extent of agreement between raters on nominally scaled data. However, these coefficients occasionally yield unexpected results in situations known as the paradoxes of kappa. This paper explores the origin of these limitations and introduces an alternative, more stable agreement coefficient referred to as the AC1 coefficient. Also proposed are new variance estimators for the multiple-rater generalized π and AC1 statistics, whose validity does not depend upon the hypothesis of independence between raters. This is an improvement over existing alternative variances, which rest on the independence assumption. A Monte Carlo simulation study demonstrates the validity of these variance estimators for confidence interval construction and confirms the value of AC1 as an improved alternative to existing inter-rater reliability statistics.
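
For context, the sketch below implements the standard two-rater forms of the three coefficients the abstract contrasts: Cohen's kappa, Scott's pi, and Gwet's AC1. It is an illustration under assumptions, not the paper's multiple-rater generalization or its variance estimators, and the high-agreement ratings in the example are hypothetical; they merely show the paradox in which kappa and pi drop sharply under skewed marginal distributions while AC1 stays close to the observed agreement.

# A minimal sketch of the standard two-rater forms of Cohen's kappa,
# Scott's pi, and Gwet's AC1 (not the paper's multiple-rater generalization
# or its variance estimators). The ratings below are hypothetical.
from collections import Counter

def agreement_coefficients(ratings_a, ratings_b, categories):
    """Return (kappa, pi, ac1) for two raters on nominally scaled data."""
    n = len(ratings_a)
    q = len(categories)

    # Observed agreement: share of subjects both raters classify identically.
    p_a = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n

    # Marginal classification proportions for each rater, and their average.
    f_a, f_b = Counter(ratings_a), Counter(ratings_b)
    p1 = {k: f_a[k] / n for k in categories}
    p2 = {k: f_b[k] / n for k in categories}
    pi_k = {k: (p1[k] + p2[k]) / 2 for k in categories}

    # Chance-agreement terms; these are what distinguish the three statistics.
    pe_kappa = sum(p1[k] * p2[k] for k in categories)                    # Cohen's kappa
    pe_pi = sum(pi_k[k] ** 2 for k in categories)                        # Scott's pi
    pe_ac1 = sum(pi_k[k] * (1 - pi_k[k]) for k in categories) / (q - 1)  # Gwet's AC1

    def corrected(p_e):
        return (p_a - p_e) / (1 - p_e)

    return corrected(pe_kappa), corrected(pe_pi), corrected(pe_ac1)

# Hypothetical high-agreement data with skewed marginals (18 of 20 agreements):
# kappa and pi fall to about 0.44, while AC1 stays near 0.88, close to the
# observed agreement of 0.90, illustrating the paradox the abstract refers to.
rater_a = ["+"] * 18 + ["-", "-"]
rater_b = ["+"] * 17 + ["-", "+", "-"]
print(agreement_coefficients(rater_a, rater_b, ["+", "-"]))
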
Bibliography: ArticleID: BMSP253
istex: 29A04BD84BF5C2B0E9624DF373427AB30F020AA5
ark:/67375/WNG-15CDWBWT-J
DOI: 10.1348/000711006X126600