Abstract
A previously described coefficient of agreement for nominal scales, kappa, treats all disagreements equally. A generalization to weighted kappa (Kw) is presented. The Kw provides for the incorporation of ratio-scaled degrees of disagreement (or agreement) to each of the cells of the k × k table of joint nominal scale assignments.
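As a rough illustration of the statistic described above, here is a minimal sketch of weighted kappa computed from a k × k table of joint assignments, assuming disagreement weights (zero on the diagonal, larger for graver disagreements). The function name, the quadratic weighting scheme, and the example table are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def weighted_kappa(confusion, weights):
    """Weighted kappa (Kw) from a k x k table of joint assignments.

    confusion[i, j] counts subjects placed in category i by one rater
    and category j by the other; weights[i, j] is the disagreement
    weight assigned to that cell (0 on the diagonal).
    """
    confusion = np.asarray(confusion, dtype=float)
    p = confusion / confusion.sum()                    # observed joint proportions
    expected = np.outer(p.sum(axis=1), p.sum(axis=0))  # chance proportions from the marginals
    q_obs = (weights * p).sum()                        # observed weighted disagreement
    q_exp = (weights * expected).sum()                 # chance-expected weighted disagreement
    return 1.0 - q_obs / q_exp

# Hypothetical example: 3 categories with quadratic disagreement weights,
# so a two-category disagreement counts four times a one-category one.
k = 3
w = np.array([[(i - j) ** 2 for j in range(k)] for i in range(k)])
table = np.array([[20, 5, 1],
                  [4, 15, 3],
                  [1, 2, 9]])
print(weighted_kappa(table, w))
```

With all off-diagonal weights equal, this quantity reduces to the unweighted kappa, since the weighted observed and chance disagreements become 1 - p_o and 1 - p_e.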
Related Publications
- Observer Reliability and Agreement
- Causal Variables, Indicator Variables and Measurement Scales: An Example from Quality of Life
Publication Info
- Year: 1968
- Type: Article
- Volume: 70
- Issue: 4
- Pages: 213-220
- Citations: 8225
- Access: Closed
Identifiers
- DOI: 10.1037/h0026256
- PMID: 19673146