== Kappa statistic evaluation in SPSS ==

SPSS syntax available:

 * [:FAQ/kappa/kappans:Non-square tables where one rater does not give all possible ratings]
 * [:FAQ/kappa/multiple:More than 2 raters]
 * [:FAQ/ad:An inter-rater measure based on Euclidean distances]

'''Note:''' Reliability as defined by correlation coefficients (such as kappa) requires variation in the scores to achieve a determinate result. If a program produces a determinate result when one coder's scores are constant, the bug is in that program, not in SPSS. Each rater must give at least two distinct ratings.

 * [:FAQ/kappa/magnitude:Benchmarks for suggesting what makes a high kappa]

There is also a weighted kappa, which allows different weights to be attached to different misclassifications (a short illustrative sketch follows the reference below). Warrens (2011) shows that weighted kappa is an example of a more general test of randomness.

__Reference__

Warrens MJ (2011). Chance-corrected measures for 2 × 2 tables that coincide with weighted kappa. ''British Journal of Mathematical and Statistical Psychology'' '''64(2)''', 355–365.
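As an illustration of the weighted kappa mentioned above, here is a minimal Python/NumPy sketch (it is not the SPSS syntax linked from this page; the function name and the example table are hypothetical). It computes kappa from a k × k agreement table, reduces to unweighted kappa when every disagreement carries the same weight, and refuses to return a value when a rater uses only one category, in line with the note above.

{{{#!python
import numpy as np

def cohen_kappa(confusion, weights=None):
    """Kappa from a k x k agreement table.

    confusion[i, j] counts subjects rated category i by rater A and
    category j by rater B.  weights=None gives unweighted kappa; pass a
    k x k disagreement-weight matrix (zeros on the diagonal) for
    weighted kappa.
    """
    confusion = np.asarray(confusion, dtype=float)
    k = confusion.shape[0]
    n = confusion.sum()
    row = confusion.sum(axis=1) / n   # rater A's marginal distribution
    col = confusion.sum(axis=0) / n   # rater B's marginal distribution
    # Mirror the note above: each rater must use at least two categories.
    if (row > 0).sum() < 2 or (col > 0).sum() < 2:
        raise ValueError("each rater must give at least two distinct ratings")
    if weights is None:
        weights = 1.0 - np.eye(k)     # every disagreement weighted equally
    p_obs = confusion / n             # observed cell proportions
    p_exp = np.outer(row, col)        # proportions expected under chance
    return 1.0 - (weights * p_obs).sum() / (weights * p_exp).sum()

# Hypothetical agreement table: two raters, three ordered categories.
table = [[20,  5,  1],
         [ 4, 15,  3],
         [ 1,  2,  9]]
print(round(cohen_kappa(table), 3))                  # unweighted kappa, ~0.584

# Linear disagreement weights |i - j|: distant categories penalised more.
linear = np.abs(np.subtract.outer(np.arange(3), np.arange(3))).astype(float)
print(round(cohen_kappa(table, weights=linear), 3))  # weighted kappa, ~0.634
}}}

With linear weights |i − j|, a disagreement between adjacent categories is penalised less than one between distant categories, which is the usual motivation for weighted kappa on ordinal scales.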