
Cohen's Kappa
Cohen's Kappa is a statistic that measures how much two raters agree in their assessments or classifications beyond what would be expected by chance. Because it corrects for chance agreement, it gives a more accurate picture of true agreement than a raw percent-agreement figure. A Kappa of 1 indicates perfect agreement, 0 means agreement is no better than random chance, and negative values indicate agreement worse than chance, i.e. systematic disagreement. It's useful in areas like medical diagnosis, quality control, or survey analysis when evaluating consistency between raters. Overall, it helps determine whether different observers are interpreting data reliably and making similar evaluations.
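In symbols, if p_o is the observed proportion of agreement and p_e is the proportion of agreement expected by chance (estimated from each rater's own label frequencies), then

    kappa = (p_o - p_e) / (1 - p_e)

As a minimal sketch of the computation, here is a small Python function; the name cohens_kappa and the example ratings are illustrative, not taken from any particular library:

    from collections import Counter

    def cohens_kappa(rater_a, rater_b):
        """Cohen's kappa for two equal-length lists of labels."""
        assert len(rater_a) == len(rater_b)
        n = len(rater_a)
        # Observed agreement: fraction of items both raters labeled identically.
        p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
        # Expected agreement: for each label, the chance that both raters pick
        # it independently, based on their individual label frequencies.
        freq_a = Counter(rater_a)
        freq_b = Counter(rater_b)
        p_e = sum(freq_a[label] * freq_b[label] for label in freq_a) / (n * n)
        return (p_o - p_e) / (1 - p_e)

    # Example: two raters classify 10 items as "pass" or "fail".
    a = ["pass", "pass", "fail", "pass", "fail", "pass", "pass", "fail", "pass", "pass"]
    b = ["pass", "fail", "fail", "pass", "fail", "pass", "pass", "pass", "pass", "pass"]
    print(round(cohens_kappa(a, b), 3))  # 0.474: moderate agreement beyond chance

Here the raters agree on 8 of 10 items (p_o = 0.8), but their label frequencies alone would produce p_e = 0.62 agreement by chance, so kappa lands well below the raw 80% figure. For real projects, scikit-learn's sklearn.metrics.cohen_kappa_score computes the same statistic.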