
Agreement for Categorical Data: Cohen's Kappa

Kalantri et al. studied the accuracy and reliability of pallor as a tool for detecting anemia.[5] They concluded that "clinical assessment of pallor can rule out severe anemia and modestly rule in anemia." However, inter-observer agreement for detecting pallor was very poor (kappa = 0.07 for conjunctival pallor and 0.20 for tongue pallor), which means that pallor is an unreliable sign for diagnosing anemia.

Cohen's kappa measures the agreement between two raters who each classify N items into C mutually exclusive categories. It is defined as κ = (p_o − p_e) / (1 − p_e), where p_o is the observed proportion of agreement between the raters and p_e is the proportion of agreement expected by chance. Weighted kappa allows disagreements to be weighted differently[21] and is particularly useful when the categories are ordered.
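
To make the definition concrete, the following is a minimal Python sketch (not part of the original source) that computes Cohen's kappa from two raters' label lists; the function name cohens_kappa and the example ratings are illustrative assumptions, not data from the Kalantri study.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters labelling the same N items."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    # p_o: observed proportion of items on which the raters agree.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # p_e: agreement expected by chance, from each rater's marginal label frequencies.
    freq_a = Counter(rater_a)
    freq_b = Counter(rater_b)
    p_e = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical example: two observers rating 10 patients for pallor.
a = ["pallor", "pallor", "no", "no", "pallor", "no", "no", "pallor", "no", "no"]
b = ["pallor", "no", "no", "no", "pallor", "no", "pallor", "pallor", "no", "no"]
print(cohens_kappa(a, b))
```

A kappa near 1 indicates strong agreement beyond chance, a value near 0 indicates agreement no better than chance, and low values such as those reported above (0.07 and 0.20) indicate poor reliability.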
