Wrong Cohen's kappa obtained from scorer node.


Using the attached confusion matrix I compute a kappa value of 0.51: the observed accuracy is 0.76 and the expected (chance) accuracy is 0.51. However, the Scorer node reports kappa = -3.009.
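For reference, here is how I checked the value by hand from the two accuracies quoted above (a minimal sketch using the standard kappa formula; the attached confusion matrix itself is not reproduced here):

```python
# Cohen's kappa from observed accuracy (p_o) and expected/chance accuracy (p_e):
#   kappa = (p_o - p_e) / (1 - p_e)
# Values taken from the post above.
p_o = 0.76  # observed accuracy
p_e = 0.51  # expected accuracy under chance agreement

kappa = (p_o - p_e) / (1 - p_e)
print(round(kappa, 2))  # → 0.51
```

So the manual calculation agrees with 0.51, nowhere near -3.009. Note also that kappa is bounded above by 1 and cannot go below -1, so -3.009 is impossible for any confusion matrix.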

What is wrong? Is there a bug in the Scorer node's Cohen's kappa calculation?




Yes, there was a bug in the computation of Cohen's kappa for certain datasets. It will be fixed in 2.12.1 (to be released soon). Thanks for reporting it.

Cheers, gabor