If model $M_1$ has higher accuracy than model $M_2$, will the kappa of $M_1$ always be higher than the kappa of $M_2$?
I want to prove or refute it (a counterexample where increasing accuracy causes a decrease in kappa is enough).
As requested I will reference concepts:
Accuracy: https://en.wikipedia.org/wiki/Evaluation_of_binary_classifiers
Kappa: https://en.wikipedia.org/wiki/Cohen%27s_kappa
Models: https://en.wikipedia.org/wiki/Machine_learning
Both accuracy and kappa are common concepts in machine learning. I just want to prove or refute the claim above.
A new answer since, as @vbn pointed out, I mislabeled my computations. (I could have edited my former answer if I hadn't deleted it too quickly).
The answer is no: a model can have a smaller kappa despite a higher accuracy.
Consider a population of 100 animals, 60 dogs and 40 cats. Say $M_1$ makes 25 correct dog predictions and 35 correct cat predictions, i.e. it labels 25 of the 60 dogs as dogs (the other 35 as cats) and 35 of the 40 cats as cats (the other 5 as dogs). Write $p_o$ for the relative observed agreement among raters, and $p_e$ for the hypothetical probability of chance agreement. One gets $$ p_o(M_1) = \frac{25 + 35}{100} = 0.6 $$ $$ p_e(M_1) = \frac{(25+35)(25+5) + (35+35)(5+35)}{100^2} = 0.46 $$ Hence, $\kappa(M_1) = \frac{p_o(M_1) - p_e(M_1)}{1 - p_e(M_1)} \simeq 0.26$.
Now, $M_2$ makes 45 correct dog predictions and 16 correct cat predictions (so 15 dogs are labeled cats and 24 cats are labeled dogs). The accuracy of $M_2$ is slightly better, but the kappa drops: $$p_o(M_2) = \frac{45 + 16}{100} = 0.61$$ $$p_e(M_2) = \frac{(45+15)(45+24) + (15+16)(24+16)}{100^2} = 0.538$$ Hence, $\kappa(M_2) \simeq 0.16$.
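If you want to verify the arithmetic, here is a minimal Python sketch (the function name and matrix layout are my own choices, not standard API) that computes $p_o$, $p_e$, and kappa directly from each 2x2 confusion matrix:

```python
def accuracy_and_kappa(cm):
    """cm[i][j] = count of actual class i predicted as class j.
    Row 0 = dogs, row 1 = cats; column 0 = predicted dog, column 1 = predicted cat."""
    total = sum(sum(row) for row in cm)
    p_o = (cm[0][0] + cm[1][1]) / total  # observed agreement = accuracy
    actual_dog, actual_cat = sum(cm[0]), sum(cm[1])
    pred_dog = cm[0][0] + cm[1][0]
    pred_cat = cm[0][1] + cm[1][1]
    # chance agreement: product of marginals, summed over classes
    p_e = (actual_dog * pred_dog + actual_cat * pred_cat) / total**2
    return p_o, (p_o - p_e) / (1 - p_e)

m1 = [[25, 35], [5, 35]]   # M1: 60 dogs, 40 cats
m2 = [[45, 15], [24, 16]]  # M2: same population

print(accuracy_and_kappa(m1))  # accuracy 0.6,  kappa ~0.259
print(accuracy_and_kappa(m2))  # accuracy 0.61, kappa ~0.156
```

So $M_2$ wins on accuracy (0.61 vs 0.60) yet loses on kappa (0.16 vs 0.26), which refutes the claim.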