Say that $X$ is a random variable with a known mean. Define the indicator variable $I$ by $I=1$ if $X\leq a$ and $I=0$ if $X> a$. How do I prove that their covariance has to be negative?
I began from the formula for covariance and calculated, $$ \operatorname{Cov}(X,I)=E(XI)-\mu_X E(I)-\mu_I E(X)+\mu_X\mu_I\\ =E(XI)-E(X)P(X\leq a)-E(X)P(X\leq a)+E(X)P(X\leq a)\\ =E(XI)-E(X)P(X\leq a) $$ I cannot prove that the final expression is negative, and that is where I am stuck. Any hints or tips welcome!
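Not a proof, but a quick Monte Carlo sanity check can at least confirm the claimed sign. This sketch picks an arbitrary distribution for $X$ (standard normal) and an arbitrary threshold $a$; both choices are mine, not part of the question:

```python
import numpy as np

rng = np.random.default_rng(0)
a = 0.5
x = rng.normal(size=100_000)        # X ~ N(0, 1); an arbitrary choice for illustration
i = (x <= a).astype(float)          # I = 1 if X <= a, else 0

# Sample version of E(XI) - E(X) P(X <= a)
cov = np.mean(x * i) - np.mean(x) * np.mean(i)
print(cov)  # should come out negative, matching Cov(X, I) <= 0
```

Repeating this with other distributions and thresholds keeps producing a nonpositive value, which is what the proof should establish in general.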
A first hint: in your last equality, use the standard decomposition $$X=X\cdot1_{X\leq a}+X\cdot1_{X> a}$$
where $$1_{X\leq a}=\begin{cases}1 \text{ if }X\leq a\\0\text{ else }\end{cases}$$ (it is your $I$) and $1_{X> a}$ is defined accordingly.
Edit: In addition, you can always use this kind of inequality: $$X\cdot1_{X\leq a}\leq a\cdot1_{X\leq a}$$ and so $$\mathbb{E}[X\cdot1_{X\leq a}]\leq a\mathbb{P}(X\leq a)$$ Or $$X\cdot1_{X> a}> a\cdot1_{X> a}$$ and so $$\mathbb{E}[X\cdot1_{X> a}]> a\mathbb{P}(X> a)$$
(you don't need both in your case)
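For completeness, here is one possible way to finish from these hints (this particular chain uses both bounds, though as noted a shorter route exists). Substituting the decomposition of $X$ into $E(XI)-E(X)P(X\leq a)$ and writing $1-P(X\leq a)=P(X>a)$:

```latex
\begin{align*}
\operatorname{Cov}(X,I)
&= \mathbb{E}[X\cdot 1_{X\leq a}]
   - \big(\mathbb{E}[X\cdot 1_{X\leq a}] + \mathbb{E}[X\cdot 1_{X> a}]\big)\,\mathbb{P}(X\leq a)\\
&= \mathbb{E}[X\cdot 1_{X\leq a}]\,\mathbb{P}(X> a)
   - \mathbb{E}[X\cdot 1_{X> a}]\,\mathbb{P}(X\leq a)\\
&\leq a\,\mathbb{P}(X\leq a)\,\mathbb{P}(X> a)
   - a\,\mathbb{P}(X> a)\,\mathbb{P}(X\leq a) = 0.
\end{align*}
```

Note the inequality is strict whenever $0<\mathbb{P}(X\leq a)<1$, since then $\mathbb{E}[X\cdot 1_{X>a}]>a\,\mathbb{P}(X>a)$ strictly; in degenerate cases the covariance can be $0$.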