I have two variables $X$ and $Y$ that are conditionally independent given a third variable $Z$.
Now, assume that $Z$ can take values in $\{1, \dots, k\}$.
I will have: $$ E[XY] = \sum_{i=1}^k P[Z = i]E[X|Z = i]E[Y|Z = i] $$
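This identity can be verified numerically. The following is a small sanity check under an assumed toy model (not from the question itself): $Z \sim \text{Bernoulli}(1/2)$, and given $Z = z$, the binary variables $X$ and $Y$ are independent, each equal to $z$ with probability $0.9$, so $X \perp Y \mid Z$ holds by construction.

```python
from itertools import product

# Assumed toy model: Z ~ Bernoulli(1/2); given Z = z, X and Y are
# independent, each equal to z with probability 0.9.
p_z = {0: 0.5, 1: 0.5}
def p_given_z(v, z):                 # P[X = v | Z = z] (same kernel for Y)
    return 0.9 if v == z else 0.1

# Joint distribution P[X = x, Z = z, Y = y], exploiting X ⊥ Y | Z
joint = {(x, z, y): p_z[z] * p_given_z(x, z) * p_given_z(y, z)
         for x, z, y in product([0, 1], repeat=3)}

# Left side: E[XY] computed from the full joint distribution
e_xy = sum(x * y * p for (x, z, y), p in joint.items())

# Right side: sum_i P[Z = i] E[X | Z = i] E[Y | Z = i]
# (for binary X, E[X | Z = z] = P[X = 1 | Z = z])
rhs = sum(p_z[z] * p_given_z(1, z) * p_given_z(1, z) for z in (0, 1))

print(abs(e_xy - rhs) < 1e-12)   # True
```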
Now, can I say what follows? $$ E[X | Y] = E[X] $$
I have a proof of this, but I am not sure it is correct: $$ E[X | Y] = \sum_{i=1}^k P[Z = i]E[X|Z = i, Y] = \sum_{i=1}^k P[Z = i]E[X|Z = i] = E[X] $$
Is this correct? If not, where am I wrong?
This was trickier than it looked... to me. The claim can be put in a more general, compact form thus: $$ E[X \mid Y] = E_Z\big[E[X \mid Z, Y]\big] = E_Z\big[E[X \mid Z]\big] = E[X] \tag{1} $$
This is wrong, of course (it would imply that for any Markov chain $X \to Z \to Y$ the endpoints $X, Y$ are uncorrelated, which is false in general).
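Here is a concrete numerical counterexample under an assumed toy Markov chain (the $0.9$ transition kernel is just an illustrative choice): $Z \sim \text{Bernoulli}(1/2)$, and given $Z = z$, $X$ and $Y$ are independent, each equal to $z$ with probability $0.9$. Then $X \perp Y \mid Z$, yet $E[X \mid Y = 1] \neq E[X]$:

```python
from itertools import product

# Assumed toy Markov chain X <- Z -> Y (equivalently X -> Z -> Y):
# Z ~ Bernoulli(1/2); given Z = z, X = z with prob 0.9, and
# independently Y = z with prob 0.9, so X ⊥ Y | Z by construction.
p_z = {0: 0.5, 1: 0.5}
def p_given_z(v, z):                 # P[X = v | Z = z] (same kernel for Y)
    return 0.9 if v == z else 0.1

joint = {(x, z, y): p_z[z] * p_given_z(x, z) * p_given_z(y, z)
         for x, z, y in product([0, 1], repeat=3)}

e_x = sum(x * p for (x, z, y), p in joint.items())
p_y1 = sum(p for (x, z, y), p in joint.items() if y == 1)
e_x_given_y1 = sum(x * p for (x, z, y), p in joint.items() if y == 1) / p_y1

print(round(e_x, 6))            # 0.5
print(round(e_x_given_y1, 6))   # 0.82  -> E[X | Y = 1] != E[X]
```

Observing $Y = 1$ makes $Z = 1$ more likely, which in turn makes $X = 1$ more likely, so conditioning on $Y$ does shift the expectation of $X$.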
The wrong equality is the first one, because "$X \mid Y$" on its own is not a random variable. Actually it denotes... nothing; only the full expression $E[X \mid Y]$ is defined.
In your question, the first equality of your "proof" is (like my first equality in $(1)$) wrong; it is a mistaken application of the law of total expectation. Conditioning on $Y$, the tower rule gives $E[X \mid Y] = \sum_{i=1}^k P[Z = i \mid Y]\, E[X \mid Z = i, Y]$, so the weights must be $P[Z = i \mid Y]$, not $P[Z = i]$.
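A quick check of the corrected decomposition, using the same assumed toy model as above ($Z \sim \text{Bernoulli}(1/2)$; given $Z = z$, $X$ and $Y$ independent, each equal to $z$ with probability $0.9$): weighting $E[X \mid Z = i]$ by $P[Z = i \mid Y]$ recovers $E[X \mid Y]$, while weighting by $P[Z = i]$ only recovers $E[X]$.

```python
from itertools import product

# Same assumed toy model: Z ~ Bernoulli(1/2); given Z = z, X and Y are
# independent, each equal to z with probability 0.9.
p_z = {0: 0.5, 1: 0.5}
def p_given_z(v, z):
    return 0.9 if v == z else 0.1

joint = {(x, z, y): p_z[z] * p_given_z(x, z) * p_given_z(y, z)
         for x, z, y in product([0, 1], repeat=3)}

p_y1 = sum(p for (x, z, y), p in joint.items() if y == 1)
e_x_given_y1 = sum(x * p for (x, z, y), p in joint.items() if y == 1) / p_y1
e_x_given_z = {z: p_given_z(1, z) for z in (0, 1)}               # E[X | Z = z]
p_z_given_y1 = {z: p_z[z] * p_given_z(1, z) / p_y1 for z in (0, 1)}

# Correct tower rule: weight E[X | Z = i] by P[Z = i | Y = 1] ...
correct = sum(p_z_given_y1[z] * e_x_given_z[z] for z in (0, 1))
# ... versus the mistaken weighting by the unconditional P[Z = i]
wrong = sum(p_z[z] * e_x_given_z[z] for z in (0, 1))

print(abs(correct - e_x_given_y1) < 1e-12)   # True: recovers E[X | Y = 1]
print(round(wrong, 6))                       # 0.5 = E[X], not E[X | Y = 1]
```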