Conditional expectation of a random variable given another, conditionally independent variable


I have two variables $X$ and $Y$ that are conditionally independent given a third variable $Z$.

Now, assume that $Z$ takes values in $\{1, \dots, k\}$.

Then I have: $$ E[XY] = \sum_{i=1}^k P[Z = i]E[X|Z = i]E[Y|Z = i] $$
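The identity above can be checked exactly on a small discrete example. The numbers below are hypothetical: $Z \in \{1,2\}$ is fair, and given $Z = i$, $X$ and $Y$ are independent Bernoulli variables with parameters $p_i$ and $q_i$. A minimal sketch:

```python
# Exact check of E[XY] = sum_i P[Z=i] E[X|Z=i] E[Y|Z=i] for a small
# discrete example (hypothetical numbers): Z in {1, 2}, and given Z = i,
# X ~ Bernoulli(p[i]) and Y ~ Bernoulli(q[i]), conditionally independent.
from fractions import Fraction as F

pz = {1: F(1, 2), 2: F(1, 2)}   # P[Z = i]
p = {1: F(1, 5), 2: F(4, 5)}    # P[X = 1 | Z = i]
q = {1: F(3, 10), 2: F(9, 10)}  # P[Y = 1 | Z = i]

# E[XY] by enumerating the full joint distribution of (X, Y, Z).
exy_joint = sum(
    pz[i] * (p[i] if x else 1 - p[i]) * (q[i] if y else 1 - q[i]) * x * y
    for i in pz for x in (0, 1) for y in (0, 1)
)

# E[XY] via the conditional-independence formula from the question.
exy_formula = sum(pz[i] * p[i] * q[i] for i in pz)

print(exy_joint, exy_formula)  # both 39/100
assert exy_joint == exy_formula
```

Exact rational arithmetic avoids any floating-point ambiguity in the comparison.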

Now, can I say the following? $$ E[X | Y] = E[X] $$

I have a proof of this, but I am not sure it is correct: $$ E[X | Y] = \sum_{i=1}^k P[Z = i]E[X|Z = i, Y] = \sum_{i=1}^k P[Z = i]E[X|Z = i] = E[X] $$

Is this correct? If not, where am I wrong?

This was trickier than it looked ... to me. The claim can be put in a more general, compact form as follows:

Let $X,Y$ be conditionally independent given $Z$; that is, $P(X ,Y \mid Z ) = P(X \mid Z) P(Y \mid Z)$. Then $$E[X \mid Y] = E[E[X \mid Y, Z]] = E[E[X \mid Z]] = E[X] \tag{1}$$ where in the first and third equality we have applied total expectation (in the third to $X$, in the first to $X \mid Y$), and in the second we've applied the conditional independence.

This is wrong, of course: it would imply that for any Markov chain $X \to Z \to Y$ the endpoints $X, Y$ are uncorrelated, which is false in general.

The wrong equality is the first one. The law of total expectation actually gives $E[X \mid Y] = E[E[X \mid Y, Z] \mid Y]$, with the outer expectation still conditioned on $Y$, not the unconditional $E[E[X \mid Y, Z]]$. (Informally, "$X \mid Y$" on its own is not a random variable to which total expectation can be applied; it's... nothing.)

The first equality in your proof fails in the same way as my first equality in $(1)$: it is a mistaken application of the law of total expectation. When conditioning on $Y$, the correct weights are $P[Z = i \mid Y]$, not the unconditional $P[Z = i]$.
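A concrete counterexample makes the failure explicit. The numbers are hypothetical: $Z \in \{1,2\}$ is fair, and given $Z = i$, $X$ and $Y$ are conditionally independent Bernoulli variables. The sketch below computes $E[X \mid Y = 1]$ exactly via the corrected decomposition with weights $P[Z = i \mid Y = 1]$, and shows it differs from $E[X]$:

```python
# Counterexample (hypothetical numbers): X and Y conditionally independent
# given Z, yet E[X | Y] != E[X]. Z in {1, 2}, and given Z = i,
# X ~ Bernoulli(p[i]) and Y ~ Bernoulli(q[i]).
from fractions import Fraction as F

pz = {1: F(1, 2), 2: F(1, 2)}   # P[Z = i]
p = {1: F(1, 5), 2: F(4, 5)}    # P[X = 1 | Z = i]
q = {1: F(3, 10), 2: F(9, 10)}  # P[Y = 1 | Z = i]

ex = sum(pz[i] * p[i] for i in pz)                 # E[X] = 1/2
py1 = sum(pz[i] * q[i] for i in pz)                # P[Y = 1] = 3/5
pz_given_y1 = {i: pz[i] * q[i] / py1 for i in pz}  # P[Z = i | Y = 1], by Bayes

# Corrected decomposition: E[X | Y=1] = sum_i P[Z=i | Y=1] E[X | Z=i],
# where E[X | Z=i, Y=1] = E[X | Z=i] holds by conditional independence.
ex_given_y1 = sum(pz_given_y1[i] * p[i] for i in pz)

print(ex, ex_given_y1)  # 1/2 vs 13/20, so E[X | Y=1] != E[X]
assert ex != ex_given_y1
```

Observing $Y = 1$ makes $Z = 2$ more likely, which in turn raises the conditional mean of $X$; that is exactly the dependence the mistaken weights $P[Z = i]$ erase.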