Question on James Norris' Notes on Advanced Probability


This result is stated in the notes (see, for instance, D. Chua's notes for the course taught by M. Lis, which were based on the same class), and I don't understand why it is true.

Basically, suppose $(X, W)$ are jointly Gaussian random variables, and our goal (quoting D. Chua's notes) is to compute the conditional expectation $E(X|W)$. Suppose there is a random variable $Y$ such that $EX = EY$, $X - Y$ is independent of $W$, and $Y$ is $\sigma(W)$-measurable. Then $Y = E(X|W)$, since $E[(X - Y)1_A] = 0$ for all $A \in \sigma(W)$.

My question is: I know we are supposed to use the fact that $X - Y$ is independent of $W$, together with $EX = EY$, to show that $E[(X - Y)1_A] = 0$ for all $A \in \sigma(W)$, where $1_A$ is the indicator function of $A$, but I cannot figure out how to do it.

1 Answer

For your question: $1_A$ is independent of $X - Y$, so $E[(X-Y)1_A] = E(X-Y)\,P(A) = (EX - EY)\,P(A) = 0$.
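This step can be sanity-checked numerically. The construction below is an illustrative sketch of my own (the choices of $W$, $Y$, the noise term, and the event $A$ are not from the notes): build a $\sigma(W)$-measurable $Y$ with $X - Y$ independent of $W$ and $EX = EY$, then verify that $E[(X-Y)1_A]$ and $E(X-Y)P(A)$ are both approximately $0$.

```python
import numpy as np

# Illustrative construction, not from the original answer:
# X - Y = Z is independent of W with mean 0, and Y is sigma(W)-measurable.
rng = np.random.default_rng(1)
n = 1_000_000

W = rng.normal(size=n)
Z = rng.normal(size=n)   # independent of W, mean 0; plays the role of X - Y
Y = 3.0 * W              # any sigma(W)-measurable variable
X = Y + Z                # then EX = EY and X - Y is independent of W

A = W > 0.5              # an event in sigma(W); A acts as the indicator 1_A

lhs = ((X - Y) * A).mean()          # Monte Carlo estimate of E[(X - Y) 1_A]
rhs = (X - Y).mean() * A.mean()     # estimate of E(X - Y) P(A)

print(f"E[(X-Y)1_A] ~ {lhs:.4f}")
print(f"E(X-Y)P(A)  ~ {rhs:.4f}")
```

Both estimates are close to $0$, as the independence argument predicts; any other event $A$ defined through $W$ behaves the same way.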

However, you don't have to go through the definition of conditional expectation to find $E(X|W)$. Choose $a$ such that $E[(X-aW)W]-E(X-aW)\,EW=0$, i.e. $\operatorname{Cov}(X-aW,W)=0$, which gives $a=\operatorname{Cov}(X,W)/\operatorname{Var}(W)$ (assuming $\operatorname{Var}(W)>0$). Then $E(X|W)=E(X-aW|W)+E(aW|W)=EX-aEW+aW$, because $X-aW$ and $W$ are independent. [If jointly normal random variables have covariance $0$, then they are independent.]
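As a numerical sanity check on this formula (a hypothetical simulation with parameters of my own choosing, not from the answer): with $a = \operatorname{Cov}(X,W)/\operatorname{Var}(W)$, the residual $X - aW$ should be uncorrelated with $W$, and the empirical mean of $X$ over a thin slice $\{W \approx w_0\}$ should agree with $EX - aEW + aw_0$.

```python
import numpy as np

# Hypothetical example: W ~ N(1, 1) and X = 2 + 0.5 W + noise,
# so (X, W) are jointly Gaussian and the true coefficient is a = 0.5.
rng = np.random.default_rng(0)
n = 1_000_000

W = rng.normal(loc=1.0, scale=1.0, size=n)
X = 2.0 + 0.5 * W + rng.normal(scale=0.3, size=n)

a = np.cov(X, W)[0, 1] / np.var(W)   # a = Cov(X, W) / Var(W)
resid = X - a * W                    # should be uncorrelated with W

# Compare E[X | W near w0] with the formula EX - a EW + a w0.
w0 = 1.5
mask = np.abs(W - w0) < 0.05
empirical = X[mask].mean()
formula = X.mean() - a * W.mean() + a * w0

print(f"a ~ {a:.3f}")
print(f"Cov(X - aW, W) ~ {np.cov(resid, W)[0, 1]:.5f}")
print(f"E[X | W ~ {w0}]: empirical {empirical:.3f}, formula {formula:.3f}")
```

The two conditional-mean estimates agree to sampling error, illustrating that $EX - aEW + aW$ really is $E(X|W)$ for this jointly Gaussian pair.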