Show that $E(X\mid Y, Z) = E(X\mid Y)$ almost surely when $Z$ is independent of $(X, Y)$


$(X, Y, Z)$ is a continuous random vector and $Z$ is independent of $(X,Y)$. Prove that $E(X\mid Y, Z) = E(X\mid Y)$ almost surely.

I have been thinking about this question tonight but couldn't figure out how to apply the independence condition. I was trying to show that $E(X\mid Y,Z)$ is another version of $E(X\mid Y)$ but came up with nothing.

In more detail, by the definition of conditional expectation as a random variable, \begin{equation*}E((X - E(X\mid Y,Z))\cdot H(Y,Z))=0\end{equation*} for every bounded measurable function $H$. Now I want to show that $E(X\mid Y,Z)$ is another version of $E(X\mid Y)$; then by the uniqueness of conditional expectation I can get the desired result. We can also write \begin{equation*}E((X - E(X\mid Y,Z))\cdot h(Y))=0\end{equation*} for every bounded measurable function $h$, since the functions $h(Y)$ are a special case of the functions $H(Y,Z)$. But $E(X\mid Y,Z)$ is a function of $(Y,Z)$ rather than of $Y$ alone, so it is still not a version of $E(X\mid Y)$. And it occurs to me that I haven't used the independence condition, which is where I got stuck.

I think it remains to show that $E(X\mid Y,Z)$ is a function of $Y$ alone, which should be true intuitively since $Z$ is independent of $(X,Y)$ (i.e., knowing $Z$ contributes nothing to knowing $X$).

But how to do it rigorously?


On BEST ANSWER

Hint: Consider the class $\mathcal H$ of measurable bounded functions $H$ such that $$E((X-E(X\mid Y))\cdot H(Y,Z))=0.$$ By hypothesis, $\mathcal H$ contains every measurable bounded function $H:(y,z)\mapsto h(y)$. On the other hand, the desired conclusion holds if $\mathcal H$ contains every measurable bounded function $H:(y,z)\mapsto H(y,z)$.

Hence, the next step is to expand the collection of functions that we know belong to $\mathcal H$. One might start by studying what happens for measurable bounded functions $H:(y,z)\mapsto h(y)k(z)$.
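To see how the hint plays out for the product case (a sketch filling in the suggested step, not part of the original answer): for $H(y,z)=h(y)k(z)$, the random variable $(X-E(X\mid Y))\,h(Y)$ is a function of $(X,Y)$ and hence independent of $k(Z)$, so the expectation factors:

```latex
\begin{align*}
E\bigl((X - E(X\mid Y))\, h(Y)\, k(Z)\bigr)
  &= E\bigl((X - E(X\mid Y))\, h(Y)\bigr)\cdot E\bigl(k(Z)\bigr)
     && \text{(independence of $(X,Y)$ and $Z$)} \\
  &= 0 \cdot E\bigl(k(Z)\bigr) = 0
     && \text{(defining property of $E(X\mid Y)$).}
\end{align*}
```

Thus $\mathcal H$ contains all products $h(y)k(z)$. Since such products generate the product $\sigma$-algebra $\sigma(Y,Z)$, a functional monotone class argument extends the identity to every bounded measurable $H(y,z)$, and the uniqueness of conditional expectation then gives $E(X\mid Y,Z)=E(X\mid Y)$ almost surely.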