I know that $E(X) = E[E(X|Y)]$, but what if you start with $E(X|Y)$?
Is $E(X|Y) = E[E(X|Y,Z)]$ or is $E(X|Y) = E[E(X|Y) | Z]$? Or are both statements equivalent?
Both equations you wrote are false in general.
$E[X\mid Y]\neq E[E[X\mid Y,Z]]$. Instead, $\color{green}{E[X]}=E[E[X\mid Y,Z]]$. In general, the law of total expectation says that $$E[E[X\mid \text{anything}\,]]=E[X]$$
$E[E[X|Y]\mid Z]\neq E[X|Y]$. The two sides are not related at all in general, since $E[E[X|Y]\mid Z]$ is a function of $Z$, while $E[X|Y]$ is a function of $Y$.
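To see this concretely, here is a small sketch with a toy example of my own (not from the question): $Y,Z$ independent fair coins and $X=Y+Z$. Then $E[X\mid Y=y]=y+\tfrac12$, a non-constant function of $Y$, while $E[E[X\mid Y]\mid Z=z]$ comes out constant in $z$ (since $Y\perp Z$), so the two sides cannot coincide.

```python
from fractions import Fraction

half = Fraction(1, 2)

# Toy example: Y, Z independent fair coins, X = Y + Z.
# Joint pmf of (X, Y, Z) as a dict {(x, y, z): probability}.
p = {(y + z, y, z): half * half for y in (0, 1) for z in (0, 1)}

def P(i, v):
    """P(coordinate i equals v), where 0 -> X, 1 -> Y, 2 -> Z."""
    return sum(q for s, q in p.items() if s[i] == v)

def cond_exp_X(i, v):
    """E[X | coordinate i = v]."""
    num = sum(q * s[0] for s, q in p.items() if s[i] == v)
    return num / P(i, v)

# E[X | Y] is the function y -> y + 1/2:
g = {y: cond_exp_X(1, y) for y in (0, 1)}

# E[E[X|Y] | Z=z] averages g(Y) over the law of Y given Z=z;
# since Y and Z are independent this is E[g(Y)] = 1 for every z.
h = {z: sum(g[y] * sum(q for s, q in p.items() if s[1] == y and s[2] == z) / P(2, z)
            for y in (0, 1))
     for z in (0, 1)}

assert g == {0: Fraction(1, 2), 1: Fraction(3, 2)}   # genuinely depends on y
assert h == {0: 1, 1: 1}                             # constant in z
```

So $E[E[X\mid Y]\mid Z]$ collapses to the constant $1$ here, while $E[X\mid Y]$ takes the values $\tfrac12$ and $\tfrac32$.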
One thing you can say is that $$ E[E[X\mid Y,Z]\mid Z]=E[X\mid Z]\tag{1} $$ This follows from a general fact about $\sigma$-algebras $ \newcommand{\s}{\mathcal{F}_{\text{small}}} \newcommand{\b}{\mathcal{F}_{\text{big}}}\s$ and $\b$ with $\s \subseteq \b$: $$ E[E[X\mid \s]\mid \b]=E[E[X \mid \b]\mid \s]=E[X\mid \s] $$ (The first equality holds because $E[X\mid\s]$ is already $\b$-measurable; the second is the tower property.) You use this to prove $(1)$ by letting $\s=\sigma(Z)$ and $\b=\sigma(Y,Z)$.
Alternatively, we can give a proof of $(1)$ in the special case where $(X,Y,Z)$ are jointly continuous with pdf $f(x,y,z)$. Write $f_Z(z)=\iint f(x,y,z)\,dx\,dy$ and $f_{X,Z}(x,z)=\int f(x,y,z)\,dy$ for the marginals. The inner conditional expectation is $$ E[X\mid Y=y,Z=z] ={\int x\cdot f(x,y,z)\,dx\over \int f(x,y,z)\,dx}, $$ and averaging it over the conditional density of $Y$ given $Z=z$, which is $\int f(x,y,z)\,dx\,\big/\,f_Z(z)$, gives \begin{align} E[\color{blue}{E[X\mid Y,Z=z]}\mid Z=z] &=\int \color{blue}{{\int x\cdot f(x,y,z)\,dx\over \int f(x,y,z)\,dx}} \cdot {\int f(x,y,z)\,dx\over f_Z(z)}\,dy \\ &={1\over f_Z(z)}\iint x\cdot f(x,y,z)\,dx\,dy \\ &=\int x\cdot {f_{X,Z}(x,z)\over f_Z(z)}\,dx \\&=E[X\mid Z=z] \end{align} You can give a similar proof in the case where $X,Y,Z$ are jointly discrete, with a joint probability mass function $f(x,y,z)=P(X=x,Y=y,Z=z)$ for $(x,y,z)$ ranging over some countable support set. Basically, you do this by replacing each $\int$ with a $\sum$ in the proof above.
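As a numerical sanity check of $(1)$ in the discrete case, here is a sketch that builds an arbitrary joint pmf on $\{0,1\}^3$ (random weights, my own construction) and verifies $E[E[X\mid Y,Z]\mid Z=z]=E[X\mid Z=z]$ exactly for each $z$:

```python
import itertools
import random

random.seed(0)

# A generic joint pmf for (X, Y, Z) on {0,1}^3 with random positive weights.
support = list(itertools.product([0, 1], repeat=3))
w = [random.random() for _ in support]
total = sum(w)
p = {xyz: wi / total for xyz, wi in zip(support, w)}

def P(cond):
    """P(conditions), where cond maps coordinate (0=X, 1=Y, 2=Z) to a value."""
    return sum(q for s, q in p.items() if all(s[i] == v for i, v in cond.items()))

def E_X_given(cond):
    """E[X | conditions]."""
    num = sum(q * s[0] for s, q in p.items() if all(s[i] == v for i, v in cond.items()))
    return num / P(cond)

for z in [0, 1]:
    # LHS of (1): average E[X | Y=y, Z=z] over the conditional law of Y given Z=z.
    lhs = sum(E_X_given({1: y, 2: z}) * P({1: y, 2: z}) / P({2: z}) for y in [0, 1])
    rhs = E_X_given({2: z})   # RHS of (1): E[X | Z=z]
    assert abs(lhs - rhs) < 1e-12
```

Since the weights are arbitrary, this mirrors the $\sum$ version of the proof above: summing the inner conditional expectation against $P(Y=y\mid Z=z)$ collapses to $E[X\mid Z=z]$.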