Dispersion of $X$ about its conditional mean decreases as the $\sigma$-algebra grows


I am just starting to learn some probability theory, so I apologize in advance if this is a trivial question.

Suppose $E[X^2] < \infty$ and define $Var(X|G) = E[(X - E[X|G])^2 \,|\, G]$.

Prove that the dispersion of $X$ about its conditional mean decreases as the $\sigma$-algebra grows. Namely, show that for any two $\sigma$-algebras $G_1 \subset G_2$, we have $E[Var(X|G_2)] \le E[Var(X|G_1)]$.

My attempt:

1. I tried to use the law of total variance, $Var[X] = E[Var(X|G)] + Var[E(X|G)]$, and to compare $Var[E(X|G_1)]$ with $Var[E(X|G_2)]$, but I couldn't find a rigorous argument for which of the two is greater (one way to finish this is sketched just after this list).

2. I tried to write the expectations out explicitly and use something similar to the Radon-Nikodym theorem, but again I could not finish the argument.
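For the record, here is a sketch of how attempt 1 can be finished; the only ingredients are the total-variance identity quoted above and the tower property $E[X|G_1] = E\big[E[X|G_2] \,\big|\, G_1\big]$. Applying the total-variance identity to $E(X|G_2)$ in place of $X$, with conditioning on $G_1$, and then using the tower property gives

$$Var[E(X|G_1)] = Var\big[E\big(E(X|G_2) \,\big|\, G_1\big)\big] \le Var[E(X|G_2)],$$

since the discarded term $E\big[Var\big(E(X|G_2) \,\big|\, G_1\big)\big]$ is nonnegative. Subtracting both sides from $Var[X]$ and using the identity once for $G_1$ and once for $G_2$ yields

$$E[Var(X|G_2)] = Var[X] - Var[E(X|G_2)] \le Var[X] - Var[E(X|G_1)] = E[Var(X|G_1)].$$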
Answer:

Since $X$ is square-integrable, we can interpret everything inside the Hilbert space $L^2(\Omega,\mathcal{F},\mathbb{P})$. The conditional expectation $\mathbb{E}[\,\cdot\mid\mathcal{G}]$, where $\mathcal{G}\subseteq\mathcal{F}$, is then the orthogonal projection $L^2(\Omega,\mathcal{F},\mathbb{P})\to L^2(\Omega,\mathcal{G},\mathbb{P}\vert_\mathcal{G})$.
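Concretely, and just restating the standard characterization in this language: $Y=\mathbb{E}[X\mid\mathcal{G}]$ is the unique element of $L^2(\Omega,\mathcal{G},\mathbb{P}\vert_\mathcal{G})$ satisfying

$$\mathbb{E}\big[(X-Y)Z\big]=0\qquad\text{for all }Z\in L^2(\Omega,\mathcal{G},\mathbb{P}\vert_\mathcal{G}),$$

i.e. $X-Y\perp L^2(\mathcal{G})$, which is exactly what it means for $Y$ to be the orthogonal projection of $X$ onto $L^2(\mathcal{G})$.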

Thus $X-\mathbb{E}[X\mid\mathcal{G}]$ is the orthogonal projection of $X$ onto the orthogonal complement $L^2(\mathcal{G})^\perp$, so its $L^2$-norm is the distance from $X$ to the subspace $L^2(\mathcal{G})$. This distance cannot increase as $\mathcal{G}$ grows, because $\mathcal{G}_1\subseteq\mathcal{G}_2$ implies $L^2(\mathcal{G}_1)\subseteq L^2(\mathcal{G}_2)$ and orthogonal projections are closest-point projections. Finally, by the tower property, $\mathbb{E}[\mathrm{Var}(X\mid\mathcal{G})]=\mathbb{E}\big[(X-\mathbb{E}[X\mid\mathcal{G}])^2\big]=\|X-\mathbb{E}[X\mid\mathcal{G}]\|_{L^2}^2$, so the desired inequality is precisely this statement about distances.
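Written out as one chain, using only the fact that the orthogonal projection onto a closed subspace realizes the distance to it, together with the identity $\mathbb{E}[\mathrm{Var}(X\mid\mathcal{G})]=\|X-\mathbb{E}[X\mid\mathcal{G}]\|_{L^2}^2$ from above:

$$\mathbb{E}[\mathrm{Var}(X\mid\mathcal{G}_2)]=\inf_{Z\in L^2(\mathcal{G}_2)}\|X-Z\|_{L^2}^2\;\le\;\inf_{Z\in L^2(\mathcal{G}_1)}\|X-Z\|_{L^2}^2=\mathbb{E}[\mathrm{Var}(X\mid\mathcal{G}_1)],$$

where the inequality holds because $L^2(\mathcal{G}_1)\subseteq L^2(\mathcal{G}_2)$, so the infimum on the left runs over at least as many candidates.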