Conditional Expectation and Variance inequality


I'm stuck trying to prove an inequality: let $X \in \mathcal{L}^{2}$; then

$$\mathbb{E}[\vert X-\mathbb{E}[X\mid \mathcal{G}]\vert^{2}] \leq \mathbb{E}[\vert X - \mathbb{E}[X]\vert^{2}].$$

My attempts:

First I (mistakenly?) thought that $\mathbb{E}[\vert X-\mathbb{E}[X]\vert^{2}] = \operatorname{Var}(X)$ and tried to use the law of total variance, without success. I have already proved that if $\mathcal{G}\subset \mathcal{F}$ and $\mathbb{E}[X^{2}]<\infty$, then

$$\mathbb{E}[(X-\mathbb{E}[X\vert \mathcal{F}])^{2}] + \mathbb{E}[(\mathbb{E}[X\vert \mathcal{F}] - \mathbb{E}[X\vert \mathcal{G}])^{2}] = \mathbb{E}[(X-\mathbb{E}[X\vert \mathcal{G}])^{2}].$$

I'm pretty sure the inequality is a corollary of the previous identity, since the second term on the left side is nonnegative and can be dropped, but I'm stuck going further than that poor argument. I'm also pretty sure that I need to use the trivial $\sigma$-algebra. I would appreciate any ideas.
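Before proving it, the inequality can be sanity-checked numerically. The sketch below uses an assumed concrete example (not from the question itself): $\mathcal{G}=\sigma(Z)$ for a discrete $Z$, so $\mathbb{E}[X\mid\mathcal{G}]$ is just the mean of $X$ on each event $\{Z=k\}$.

```python
import numpy as np

# Hypothetical example: X = Z + noise, with G = sigma(Z) for discrete Z,
# so E[X | G] is the group-wise mean of X over the events {Z = k}.
rng = np.random.default_rng(0)
n = 200_000
Z = rng.integers(0, 5, size=n)            # discrete variable generating G
X = Z.astype(float) + rng.normal(size=n)

cond_mean = np.zeros(n)
for k in range(5):
    mask = Z == k
    cond_mean[mask] = X[mask].mean()      # empirical E[X | Z = k]

lhs = np.mean((X - cond_mean) ** 2)       # E[(X - E[X|G])^2]
rhs = np.mean((X - X.mean()) ** 2)        # E[(X - E[X])^2] = Var(X)
print(lhs, rhs)                           # lhs should be the smaller one
```

Here `lhs` estimates the residual noise variance while `rhs` adds the variance explained by $Z$, so the gap `rhs - lhs` is exactly the second term of the identity above.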

There are 2 solutions below.

Solution 1:

You are not mistaken: $\mathbf{Var}[X] = \mathbf{E}\left[\left(X - \mathbf{E}\left[X\right]\right)^2\right]$; just expand the square to see that both equal $\mathbf{E}\left[X^2\right] - \mathbf{E}\left[X\right]^2$.

The expectation of $X$ (which is in $\mathbf{L}^2$) is the orthogonal projection of $X$ onto the subspace $V_0$ of $\mathbf{L}^2$ consisting of the constant random variables. (Just minimize $c\mapsto \mathbf{E}\left[(X-c)^2\right]$ by setting the derivative to zero.)
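The minimization in parentheses can be sketched numerically (a toy check on simulated data, not part of the original answer): the empirical mean-squared error $c \mapsto \mathbf{E}[(X-c)^2]$ is minimized at the sample mean.

```python
import numpy as np

# Sketch: scan candidate constants c and confirm that the mean squared
# error E[(X - c)^2] is smallest at c = mean(X), i.e. the projection of
# X onto the constants is its expectation.
rng = np.random.default_rng(1)
x = rng.exponential(scale=2.0, size=100_000)

cs = np.linspace(x.mean() - 1, x.mean() + 1, 201)   # grid centered at the mean
mse = [np.mean((x - c) ** 2) for c in cs]
best_c = cs[int(np.argmin(mse))]
print(best_c, x.mean())                             # minimizer is the sample mean
```

Analytically $\mathbf{E}[(X-c)^2] = \mathbf{Var}(X) + (\mathbf{E}[X]-c)^2$, which makes the minimizer obvious.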

The conditional expectation of $X$ w.r.t. $\mathscr{G}$ is the orthogonal projection of $X$ onto the (closed) subspace $\mathbf{L}^2 (\mathscr{G})$ of $\mathbf{L}^2$ consisting of the $\mathscr{G}$-measurable random variables.

Since $V_0$ is obviously a subspace of $\mathbf{L}^2 (\mathscr{G})$, and since the distance from $X$ to its orthogonal projection on a closed subspace $V$ is the infimum of $\|X - Y\|_{\mathbf{L}^2}$ over $Y \in V$, you get your inequality: the bigger the subspace, the smaller the infimum.

Remark. The expectation of $X$ is also the conditional expectation of $X$ w.r.t. the trivial sub-$\sigma$-algebra $\{\varnothing, \Omega\}$, where $\Omega$ is the underlying set of your probability space.

Solution 2:

Let $Y=E(X|\mathcal G)$. Then
$$E[(X-EX)^{2}]=E[((X-Y)+(Y-EX))^{2}].$$
Expand this as
$$E[(X-Y)^{2}]+E[(Y-EX)^{2}]+2E[(X-Y)(Y-EX)] \geq E[(X-Y)^{2}]+2E[(X-Y)(Y-EX)],$$
where the inequality holds because $E[(Y-EX)^{2}]\geq 0$. If we show that the last term is $0$, we get the required inequality. Since $Y-EX$ is measurable w.r.t. $\mathcal G$, we can write
$$E[(X-Y)(Y-EX)]=E\big[E[(X-Y)(Y-EX)\,|\,\mathcal G]\big] =E\big[(Y-EX)\,E[X-Y\,|\,\mathcal G]\big],$$
and $E[X-Y\,|\,\mathcal G]=E(X|\mathcal G)-Y=0$.
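The vanishing cross term can also be checked numerically. A sketch, again using an assumed discrete example $\mathcal{G}=\sigma(Z)$ (so the empirical $Y$ is a group-wise mean); with this choice of $Y$ the cross term vanishes even in the sample, up to floating-point error:

```python
import numpy as np

# Sketch: with G = sigma(Z) for discrete Z, check that the cross term
# E[(X - Y)(Y - E[X])] vanishes, where Y = E[X | G].
rng = np.random.default_rng(2)
n = 500_000
Z = rng.integers(0, 3, size=n)
X = 2.0 * Z + rng.normal(size=n)

Y = np.zeros(n)
for k in range(3):
    mask = Z == k
    Y[mask] = X[mask].mean()          # empirical E[X | Z = k]

cross = np.mean((X - Y) * (Y - X.mean()))
print(cross)                          # approximately 0
```

Within each event $\{Z=k\}$, the factor $Y - E[X]$ is constant and the residuals $X - Y$ sum to zero, which is exactly the tower-property argument above in empirical form.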