Variance of a random variable greater than or equal to the sum of variances of two independent conditional expectations


I found the following problem, which seems very simple, but I'm stuck on which ideas to use. The statement is as follows:

Let $X$ be a random variable with finite expectation and let $\mathcal{G}_1$ and $\mathcal{G}_2$ be two $\sigma$-algebras which are independent of each other. Let $X_1=E[X|\mathcal{G}_1]$ and $X_2=E[X|\mathcal{G}_2]$. Show that $Var(X) \geq Var(X_1) + Var(X_2)$. Use an example to show that independence of $\mathcal{G}_1$ and $\mathcal{G}_2$ is crucial.

It's obvious that if $\mathcal{G}_1=\mathcal{G}_2$ and $X$ is $\mathcal{G}_1$-measurable (and not a.s. constant), the inequality fails: then $X_1=X_2=X$, so the right-hand side equals $2\,Var(X)>Var(X)$.
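As a quick sanity check on that counterexample, here is a small exact computation (my own illustration, not part of the original post): take $X$ Bernoulli$(1/2)$ and $\mathcal{G}_1=\mathcal{G}_2=\sigma(X)$, so that $X_1=X_2=X$.

```python
from fractions import Fraction

# Finite probability space: X is Bernoulli(1/2), and G1 = G2 = sigma(X).
# Then X1 = X2 = E[X | sigma(X)] = X, so Var(X1) + Var(X2) = 2 Var(X) > Var(X).
outcomes = [Fraction(0), Fraction(1)]      # values of X
probs = [Fraction(1, 2), Fraction(1, 2)]   # uniform Bernoulli(1/2)

def var(values, probs):
    mean = sum(p * v for v, p in zip(values, probs))
    return sum(p * (v - mean) ** 2 for v, p in zip(values, probs))

var_X = var(outcomes, probs)   # Var(X) = 1/4
rhs = 2 * var_X                # Var(X1) + Var(X2) = 2 Var(X) = 1/2
print(var_X, rhs, var_X >= rhs)  # 1/4 1/2 False
```

So without independence the claimed inequality is violated as soon as $X$ is nondegenerate.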

If I use the law of total variance I can recover one of the terms on the r.h.s., but I'm clueless about how to get from one $\sigma$-algebra to the other and exploit their independence. I also thought about using Jensen to get an inequality, or conditioning w.r.t. $\sigma(\mathcal{G}_1, \mathcal{G}_2)$, but got nowhere.

Any clues? Help is greatly appreciated.

On BEST ANSWER

Here is a solution based on $L_2$ projections. I assume neater solutions are possible.

Write $X = X_1 + (X - X_1)$. Then $E[X - X_1] = 0$ and $E[X_1 (X - X_1) \mid \mathcal{G}_1] = 0$ by the law of iterated expectations, so the two terms are uncorrelated and $$Var(X) = Var(X_1) + Var(X - X_1).$$ Project again and write $$X - X_1 = \big\{E[X\mid\mathcal{G}_2] - E[X_1\mid\mathcal{G}_2]\big\} + \big\{ X - X_1 - \big(E[X\mid\mathcal{G}_2] - E[X_1\mid\mathcal{G}_2]\big) \big\}.$$ Because the terms in braces are again uncorrelated, this implies $$ Var(X) = Var(X_1) + Var\big(X_2 - E[X_1\mid\mathcal{G}_2]\big) + Var\big(X - X_1 - (E[X\mid\mathcal{G}_2] - E[X_1\mid\mathcal{G}_2])\big) . $$ Since $X_1$ is $\mathcal{G}_1$-measurable and $\mathcal{G}_1$ is independent of $\mathcal{G}_2$, the law of iterated expectations gives $E[X_1\mid\mathcal{G}_2] = E[X_1] = E[X]$ a.s., so the second variance on the right-hand side of the display becomes $Var(X_2)$. The inequality now follows because the third term on the right-hand side is nonnegative.
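The decomposition can be checked exactly on a small example (my own illustration, with hypothetical choices of $X$ and the $\sigma$-algebras, not from the answer above): let $Y, Z$ be independent fair bits, $X = Y + Z + YZ$, $\mathcal{G}_1 = \sigma(Y)$, $\mathcal{G}_2 = \sigma(Z)$.

```python
from fractions import Fraction
from itertools import product

# Four equally likely outcomes (y, z); sigma(Y) and sigma(Z) are independent.
omega = list(product([0, 1], repeat=2))
p = Fraction(1, 4)

X = {(y, z): Fraction(y + z + y * z) for y, z in omega}

def expect(f):
    return sum(p * f[w] for w in omega)

def var(f):
    m = expect(f)
    return sum(p * (f[w] - m) ** 2 for w in omega)

# X1 = E[X | sigma(Y)]: average over z for each fixed y (and vice versa for X2).
X1 = {(y, z): sum(X[(y, z2)] for z2 in (0, 1)) / 2 for y, z in omega}
X2 = {(y, z): sum(X[(y2, z)] for y2 in (0, 1)) / 2 for y, z in omega}

lhs = var(X)             # 19/16
rhs = var(X1) + var(X2)  # 9/16 + 9/16 = 9/8
print(lhs, rhs, lhs >= rhs)  # 19/16 9/8 True
```

Here the slack $Var(X) - Var(X_1) - Var(X_2) = 1/16$ is exactly the third (nonnegative) variance term in the display.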