Conditional expectations and random vectors.


Let $(\Omega, \sigma, P)$ be a probability space and $Y$ a random variable on it with $E|Y|<\infty$. Let $X_1, X_2$ be random vectors such that $\sigma(Y,X_1)$ is independent of $\sigma(X_2)$. The problem is to prove that $$E(Y|X_1,X_2)=E(Y|X_1) \hspace{0.5cm} \text{almost surely}$$ It is enough to show that $$\int_D E(Y|X_1)\,dP=\int_D Y\,dP \hspace{0.5cm} \forall D \in \sigma(X_1,X_2)$$ This equality is clearly true for all $D \in \sigma(X_1)$, but what about the remaining sets in $\sigma(X_1,X_2)$?

Best answer:

The next step is to show the identity when $D$ is of the form $D_1\cap D_2$, with $D_i\in\sigma(X_i)$; this can be done using independence. Then let $\mathcal P$ be the collection of all such sets. By construction $\mathcal P$ is stable under finite intersections (a $\pi$-system), it generates $\sigma(X_1,X_2)$ (take $D_2=\Omega$ or $D_1=\Omega$ to recover the generators), and the identity holds on it, so Dynkin's $\pi$-$\lambda$ theorem extends the identity to all of $\sigma(X_1,X_2)$.
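One way to carry out the product-set step (a sketch; each equality is justified on the right): since $\mathbf 1_{D_1} E(Y\mid X_1)$ and $\mathbf 1_{D_1} Y$ are measurable with respect to $\sigma(Y,X_1)$, both are independent of $\mathbf 1_{D_2}$, so

```latex
\begin{align*}
\int_{D_1\cap D_2} E(Y\mid X_1)\,dP
  &= E\!\left[\mathbf 1_{D_1}\, E(Y\mid X_1)\,\mathbf 1_{D_2}\right] \\
  &= E\!\left[\mathbf 1_{D_1}\, E(Y\mid X_1)\right] P(D_2)
     && \text{(independence of } \sigma(X_1) \subset \sigma(Y,X_1) \text{ and } \sigma(X_2)\text{)} \\
  &= E\!\left[\mathbf 1_{D_1}\, Y\right] P(D_2)
     && \text{(defining property of } E(Y\mid X_1),\ D_1\in\sigma(X_1)\text{)} \\
  &= E\!\left[\mathbf 1_{D_1}\, Y\,\mathbf 1_{D_2}\right]
     && \text{(independence of } \sigma(Y,X_1) \text{ and } \sigma(X_2)\text{)} \\
  &= \int_{D_1\cap D_2} Y\,dP.
\end{align*}
```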