Conditional expectation of iid random variables measure theory


Two fair coins are tossed independently, with outcomes $X_1$ and $X_2$; $X_1$ is $\sigma(F)$-measurable and $X_2$ is $\sigma(G)$-measurable. We are asked to show $E(X_2\mid X_1) = E(X_2)$. I'm trying this via the $\omega$-wise definition of conditional expectation and am getting stuck somewhere, as shown below:

Given: $E(X_i) = \frac{1}{2}$

The $\sigma$-algebra $F$ is generated by a partition $\{\Lambda_n\}$, so we can write the conditional expectation as:

$E(X_2\mid X_1) = \sum_n E(X_2\mid \Lambda_n)\mathbf{1}_{\Lambda_n}$, where the $\Lambda_n$ are the events $\{X_1 = 1\}$ and $\{X_1 = 0\}$.

Thus I expand the summation as follows:

$\int_{\{X_1 = 1\}} X_2\,dP\cdot\frac{1}{P(X_1 = 1)} + \int_{\{X_1 = 0\}} X_2\,dP\cdot\frac{1}{P(X_1 = 0)}$

My calculation of the integrals, which are equal, is as follows:

$\int_{\{X_1 = 1\}} X_2\,dP\cdot\frac{1}{P(X_1 = 1)} = 1\cdot P(X_2 = 1,\, X_1 = 1)\cdot\frac{1}{P(X_1 = 1)} = \frac{1/4}{1/2} = \frac{1}{2}$.

Thus, summing both integrals, I arrive at $1$, but that is incorrect since clearly $E(X_2) = 1/2$, and I am unclear where the error is in the above $\omega$-wise argument.
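To make the discrepancy concrete, here is a minimal enumeration over the four equally likely outcomes (a Python sketch of my own; the helper names are not standard) that reproduces both normalized integrals and $E(X_2)$:

```python
from itertools import product

# Sample space: four equally likely outcomes w = (x1, x2), each with probability 1/4.
omega = list(product([0, 1], repeat=2))
P = 1 / len(omega)  # 0.25

def integral_over(event, f):
    """Integrate f over the event {w : event(w)} with respect to P."""
    return sum(f(w) * P for w in omega if event(w))

p_x1_1 = integral_over(lambda w: w[0] == 1, lambda w: 1)  # P(X1 = 1) = 1/2
p_x1_0 = integral_over(lambda w: w[0] == 0, lambda w: 1)  # P(X1 = 0) = 1/2

# The two normalized integrals from above: (1 / P(X1 = j)) * int_{X1 = j} X2 dP
term1 = integral_over(lambda w: w[0] == 1, lambda w: w[1]) / p_x1_1  # 0.5
term0 = integral_over(lambda w: w[0] == 0, lambda w: w[1]) / p_x1_0  # 0.5

e_x2 = integral_over(lambda w: True, lambda w: w[1])  # E[X2] = 0.5

print(term1, term0, term1 + term0, e_x2)  # 0.5 0.5 1.0 0.5
```

Each normalized integral is indeed $1/2$, so their plain sum is $1$ while $E(X_2) = 1/2$, exactly the mismatch described above.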



On BEST ANSWER

If you are using the $L^1$ framework of conditional expectation, given a probability space $(\Omega,\mathscr{F},P)$ and a sub-$\sigma$-algebra $\mathscr{G}\subseteq \mathscr{F}$, to prove that a rv $Y$ satisfies $E[X|\mathscr{G}]=Y$ $P$-a.s. you have to show that the three defining properties of conditional expectation hold: $(1)$ $Y$ is $\mathscr{G}/\mathcal{B}(\mathbb{R})$-measurable; $(2)$ $E[|Y|]< \infty$; $(3)$ $E[Y\mathbf{1}_G]=E[X\mathbf{1}_G]$ for all $G \in \mathscr{G}$.

In this case we have $\mathscr{G}=\sigma(X_1)$ and $X_1 \perp X_2$ by assumption. We have $Y=E[X_2]=1/2$. Since $Y$ is constant, it is measurable with respect to any $\sigma$-algebra, and it is also integrable. Now consider that $X_1$ is a Bernoulli trial, so $$\sigma(X_1)=\{\emptyset,\{X_1=1\},\{X_1=0\},\Omega\}$$ Then for $j \in \{0,1\}$ $$E[Y\mathbf{1}_{\{X_1=j\}}]=(1/2)P(X_1=j)$$ $$E[X_2\mathbf{1}_{\{X_1=j\}}]\stackrel{\textrm{Indep.}}{=}E[X_2]P(X_1=j)=(1/2)P(X_1=j)$$ so $Y=(1/2)=E[X_2]$ is a version of $E[X_2|X_1]$.
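Property $(3)$ can also be checked by brute force over the four-point sample space; a minimal Python sketch (the helper names are mine, not standard):

```python
from itertools import product

# Sample space: four equally likely outcomes w = (x1, x2), each with probability 1/4.
omega = list(product([0, 1], repeat=2))
P = 1 / len(omega)

def E(f, event=lambda w: True):
    """Compute E[f * 1_event] by summing over the finite sample space."""
    return sum(f(w) * P for w in omega if event(w))

Y = 0.5  # the candidate version of E[X2 | X1]

# sigma(X1) = {emptyset, {X1 = 1}, {X1 = 0}, Omega}
sigma_X1 = [
    lambda w: False,
    lambda w: w[0] == 1,
    lambda w: w[0] == 0,
    lambda w: True,
]

# Check E[Y 1_G] = E[X2 1_G] for every G in sigma(X1).
checks = [abs(E(lambda w: Y, G) - E(lambda w: w[1], G)) < 1e-12 for G in sigma_X1]
print(all(checks))  # True
```

Since property $(3)$ holds on every set of $\sigma(X_1)$, and the constant $Y = 1/2$ is trivially measurable and integrable, it is a version of $E[X_2|X_1]$.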


If we use the usual conditional expectation of a simple rv given another simple rv, we get $$E[X_2|\sigma(X_1)](\omega)=E[X_2|X_1=1]\mathbf{1}_{\{X_1=1\}}(\omega)+E[X_2|X_1=0]\mathbf{1}_{\{X_1=0\}}(\omega)$$ (this can be proved using the method above for general simple functions). Notice that $$E[X_2|X_1=j]=\sum_{k=0}^1kP(X_2=k|X_1=j)=\sum_{k=0}^1kP(X_2=k)=\frac{1}{2}$$ so $E[X_2|X_1=j]=E[X_2]$ $P$-a.s. and this gives us $$\begin{aligned}E[X_2|\sigma(X_1)](\omega)&=E[X_2]\mathbf{1}_{\{X_1=1\}}(\omega)+E[X_2]\mathbf{1}_{\{X_1=0\}}(\omega)\\ &=(1/2)(\mathbf{1}_{\{X_1=1\}}(\omega)+\mathbf{1}_{\{X_1=0\}}(\omega))=(1/2)\end{aligned}$$
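The same pointwise computation of $E[X_2|X_1=j]$ can be enumerated directly; a Python sketch (hypothetical helper names):

```python
from itertools import product

# Sample space: four equally likely outcomes w = (x1, x2), each with probability 1/4.
omega = list(product([0, 1], repeat=2))
P = 1 / len(omega)

def prob(event):
    """P(event) over the finite sample space."""
    return sum(P for w in omega if event(w))

# E[X2 | X1 = j] = sum_k k * P(X2 = k, X1 = j) / P(X1 = j)
cond = {}
for j in (0, 1):
    num = sum(k * prob(lambda w: w[1] == k and w[0] == j) for k in (0, 1))
    cond[j] = num / prob(lambda w: w[0] == j)

print(cond)  # {0: 0.5, 1: 0.5}
```

Both conditional expectations equal $1/2 = E[X_2]$, so the indicator decomposition above collapses to the constant $1/2$ on all of $\Omega$.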