Limit of a sequence of conditional expectations


I was reading a proof of the property of conditional expectations that says that if $X_n \uparrow X$ then $\mathbb E[X_n|\mathcal G] \uparrow \mathbb E[X|\mathcal G]$ in a given probability space $(\Omega, \mathcal F, \mathbb P)$, where $\mathcal G \subset \mathcal F$. How can we ensure that $\lim_{n\to\infty} \mathbb E[X_n|\mathcal G]$ exists? I know that we have to use the monotonicity of conditional expectations, but isn't it possible that this limit is infinite?

Best answer:

Fix a probability triple $(\Omega, F, P)$ and let $G$ be a sub-$\sigma$-algebra of $F$.

Claim: If $X$ and $Y$ are random variables with finite expectations that satisfy $X\leq Y$ surely, and if $E[X|G]$ and $E[Y|G]$ are particular versions of the conditional expectations, then $E[X|G]\leq E[Y|G]$ almost surely.

Proof: The random variable $Y-X$ is nonnegative, and by linearity $W=E[Y|G]-E[X|G]$ is a version of the conditional expectation of $Y-X$ given $G$. So $W$ is $G$-measurable and $E[W1_A]=E[(Y-X)1_A]$ for all sets $A \in G$.

Fix $\epsilon>0$ and take $A = \{W\leq-\epsilon\}$. Since $Y-X \geq 0$ surely, we have
\begin{align} 0 &\leq E[(Y-X)1_{\{W\leq-\epsilon\}}]\\ &= E[W1_{\{W\leq -\epsilon\}}]\\ &\leq -\epsilon E[1_{\{W\leq -\epsilon\}}]\\ &= -\epsilon P[W \leq -\epsilon]. \end{align}
Thus $\epsilon P[W\leq -\epsilon] \leq 0$, and so $P[W \leq -\epsilon] = 0$. This holds for all $\epsilon>0$. Observe that
$$\{W\leq -1/n\} \nearrow \{W<0\},$$
so by continuity of probability we have
$$\lim_{n\rightarrow\infty} P[W\leq -1/n] = P[W<0].$$
Since $P[W\leq -1/n]=0$ for all positive integers $n$, we have $P[W<0]=0$. So $W$ is nonnegative almost surely. $\Box$
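To make the claim concrete, here is a minimal sketch on a finite probability space (my own example, not part of the answer): when $G$ is generated by a partition of $\Omega$, a version of $E[X|G]$ is just the average of $X$ over the partition cell containing each outcome, and the cell-wise averages inherit the pointwise ordering $X \leq Y$. The partition and the choice of $X$, $Y$ below are illustrative assumptions.

```python
# Sketch (assumed example): conditional expectation on a finite uniform space,
# with G generated by a two-cell partition. Monotone X <= Y forces the
# cell averages to satisfy E[X|G] <= E[Y|G] at every sample point.
from fractions import Fraction

omega = range(8)                      # sample points, each with probability 1/8
cells = [[0, 1, 2, 3], [4, 5, 6, 7]]  # partition generating G

def cond_exp(X, cells):
    """A version of E[X|G]: average X over the cell containing each omega."""
    out = {}
    for cell in cells:
        avg = Fraction(sum(X[w] for w in cell), len(cell))
        for w in cell:
            out[w] = avg
    return out

X = {w: w for w in omega}          # X(omega) = omega
Y = {w: w + 1 for w in omega}      # Y = X + 1, so X <= Y surely

EX, EY = cond_exp(X, cells), cond_exp(Y, cells)
assert all(EX[w] <= EY[w] for w in omega)   # monotonicity holds pointwise here
```

On a finite space with a partition-generated $G$ the inequality even holds surely; the "almost surely" in the claim matters only because, in general, versions of conditional expectation are defined up to null sets.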


Using the claim we can prove that the result in question holds almost surely (it does not necessarily hold surely). Let $X, X_1, X_2, X_3, \ldots$ be random variables with finite expectations and assume $X_n\nearrow X$. Let $E[X|G], E[X_1|G], E[X_2|G], \ldots$ be particular versions of the corresponding conditional expectations.

Monotonicity means that for all $n \in \{1, 2, 3, \ldots\}$ we have $X_n \leq X_{n+1}\leq X$ surely, and so the above claim implies $E[X_n|G]\leq E[X_{n+1}|G]$ almost surely and $E[X_n|G]\leq E[X|G]$ almost surely. Define $$ A_n = \{\omega \in \Omega: E[X_n|G](\omega) > E[X_{n+1}|G](\omega)\} \cup \{\omega \in \Omega : E[X_n|G](\omega) >E[X|G](\omega)\}$$ Then $P[A_n]=0$ for all $n \in \{1, 2, 3, \ldots\}$, and so $P[\cup_{n=1}^{\infty} A_n]=0$. Define $B=(\cup_{n=1}^{\infty} A_n)^c$. Then $P[B]=1$, and for all $\omega \in B$ the sequence $\{E[X_n|G](\omega)\}_{n=1}^{\infty}$ is nondecreasing and bounded above by $E[X|G](\omega)$. So $\lim_{n\rightarrow\infty}E[X_n|G](\omega)$ exists and is less than or equal to $E[X|G](\omega)$ for all $\omega$ in the probability-1 set $B$.
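The argument above can be watched in action on the same finite space (again my own illustrative setup, not from the answer): take the standard truncations $X_n = \min(X, n)$, so that $X_n \nearrow X$, and check that the cell averages $E[X_n|G]$ form a nondecreasing sequence bounded above by $E[X|G]$, reaching it once the truncation no longer bites.

```python
# Hedged illustration (assumed example): X_n = min(X, n) increases to X,
# and the conditional expectations E[X_n|G] (cell averages) increase to
# E[X|G], staying bounded above by it at every step.
from fractions import Fraction

omega = range(8)
cells = [[0, 1, 2, 3], [4, 5, 6, 7]]

def cond_exp(X, cells):
    """A version of E[X|G]: average X over the cell containing each omega."""
    out = {}
    for cell in cells:
        avg = Fraction(sum(X[w] for w in cell), len(cell))
        for w in cell:
            out[w] = avg
    return out

X = {w: w for w in omega}          # X(omega) = omega
limit = cond_exp(X, cells)         # the claimed limit E[X|G]

prev = {w: Fraction(-1) for w in omega}
for n in range(8):
    Xn = {w: min(w, n) for w in omega}   # truncation: X_n increases to X
    En = cond_exp(Xn, cells)
    # nondecreasing in n and bounded above by E[X|G], as in the proof
    assert all(prev[w] <= En[w] <= limit[w] for w in omega)
    prev = En

assert prev == limit   # at n = 7 the truncation is inactive: X_7 = X
```

Here the monotone bounded sequence argument is visible pointwise on all of $\Omega$; in general it only holds on the probability-1 set $B$ constructed above.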