$E[|X|\ln(1+|X|)]<+\infty \implies \lim_k E[X|\mathcal{G}_k]=0$ a.s.


$X$ is a random variable with $E[X]=0$ and $E[|X|\ln(1+|X|)]<+\infty.$ If $(\mathcal{G}_k)_k$ is a sequence of independent $\sigma$-algebras of events, prove that $E[X|\mathcal{G}_k]$ converges a.s. to $0$.

Any suggestions on how to relate the result to the assumptions?

Best Answer

[Gabriel Romon has made a perceptive comment.]

Define $\mathcal F_n:=\sigma(\mathcal G_n,\mathcal G_{n+1},\ldots)$ and $X_n:=E[X\mid\mathcal F_n]$. Notice that $\mathcal T:=\cap_n\mathcal F_n$ is trivial by the Kolmogorov $0$-$1$ law. Moreover, $(X_n)$ is a reverse martingale with respect to the (decreasing) filtration $(\mathcal F_n)$. Therefore $\lim_n X_n$ exists a.s. and in $L^1$. The limit is $\mathcal T$-measurable, hence constant a.s.; that constant is $0$ because $E[X_n]=E[X]=0$.
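The reverse-martingale property invoked here can be checked in one line: since $\mathcal F_{n+1}\subset\mathcal F_n$, the tower property gives

```latex
E[X_n \mid \mathcal F_{n+1}]
  = E\bigl[\,E[X \mid \mathcal F_n] \bigm| \mathcal F_{n+1}\bigr]
  = E[X \mid \mathcal F_{n+1}]
  = X_{n+1}.
```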

Define $M:=\sup_{k\ge 1}|X_k|$. By the $L\log L$ form of Doob's maximal inequality (applied, for each $N$, to the martingale $X_N,X_{N-1},\ldots,X_1$ with respect to the increasing filtration $\mathcal F_N\subset\cdots\subset\mathcal F_1$, then letting $N\to\infty$), $$ E[M]\le C\bigl(1+E[|X_1|\ln(1+|X_1|)]\bigr). $$ But $\phi: x\mapsto x\ln(1+x)$ is nondecreasing and convex for $x\ge 0$, so by the conditional Jensen inequality $$ \phi(|X_1|)\le E[\phi(|X|)\mid\mathcal F_1], $$ and taking expectations, $$ E[|X_1|\ln(1+|X_1|)]\le E[|X|\ln(1+|X|)]<\infty. $$
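Spelled out, the Jensen step combines the conditional triangle inequality with the monotonicity and convexity of $\phi$ on $[0,\infty)$:

```latex
|X_1| = \bigl|E[X \mid \mathcal F_1]\bigr| \le E[|X| \mid \mathcal F_1],
\qquad\text{hence}\qquad
\phi(|X_1|) \le \phi\bigl(E[|X| \mid \mathcal F_1]\bigr)
            \le E[\phi(|X|) \mid \mathcal F_1].
```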

Finally, define $\mathcal H_n:=\sigma(\mathcal G_1,\mathcal G_2,\ldots,\mathcal G_n)$ and $Y_n:=E[X\mid\mathcal G_n]$. Using the independence of the $\mathcal G_k$'s one checks that $$ E[X_n\mid\mathcal H_n]=Y_n,\qquad n\ge 1. $$ Indeed, $X_n$ is $\mathcal F_n$-measurable, $\sigma(X_n,\mathcal G_n)\subset\mathcal F_n$ is independent of $\sigma(\mathcal G_1,\ldots,\mathcal G_{n-1})$, and $\mathcal G_n\subset\mathcal F_n$, so $$ E[X_n\mid\mathcal H_n]=E[X_n\mid\mathcal G_n]=E\bigl[E[X\mid\mathcal F_n]\bigm|\mathcal G_n\bigr]=E[X\mid\mathcal G_n]=Y_n. $$ By Hunt's lemma (the sequence $(X_n)$ converges a.s. to $0$ and $|X_n|\le M\in L^1$ for each $n$) we therefore have $$ \lim_n Y_n=\lim_n E[X_n\mid\mathcal H_n]=0\qquad \text{a.s.} $$
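As a sanity check (not part of the proof), here is a simulation of a special case where $E[X\mid\mathcal G_k]$ is computable in closed form: take $\xi_1,\xi_2,\ldots$ i.i.d. standard normal, $\mathcal G_k=\sigma(\xi_k)$, and $X=\sum_k a_k\xi_k$ with $a_k=1/k$ (truncated at a finite $K$ for the simulation; the coefficients $a_k$ and the truncation are my choices, not part of the problem). By independence, $E[X\mid\mathcal G_k]=a_k\xi_k$, which visibly tends to $0$:

```python
# Illustration only: a special case of the theorem where the conditional
# expectations E[X | G_k] have the closed form a_k * xi_k.
import random

random.seed(0)

K = 1000
xi = [random.gauss(0.0, 1.0) for _ in range(K)]   # xi_k i.i.d. N(0,1)
a = [1.0 / (k + 1) for k in range(K)]             # a_k = 1/k, so E[X] = 0

# E[X | G_k] = a_k * xi_k in this example (X = sum_k a_k * xi_k).
cond_exp = [a[k] * xi[k] for k in range(K)]

def tail_sup(k):
    """sup_{j >= k} |E[X | G_j]| -- should shrink as k grows."""
    return max(abs(v) for v in cond_exp[k:])

for k in (0, 10, 100, 900):
    print(k, tail_sup(k))
```

The printed tail suprema decrease toward $0$, mirroring the a.s. convergence $E[X\mid\mathcal G_k]\to 0$.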