Convergence of conditional expectations given a sequence of random variables


I am trying to prove the following statement:

Let $(\Omega, \mathcal A, P)$ be a probability space. Let $(Z_n)_{n\in \mathbb N}$ be i.i.d. random variables with $Z_1 \in \mathcal L^1$. Let $\theta \in \mathcal L^1$ be independent from $(Z_n)_{n\in \mathbb N}$ and define $\forall n\in \mathbb N: Y_n := Z_n + \theta$. Then $$\mathbb E[\theta \mid Y_1, Y_2, \dots, Y_n] \longrightarrow \theta$$ $P$-a.s. as $n\to \infty$.

I was trying to apply Lévy's martingale convergence theorem: $$\mathbb E[\theta \mid Y_1, Y_2, \dots, Y_n] \longrightarrow \mathbb E[\theta \mid \mathcal F_{\infty}]$$ where $\mathcal F_\infty $ is the smallest $\sigma$-algebra generated by all $Y_n$. Now it remains to show $\theta = \mathbb E[\theta \mid \mathcal F_\infty]$, but here is where I am stuck. Is this the right direction I am going?
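(Not a proof, but a quick sanity check: in the Gaussian special case $\theta\sim\mathcal N(0,\tau^2)$, $Z_i\sim\mathcal N(0,\sigma^2)$, the conditional expectation has the closed form $\mathbb E[\theta\mid Y_1,\dots,Y_n]=\frac{\tau^2}{\tau^2+\sigma^2/n}\,\bar Y_n$ from the normal-normal model, so the claimed convergence can be simulated directly. The parameter values below are arbitrary illustrative choices.)

```python
import numpy as np

# Gaussian special case (assumed, illustrative parameters):
# theta ~ N(0, tau^2), Z_i ~ N(0, sigma^2) i.i.d., Y_i = Z_i + theta.
rng = np.random.default_rng(0)
tau2, sigma2 = 4.0, 1.0
theta = rng.normal(0.0, np.sqrt(tau2))
n = 100_000
Y = theta + rng.normal(0.0, np.sqrt(sigma2), size=n)

# Closed-form conditional expectation in the normal-normal model:
# E[theta | Y_1..m] = tau^2 / (tau^2 + sigma^2/m) * mean(Y_1..m)
for m in (10, 1_000, n):
    post_mean = tau2 / (tau2 + sigma2 / m) * Y[:m].mean()
    print(m, post_mean, theta)  # post_mean approaches theta as m grows
```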

There are 2 solutions below.

BEST ANSWER

A first remark is that, without loss of generality, we can assume the noise is centered. (Notation: in this answer, $Y_i$ denotes the noise, i.e. the $Z_i$ of the question, so the observations are $\theta+Y_i$.)

It seems that we do not need $\theta$ to be independent of $\left(Y_i\right)_{i\geqslant 1}$. Moreover, we only need that $N^{-1}\mathbb E\left\lvert \sum_{i=1}^NY_i\right\rvert\to 0$, which would, for example, hold if $\left(Y_i\right)_{i\geqslant 1}$ is stationary and centered.

Fix an integer $N$ and write $$ \theta=\frac 1N\sum_{i=1}^N\left(\theta+Y_i\right)-\frac 1N\sum_{i=1}^NY_i=:\theta_N-\frac 1N\sum_{i=1}^NY_i. $$ Since the random variable $\theta_N$ is $\mathcal F_\infty$-measurable, it follows that $$ \mathbb E\left[\theta\mid\mathcal F_\infty\right]=\theta_N-\mathbb E\left[\frac 1N\sum_{i=1}^NY_i\mid \mathcal F_\infty\right]=\theta +\frac 1N\sum_{i=1}^NY_i-\mathbb E\left[\frac 1N\sum_{i=1}^NY_i\mid\mathcal F_\infty\right], $$ hence, by the triangle inequality and Jensen's inequality for conditional expectations, $$ \mathbb E\left\lvert\mathbb E\left[\theta\mid\mathcal F_\infty\right]-\theta\right\rvert\leqslant\frac 2N\mathbb E\left\lvert \sum_{i=1}^NY_i\right\rvert. $$ Now we are in a position to conclude, using the fact that $N^{-1}\mathbb E\left\lvert \sum_{i=1}^NY_i\right\rvert\to 0$.
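The key condition $N^{-1}\mathbb E\left\lvert \sum_{i=1}^NY_i\right\rvert\to 0$ can also be checked numerically. The Monte Carlo sketch below uses standard normal noise as an assumed example, for which the exact value is $\sqrt{2/(\pi N)}$.

```python
import numpy as np

# Monte Carlo estimate of N^{-1} E|sum_{i<=N} Y_i| for centered i.i.d. noise.
# Standard normal noise is an assumed example; the exact value is sqrt(2/(pi*N)),
# which tends to 0 as N grows.
rng = np.random.default_rng(1)
reps = 2000
est = {}
for N in (10, 100, 1000):
    S = rng.normal(size=(reps, N)).sum(axis=1)  # reps independent copies of sum Y_i
    est[N] = np.abs(S).mean() / N
    print(N, est[N])
```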

In the context of the question, this can be shown by truncation: for $R>0$, let $Y_{i,R}:=Y_i\mathbf 1\left\{\left\lvert Y_i\right\rvert\leqslant R\right\}-\mathbb E\left[Y_i\mathbf 1\left\{\left\lvert Y_i\right\rvert\leqslant R\right\}\right]$ and $Y'_{i,R}:=Y_i-Y_{i,R}$. Control $\mathbb E\left\lvert \sum_{i=1}^NY_{i,R}\right\rvert$ by the $L_2$-norm and use independence to exploit orthogonality: the variance of the sum is the sum of the variances, so this term is $O(\sqrt N)$. Moreover, since the $Y_i$ are centered, $\mathbb E\left\lvert \sum_{i=1}^NY'_{i,R}\right\rvert\leqslant 2N\mathbb E\left[\left\lvert Y_1\right\rvert\mathbf 1\left\{\left\lvert Y_1\right\rvert\gt R\right\}\right]$, and $\mathbb E\left[\left\lvert Y_1\right\rvert\mathbf 1\left\{\left\lvert Y_1\right\rvert\gt R\right\}\right]\to 0$ as $R\to\infty$ by dominated convergence.

ANSWER

I am actually not sure if I need the independence of $\theta$, but I have a different solution. By the strong law of large numbers, $$ \frac{1}{n}S_n = \frac{1}{n}\sum_{k=1}^n Y_k = \frac{1}{n}\sum_{k=1}^n Z_k + \theta\longrightarrow \mathbb E[Z_1] + \theta$$ or equivalently $$\frac{1}{n}S_n - \mathbb E[Z_1] \longrightarrow \theta$$

almost surely. But each $S_n$ is $\mathcal F_\infty$-measurable, so $\theta$, being an a.s. limit of $\mathcal F_\infty$-measurable random variables, is itself $\mathcal F_\infty$-measurable (at least we find a version of $\theta$ that is, after modifying it on a set of measure $0$). But that means $\theta = \mathbb E[\theta \mid \mathcal F_\infty]$ almost surely.
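As a numerical illustration of this SLLN argument (with assumed, illustrative distributions: $Z_i$ exponential with mean $1$, so $\mathbb E[Z_1]=1$, and $\theta$ standard normal):

```python
import numpy as np

# Illustration of the SLLN argument: (1/n) S_n - E[Z_1] -> theta a.s.
# Assumed example distributions: theta ~ N(0,1), Z_i ~ Exponential(1), E[Z_1] = 1.
rng = np.random.default_rng(2)
theta = rng.normal()
n = 200_000
Z = rng.exponential(1.0, size=n)
Y = Z + theta                                   # observations Y_i = Z_i + theta
running_mean = np.cumsum(Y) / np.arange(1, n + 1)
recovered = running_mean - 1.0                  # subtract E[Z_1] = 1
for m in (100, 10_000, n):
    print(m, recovered[m - 1], theta)           # recovered[m-1] approaches theta
```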