A Law of Large Numbers for Conditional Expectations


Let $(\Omega,\mathcal F,P)$ be a probability space, and suppose that we are given, for each $\gamma \in[0,1]$, an i.i.d. sequence of real integrable random variables $\{X_n(\gamma)\}_{n=1}^\infty$. Let $Y$ be a random variable taking values in $[0,1]$ which is independent of $X_n(\gamma)$ for all $n,\gamma$.

How can I show that

$$\frac{1}{n}\sum_{k=1}^n X_k(Y)\to E[X_1(Y)\mid \sigma(Y)] \,\,\text{ as } \,\,n\to \infty$$

in probability?

EDIT: Assume the following additional condition: for all $\delta>0$ we have

$$\sup_{\gamma\in[0,1]} P\bigg(\bigg|\frac{1}{n}\sum_{k=1}^n \Big(X_k(\gamma)-E[X_k(\gamma)] \Big) \bigg|>\delta\bigg)\to 0 \,\,\text{ as } \,\,n\to \infty.$$
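The statement can be sanity-checked numerically. The sketch below uses a hypothetical concrete family (not from the question): $X_k(\gamma) = \gamma + $ standard normal noise, so that $E[X_1(\gamma)]=\gamma$ and $E[X_1(Y)\mid\sigma(Y)]=Y$, with $Y$ uniform on $[0,1]$. The running average of $X_k(Y)$ should then be close to $Y$ itself.

```python
import random

random.seed(0)

def running_average_demo(n=200_000):
    # Hypothetical family: X_k(gamma) = gamma + N(0,1) noise, so that
    # E[X_1(gamma)] = gamma and hence E[X_1(Y) | sigma(Y)] = Y.
    y = random.random()                      # Y ~ Uniform[0,1], drawn once
    total = 0.0
    for _ in range(n):
        total += y + random.gauss(0.0, 1.0)  # X_k(Y): conditionally i.i.d. given Y
    return total / n, y

avg, y = running_average_demo()
# The gap |avg - y| should be of order n^{-1/2} by the CLT.
print(abs(avg - y))
```

Conditioning on $Y$ here simply means freezing the single draw `y`; the averaging then only runs over the noise.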


BEST ANSWER

If $\{Z_n\}$ are i.i.d. and $X_n=f(Z_n,Y)$ for some Borel function $f$ and a random variable $Y$ which is independent of $\{Z_n\}$, then $\{X_n\}$ are conditionally i.i.d. given $\sigma(Y)$, and
$$ \frac{1}{n}\sum_{i=1}^n X_i\to \varphi(Y) \quad\text{a.s.}, $$ where $\varphi(y):=\mathsf{E}[f(Z_1,y)]$ (assuming that the latter expectation exists). (See, e.g., Theorem 4.2 in this paper.)
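The identification of the a.s. limit with the conditional expectation, implicit above, can be spelled out. Since $Z_1$ is independent of $Y$, the standard independence lemma lets one compute the conditional expectation by freezing $Y$ (a sketch, assuming the expectations exist):

```latex
\mathsf{E}[X_1 \mid \sigma(Y)]
  = \mathsf{E}[f(Z_1,Y) \mid \sigma(Y)]
  = \varphi(Y),
  \qquad \varphi(y) := \mathsf{E}[f(Z_1,y)],
```

so the a.s. limit $\varphi(Y)$ is exactly $\mathsf{E}[X_1\mid\sigma(Y)]$.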


As for the updated version of the question, since $\{Z_n\}$ is independent of $Y$ (any sequence $\{Z_n\}$ works here), for any nonnegative Borel function $g_n$, \begin{align} \mathsf{E}[g_n(Z_1,\ldots,Z_n,Y)\mid Y]=\varphi_n(Y), \end{align} where $$ \varphi_n(y)=\mathsf{E}[g_n(Z_1,\ldots,Z_n,y)]\le \sup_{y\in [0,1]}\mathsf{E}[g_n(Z_1,\ldots,Z_n,y)]. $$
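One way to finish from here, under the uniform condition added in the edit: apply the display above with the indicator (a sketch; $\varphi$ is as in the first part of the answer)

```latex
g_n(z_1,\dots,z_n,y)
  = \mathbf 1\Big\{\Big|\tfrac1n\sum_{k=1}^n \big(f(z_k,y)-\varphi(y)\big)\Big| > \delta\Big\},
```

which gives, for every $\delta>0$,

```latex
\mathsf P\Big(\Big|\tfrac1n\sum_{k=1}^n X_k(Y)-\varphi(Y)\Big|>\delta\Big)
  = \mathsf E[\varphi_n(Y)]
  \le \sup_{\gamma\in[0,1]}
    \mathsf P\Big(\Big|\tfrac1n\sum_{k=1}^n\big(X_k(\gamma)-\mathsf E[X_k(\gamma)]\big)\Big|>\delta\Big)
  \to 0,
```

i.e. convergence in probability to $\varphi(Y)=\mathsf E[X_1(Y)\mid\sigma(Y)]$.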

SECOND ANSWER

Because of the assumed independence of $\{X_n(\gamma): n\ge 1, \gamma\in[0,1]\}$ and $Y$, you can assume that $(\Omega,\mathcal F,P)$ is the product of $(\Omega_1,\mathcal F_1,P_1)$ and $(\Omega_2,\mathcal F_2,P_2)$, with the $X_n(\gamma)$ depending only on $\omega_1\in\Omega_1$ and $Y$ depending only on $\omega_2\in\Omega_2$. By the standard SLLN, for each $\gamma\in[0,1]$, $$ \lim_n n^{-1}\sum_{i=1}^n X_i(\omega_1,\gamma)=g(\gamma):=E[X_1(\gamma)], $$ for $P_1$-a.e. $\omega_1$. Consequently, by Fubini's theorem, $$ \lim_n n^{-1}\sum_{i=1}^n X_i(\omega_1,Y(\omega_2))=g(Y(\omega_2)), $$ for $P_1\otimes P_2$-a.e. $(\omega_1,\omega_2)$, provided $\gamma\mapsto E[X_1(\gamma)]$ is measurable. If $(\omega_1,\omega_2)\mapsto X_1(Y(\omega_2))$ is $P_1\otimes P_2$-integrable, then the conditional expectation $E[X_1(Y)\mid\sigma(Y)]$ exists and coincides with $g(Y)$.