Strong law of large numbers for function of random vector: can we apply it for a component only?


Consider

  • i.i.d. random variables $\{X_1,..., X_n\}$ with well-defined first moment

  • i.i.d. random variables $\{Y_1,..., Y_n\}$ with well-defined first moment

By the strong law of large numbers: $$ \frac{1}{n}\sum_{i=1}^n Y_i \rightarrow_{a.s.} E(Y_i) \text{ }\text{ as $n\rightarrow \infty$} $$
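A quick numerical illustration of this convergence (a sketch in Python with NumPy; the uniform distribution and the sample size are my own choices, not part of the question):

```python
import numpy as np

rng = np.random.default_rng(0)

# Draw i.i.d. samples Y_1, ..., Y_n from a distribution with E(Y_i) = 0.5.
n = 100_000
Y = rng.uniform(0.0, 1.0, size=n)

# Running sample means (1/n) * sum_{i=1}^n Y_i for increasing n.
running_mean = np.cumsum(Y) / np.arange(1, n + 1)

print(running_mean[-1])  # close to E(Y_i) = 0.5
```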


Consider these three objects for any function $g: \mathbb{R}^2\rightarrow \mathbb{R}$ (take $Y_i$ discrete with support $\mathcal{Y}$ for simplicity)

1) for a given realisation $x$ of $X_k$, $E(g(X_k, Y_i)| X_k=x)\equiv \sum_{y\in \mathcal{Y}} g(x, y)P(Y_i=y|X_k=x)$ which is a scalar

2) $E(g(X_k, Y_i)| X_k)\equiv \sum_{y\in \mathcal{Y}} g(X_k, y)P(Y_i=y|X_k)$ which is a random variable because $g(X_k,y)$ and $P(Y_i=y|X_k)$ are both functions of the random variable $X_k$

3) $F(X_k)\equiv\sum_{y\in \mathcal{Y}}g(X_k, y)\mathbb{P}(Y_i=y)$ which is a random variable because $g(X_k, y)$ is a function of $X_k$.

When $X_k\perp Y_i$, then $(2)=(3)$.
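To make the three objects concrete, here is a small numerical sketch (the support, the probabilities, and the particular $g$ are all hypothetical choices). Under independence, object (2) evaluated at a realisation $x$ agrees with $F(x)$ from object (3), and object (1) is that common scalar value:

```python
import numpy as np

# Hypothetical setup: Y_i uniform on the support {0, 1, 2}, independent of X_k.
support = np.array([0.0, 1.0, 2.0])
p_y = np.array([1/3, 1/3, 1/3])          # P(Y_i = y)

def g(x, y):
    return x * y + y**2                   # an arbitrary g: R^2 -> R

# (1) scalar: E(g(X_k, Y_i) | X_k = x) for a fixed realisation x.
x_fixed = 1.5
obj1 = np.sum(g(x_fixed, support) * p_y)

# (3) as a function of x: F(x) = sum_y g(x, y) P(Y_i = y).
def F(x):
    return np.sum(g(x, support) * p_y)

# Under independence, (2) evaluated at X_k = x coincides with (3) at the same x.
assert np.isclose(obj1, F(x_fixed))
print(obj1)                               # equals 9.5/3
```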

Question: 1) Is it true that $\forall k=1,...,n$ $$ \frac{1}{n}\sum_{i=1}^n g(X_k, Y_i) \rightarrow_{a.s.} F(X_k) \text{ }\text{ as $n\rightarrow \infty$} $$ If yes, under which conditions?

2) Is it true that $\forall k=1,...,n$ $$ \frac{1}{n}\sum_{i=1}^n g(X_k, Y_i) \rightarrow_{a.s.} E(g(X_k, Y_i)|X_k) \text{ }\text{ as $n\rightarrow \infty$} $$ If yes, under which conditions?


EDIT: This question here is close to mine and also includes an answer; however, it treats only the special case $g(X_k, Y_i)=X_k Y_i$.


There are 2 answers below.

BEST ANSWER

Your first question is the deeper one. Here I show the answer is "yes" under some fairly general conditions, though I expect it remains "yes" under conditions more general than those I give.

Your question boils down to this: Let $\{Y_i\}_{i=1}^{\infty}$ be i.i.d. and let $X$ be a random variable that possibly depends on $\{Y_i\}_{i=1}^{\infty}$. Define $f:\mathbb{R}\rightarrow\mathbb{R}$ by: $$ f(x) = E[g(x,Y_1)]$$ and assume $f(x)$ is finite for all $x \in \mathbb{R}$. We want to know if the following is true: $$ \lim_{n\rightarrow\infty} \frac{1}{n} \sum_{i=1}^ng(X, Y_i) = f(X) \quad \mbox{with prob 1} $$

Why the question is interesting

The random variable $X$ might carry information about one (or even all) of the $Y_i$ variables. For example, suppose the $\{Y_i\}_{i=1}^{\infty}$ are i.i.d. equally likely binary digits, and $X = \sum_{i=1}^{\infty} Y_i 2^{-i}$. Then $X$ is uniform over $[0,1]$, and from the binary expansion of $X$ we can reconstruct each $Y_i$ (assuming the probability-0 event of $\{Y_i\}$ having an infinite tail of 1's does not occur).
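This reconstruction can be checked numerically. Below is a sketch using a truncated expansion of $m$ bits (the truncation is only so the sum is finite; the choice $m=20$ keeps everything exact in double precision):

```python
import numpy as np

rng = np.random.default_rng(1)

# Y_i i.i.d. equally likely binary digits; X = sum_i Y_i 2^{-i}.
m = 20                                    # truncate the expansion for illustration
Y = rng.integers(0, 2, size=m)
X = np.sum(Y * 2.0 ** -np.arange(1, m + 1))

# Recover each Y_i by peeling off binary digits of X one at a time.
def bits(x, m):
    out = []
    for _ in range(m):
        x *= 2
        b = int(x)                        # next binary digit of the expansion
        out.append(b)
        x -= b
    return np.array(out)

recovered = bits(X, m)
assert np.array_equal(recovered, Y)       # X determines every Y_i
```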

Proof of "yes" when $X$ takes values in a countable set

Suppose $X$ takes values in a finite or countably infinite set $\mathcal{X}$. For each $x \in \mathcal{X}$ define the event $$ A_x = \left\{ \lim_{n\rightarrow\infty} \frac{1}{n}\sum_{i=1}^n g(x,Y_i) = f(x) \right\} $$ with $f(x)=E[g(x,Y_1)]$ for each $x \in \mathcal{X}$. Note that $P[A_x]=1$ for all $x \in \mathcal{X}$, by the strong law of large numbers applied to the i.i.d. variables $\{g(x,Y_i)\}_{i=1}^{\infty}$. Define $$ A = \left\{ \lim_{n\rightarrow\infty} \frac{1}{n}\sum_{i=1}^n g(X,Y_i) = f(X)\right\}$$ We want to show that $P[A]=1$. We have \begin{align} P[A] &= \sum_{x\in \mathcal{X}}P[A \cap \{X=x\}] \\ &\overset{(a)}{=} \sum_{x \in \mathcal{X}} P[A_x \cap \{X=x\}] \\ &\overset{(b)}{=} \sum_{x \in \mathcal{X}} P[X=x] \\ &= 1 \end{align} where (a) uses the fact that for each $x \in \mathcal{X}$ we have $$A \cap \{X=x\} = A_x \cap \{X=x\}$$ and (b) uses the fact that $P[A_x]=1$, so that $P[A_x\cap \{X=x\}] = P[X=x]$. $\Box$
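The countable-support case can be simulated directly. In the sketch below, the choices of $g$, of Bernoulli $Y_i$, and of $X = Y_1 + 1$ (which depends on the $Y$ sequence, as the proposition allows) are my own illustrations:

```python
import numpy as np

rng = np.random.default_rng(2)

# Y_i i.i.d. Bernoulli(1/2); X = Y_1 + 1 takes values in the countable set
# {1, 2} and depends on the Y sequence.
n = 200_000
Y = rng.integers(0, 2, size=n).astype(float)
X = Y[0] + 1.0

def g(x, y):
    return x * y

# For this g, f(x) = E[g(x, Y_1)] = x * E(Y_1) = x / 2.
f_of_X = X / 2.0

avg = np.mean(g(X, Y))       # (1/n) sum_{i=1}^n g(X, Y_i)
print(avg, f_of_X)           # the sample average approaches f(X)
```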


Yes, some independence is necessary. Take the function $g(x,y)=xy$. If, instead of being independent, $X$ and $Y$ are completely dependent, i.e. $X_i=Y_i$ for all $i$, then on the one hand $$\frac{1}{n}\sum_{i=1}^n g(X_1, Y_i)=X_1\cdot\frac{1}{n}\sum_{i=1}^n Y_i \rightarrow_{a.s.} X_1 E(X_1) \text{ }\text{ as $n\rightarrow \infty$},$$ while on the other hand $$E\big(g(X_1, Y_i)\big)=E(X_1)E(Y_i)=E(X_1)^2 \text{ for } i\geq 2.$$ So the sample average converges a.s. to the random variable $X_1 E(X_1)$, which differs from the constant $E(X_1)^2$ unless $X_1$ is a.s. equal to its mean.
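A simulation of this counterexample (the uniform distribution for the $Y_i$ is my own choice; any non-degenerate distribution would do):

```python
import numpy as np

rng = np.random.default_rng(3)

# Completely dependent case: X_i = Y_i, with g(x, y) = x * y.
n = 200_000
Y = rng.uniform(0.0, 1.0, size=n)   # E(Y_i) = E(X_i) = 0.5
X1 = Y[0]                            # X_1 = Y_1

avg = np.mean(X1 * Y)                # (1/n) sum g(X_1, Y_i) = X_1 * mean(Y)

# The a.s. limit is the random variable X_1 * E(X_1), not the constant E(X_1)^2.
print(avg, X1 * 0.5, 0.5 ** 2)
```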