Limit of a sum of bounded i.i.d. random variables

Let $(V^i_N)_{i\ge 1}$ be, for each $N$, a sequence of i.i.d. real random variables such that $|V^i_N|\le c$ almost surely, for some constant $c$ independent of $N$ and $i$. Suppose that each $V^i_N$ converges in law, as $N\to\infty$, to some random variable $V^i$ with $E[V^i]=m$. Then

$$\lim_{N\to \infty} \frac 1N \sum_{i=1}^N V^i_N = m \quad \text{in law.}$$

It looks like a job for the law of large numbers, but the summands change with $N$: for fixed $N$ the $V^i_N$ are i.i.d., yet their common mean $E[V^i_N]$ depends on $N$, so I don't see how to apply it here.
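
For a quick numerical sanity check, here is a simulation sketch using one hypothetical family that satisfies the hypotheses: $V_N^i \sim \mathrm{Uniform}[-1,\, 1 + 1/N]$, which is bounded by $c = 2$ and converges in law to $\mathrm{Uniform}[-1, 1]$, whose mean is $m = 0$.

```python
import numpy as np

rng = np.random.default_rng(0)

def A_N(N):
    # One sample of A_N = (1/N) * sum_{i=1}^N V_N^i for the hypothetical
    # family V_N^i ~ Uniform[-1, 1 + 1/N]: bounded by c = 2, converging
    # in law (as N -> infinity) to Uniform[-1, 1] with mean m = 0.
    V = rng.uniform(-1.0, 1.0 + 1.0 / N, size=N)
    return V.mean()

for N in [10, 100, 1_000, 10_000, 100_000]:
    print(f"N = {N:>6}:  A_N = {A_N(N):+.4f}")
```

The printed averages drift toward $m = 0$ as $N$ grows, consistent with the claimed convergence in law to the constant $m$.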

Thanks for your answers!

Best answer

Setup

For each $N \in \{1, 2, 3, ...\}$ let $\{V_N^i\}_{i=1}^{\infty}$ be a sequence of i.i.d. random variables. The variables across different sequences (for different $N$) need not be independent nor identically distributed.

Assume there is a finite constant $c$ such that, with probability 1, $$ |V_N^i|\leq c \quad \forall N, i \in \{1, 2, 3, ...\}$$

Suppose that there are additional random variables $\{V^i\}_{i=1}^{\infty}$ such that $E[V^i]=m$ for each $i \in \{1, 2, 3,...\}$ (for some $m \in \mathbb{R}$) and $V_N^i$ converges in distribution to $V^i$ as $N\rightarrow\infty$. That is, for all points $t \in \mathbb{R}$ at which the CDF of $V^i$ is continuous, we have: $$ \lim_{N\rightarrow\infty} P[V_N^i>t] = P[V^i>t]$$

Define \begin{align} m_N &= E[V_N^1] \quad \forall N \in \{1, 2, 3, ...\}\\ A_N &= \frac{1}{N}\sum_{i=1}^N V_N^i\quad \forall N \in \{1, 2, 3, ...\} \end{align}

Observe that $m_N = E[V_N^i]$ for all $i, N \in \{1, 2, 3, ...\}$.

Claim 1 (Convergence of the means):

$$\lim_{N\rightarrow\infty} m_N = m$$

Claim 2 (Convergence in probability):

For all $\epsilon>0$ we have $\lim_{N\rightarrow\infty} P[|A_N-m|\geq \epsilon] = 0$. (Since convergence in probability to a constant implies convergence in law, this gives the claimed result.)

Proof steps:

  • Assume Claim 1 holds. To prove Claim 2, fix $\epsilon>0$, note that $|m_N - m|\leq \epsilon/2$ for all sufficiently large $N$, so that $P[|A_N-m|\geq \epsilon] \leq P[|A_N-m_N|\geq \epsilon/2]$ for those $N$, and apply the Markov/Chebyshev inequality (see the sketch after this list).

  • To prove Claim 1, use that $V_N^i+c \in [0, 2c]$ with probability 1, write $E[V_N^i+c] = \int_0^{\infty} P[V_N^i+c>t]\, dt$, and pass to the limit under the integral (see the sketch after this list).
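
Filling in the two bullets, here is one way the computations can go (a sketch, in the notation above). For Claim 2: once $N$ is large enough that $|m_N - m| \leq \epsilon/2$, the event $\{|A_N - m| \geq \epsilon\}$ is contained in $\{|A_N - m_N| \geq \epsilon/2\}$, and Chebyshev's inequality together with independence and $\mathrm{Var}(V_N^1) \leq E[(V_N^1)^2] \leq c^2$ gives $$ P[|A_N-m_N|\geq \epsilon/2] \leq \frac{\mathrm{Var}(A_N)}{(\epsilon/2)^2} = \frac{\mathrm{Var}(V_N^1)}{N(\epsilon/2)^2} \leq \frac{4c^2}{N\epsilon^2} \rightarrow 0 $$ For Claim 1: since $0 \leq V_N^1 + c \leq 2c$ with probability 1, $$ m_N + c = \int_0^{2c} P[V_N^1+c>t]\, dt \rightarrow \int_0^{2c} P[V^1+c>t]\, dt = m + c $$ by bounded convergence: the integrands lie in $[0,1]$, the interval is finite, and convergence in distribution gives $P[V_N^1+c>t] \rightarrow P[V^1+c>t]$ for all $t$ outside the (at most countable) set of discontinuity points of the CDF of $V^1+c$.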

@Alex:

1) Convergence in law to a constant implies convergence in probability (to that constant), so it suffices to prove the latter.

2) If the rv's are i.i.d., they DO have the same mean, the same variance, and the same distribution, so the SLLN can be applied and your sequence converges in law, in probability, and a.s.

3) Note that, under some conditions, the SLLN also holds for independent but not identically distributed rv's (Kolmogorov's SLLN).
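
Regarding point 1), a short sketch of the standard argument (notation mine: $X_n$ for the sequence, $F_n$ for its CDF, $c_0$ for the constant): the limit CDF is $\mathbf{1}_{[c_0,\infty)}$, so $c_0 \pm \epsilon$ are continuity points of it for every $\epsilon>0$, and $$ P[|X_n - c_0| > \epsilon] \leq F_n(c_0-\epsilon) + \bigl(1 - F_n(c_0+\epsilon)\bigr) \rightarrow 0 + (1-1) = 0 $$ which is exactly convergence in probability.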