Why doesn't the Law of Large Numbers hold here?


This is from an old homework assignment of mine, which I've since turned in.

Say you have an independent sequence of R.V.s such that $\mathbb{P}(X_n = 2^n) = \frac{1}{2^n} = 1 - \mathbb{P}(X_n = 0)$. Show that $$\mathbb{E}[X_n] = 1$$ and, moreover, that $$\frac{S_N}{N} \to 0$$ almost surely, where $S_N = X_1 + X_2 + \cdots + X_N$. Why doesn't this contradict the strong LLN?

Edit: It seems the variables are not identically distributed. How would I prove this (or is it obvious/trivial)? I was also unable to compute the expectation, so help with that would be appreciated.


3 Answers

Accepted answer:

The variables are not identically distributed, since the distribution differs with $n$: both the probabilities and the possible values depend on $n.$ The expectation follows from the usual formula for discrete random variables: $$E(X_n) = \frac{1}{2^n} \cdot 2^n + \left(1-\frac{1}{2^n}\right) \cdot 0 = 1.$$
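A quick Monte Carlo check of this expectation (a sketch, not part of the proof; `sample_X` is a hypothetical helper that draws one realization of $X_n$):

```python
import random

def sample_X(n: int) -> int:
    # X_n = 2^n with probability 1/2^n, and 0 otherwise
    return 2**n if random.random() < 0.5**n else 0

random.seed(0)
trials = 200_000
for n in (1, 3, 5):
    est = sum(sample_X(n) for _ in range(trials)) / trials
    print(n, est)  # each sample mean should be close to 1
```

The exact identity $\frac{1}{2^n} \cdot 2^n = 1$ is what the estimates fluctuate around; the fluctuations grow with $n$ because $\mathrm{Var}(X_n) = 2^n - 1$.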

Answer:

Proving $S_N/N \to 0$ can be done using the first Borel–Cantelli lemma. Since $\sum_n P(X_n \neq 0) = \sum_n (1/2)^n < \infty$, the lemma implies that with probability one there are only finitely many $n$ for which $X_n \neq 0$. Hence the sequence $X_1, X_2, \dots$ is almost surely eventually $0$, so $S_N$ is eventually constant and $S_N/N \to 0$.

Answer:

The hardest part of the question is showing $$ \frac{S_N}{N} \to 0 $$ almost surely, where $S_N = X_1 + X_2 + \cdots + X_N$. Consider the probability that all samples from $X_K$ onward are $0$, i.e. $X_K = X_{K+1} = X_{K+2} = \cdots = 0$. By independence this is \begin{align*} \left(1 - \frac{1}{2^K}\right) \left(1 - \frac{1}{2^{K+1}}\right) \left(1 - \frac{1}{2^{K+2}}\right) \cdots &= \prod_{n=K}^\infty \left( 1 - \frac{1}{2^n}\right). \end{align*} Now, the infinite product $\displaystyle \prod_{n=1}^\infty \left( 1 - q^n\right)$ converges for all complex $|q| < 1$ to the Euler function $\phi(q)$; for $q = \frac12$ its value is about $0.28878\ldots$ (see OEIS). The numerical value is irrelevant; what matters is that it is positive (not zero), which forces the tail products to tend to $1$: $$ \prod_{n=K}^\infty \left(1 - \frac{1}{2^n}\right) = \phi\!\left(\tfrac12\right) \Big/ \prod_{n=1}^{K-1} \left(1 - \frac{1}{2^n}\right) \to 1 \quad \text{as } K \to \infty. $$ Hence for every $\varepsilon > 0$ there exists a large enough $K$ such that the probability of $X_K = X_{K+1} = \cdots = 0$ is at least $1 - \varepsilon$.
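Both facts about the product are easy to confirm numerically (a sketch; `partial_product` is an illustrative helper that truncates the infinite product, which is harmless since later factors are indistinguishable from $1$ in double precision):

```python
def partial_product(start: int, terms: int = 200) -> float:
    # Truncation of prod_{n >= start} (1 - 2^-n)
    p = 1.0
    for n in range(start, start + terms):
        p *= 1 - 0.5**n
    return p

print(partial_product(1))         # ~ 0.288788..., the Euler function phi(1/2)
for K in (5, 10, 20):
    print(K, partial_product(K))  # tail products climb toward 1
```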

On the event $X_K = X_{K+1} = \cdots = 0$, we further deduce that for all $N \ge K$, $$ S_N = X_1 + X_2 + \cdots + X_N = X_1 + X_2 + \cdots + X_{K-1} \le 2^1 + 2^2 + \cdots + 2^{K-1} < 2^K. $$

So on this event $S_N$ is uniformly bounded, and hence $\frac{S_N}{N}$ converges to $0$ there; that is, $\frac{S_N}{N} \to 0$ with probability at least $1 - \varepsilon$. Since this holds for all $\varepsilon > 0$, the probability that $\frac{S_N}{N}$ converges to $0$ must in fact be $1$. By the definition of almost-sure convergence, we are done.
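The conclusion is worth seeing in simulation, because it highlights why the strong LLN is not contradicted: every $X_n$ has mean $1$, yet a typical realization of $S_N/N$ collapses to $0$ (the expectation mass lives on rare, enormous values). A sketch, with `running_average` as an illustrative helper; the median of many runs tracks the typical behavior:

```python
import random
import statistics

def running_average(N: int) -> float:
    # S_N / N for one realization of X_1, ..., X_N
    s = 0
    for n in range(1, N + 1):
        if random.random() < 0.5**n:
            s += 2**n
    return s / N

random.seed(0)
for N in (10, 100, 1000, 10_000):
    med = statistics.median(running_average(N) for _ in range(501))
    print(N, med)  # typical value of S_N / N shrinks toward 0
```

Note that the *sample mean* of `running_average(N)` over runs would be unstable: it is pulled back toward $1$ by rare runs containing a huge $2^n$ term, which is exactly the tension the question asks about.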