Quantitative Central Limit Theorem


Suppose $\{ X_i \}_{i=1}^{\infty}$ is a sequence of i.i.d. random variables with mean 0 and variance 1.

According to the CLT, the cdf (cumulative distribution function) of $S_n = \frac{X_1 + \cdots + X_n}{\sqrt{n}}$ converges to the cdf of a normal random variable with mean 0 and variance 1. This mode of convergence is called "convergence in distribution".

I want to understand for which continuous functions $f$ we have: $$Ef(S_n) \to Ef(Normal(0,1))$$

  1. What about polynomials? (In other words, does convergence in distribution imply convergence of moments?)
  2. What about power series?
  3. What about $f(x)=|x|$?
  4. Does the answer to any of the last 3 questions change if $X_i$ takes the 2 values $-1, 1$ with equal probability? In particular, what about the 3rd question applied to a normalized random walk? Does $E\left|\frac{2\,\mathrm{Bin}(n,0.5)-n}{\sqrt{n}}\right| \to \int_{-\infty}^{\infty} \frac{1}{\sqrt{2 \pi}}\, |x|\,e^{-x^2/2}\,dx$?
  5. What can be said about the speed of the convergence?
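For question 4, the expectation $E|S_n|$ can be computed exactly from the binomial pmf, so the claimed limit can be checked numerically. The limit value is $\int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi}}|x|e^{-x^2/2}\,dx = \sqrt{2/\pi} \approx 0.7979$. A small sketch (not part of the original question; the log-space pmf is just to avoid overflow for large $n$):

```python
from math import exp, lgamma, log, pi, sqrt

def abs_mean(n):
    """Exact E|S_n| for S_n = (X_1 + ... + X_n)/sqrt(n) with fair +/-1 steps.

    The sum of n fair +/-1 steps equals 2*Bin(n, 0.5) - n, so we sum
    |2k - n|/sqrt(n) against the Binomial(n, 1/2) pmf, computed in log
    space to avoid overflow in the binomial coefficients.
    """
    total = 0.0
    for k in range(n + 1):
        log_pmf = lgamma(n + 1) - lgamma(k + 1) - lgamma(n - k + 1) - n * log(2)
        total += exp(log_pmf) * abs(2 * k - n) / sqrt(n)
    return total

limit = sqrt(2 / pi)  # integral of |x| against the standard normal density
for n in (100, 1000, 10000):
    print(n, abs_mean(n), limit)
```

For this particular sequence the convergence is quite fast; already at $n = 100$ the exact value agrees with $\sqrt{2/\pi}$ to about two decimal places.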
1 Answer


By an alternative (equivalent) characterization of convergence in distribution, $X_n \to X$ in distribution iff

$$E[f(X_n)] \to E[f(X)]$$

for every bounded continuous $f$. With this in mind: 1) Polynomials are out unless bounded (i.e., constant). 2) A power series is merely an "infinite" polynomial, so again not necessarily, unless it is bounded.
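For a bounded continuous $f$ the convergence can be seen concretely in the $\pm 1$ case. Taking $f(x) = \cos x$, independence gives $E\cos(S_n) = \operatorname{Re}\,E\,e^{iS_n} = \big(\cos(1/\sqrt{n})\big)^n$, while $E\cos(Z) = \operatorname{Re}\,E\,e^{iZ} = e^{-1/2}$ for a standard normal $Z$. A small illustration (my own addition, not part of the original answer):

```python
from math import cos, exp, sqrt

def bounded_f_mean(n):
    """E[cos(S_n)] for the fair +/-1 walk.

    Each step contributes a factor E[e^{i X/sqrt(n)}] = cos(1/sqrt(n)),
    which is real, so the expectation is just this factor to the n-th power.
    """
    return cos(1 / sqrt(n)) ** n

limit = exp(-0.5)  # E[cos(Z)] for standard normal Z
for n in (10, 100, 10000):
    print(n, bounded_f_mean(n), limit)
```

Since $n \log\cos(1/\sqrt{n}) = -\tfrac12 + O(1/n)$, the error here decays like $1/n$, faster than the generic $1/\sqrt{n}$ rate for cdfs.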

3) Again, not necessarily true in general.

4) Same as before; it depends on $f$.

5) The Law of the Iterated Logarithm might throw some light on this; for an explicit rate, the Berry–Esseen theorem bounds $\sup_x |P(S_n \le x) - \Phi(x)|$ by $C\,E|X_1|^3/\sqrt{n}$.

Hope the above gives you an idea for solving your question. Note that I used the phrase "not necessarily true": there may well be unbounded functions for which the convergence still holds, e.g. $f(x) = x$ in your case (since $ES_n = 0$ for every $n$). More generally, when $\{f(S_n)\}$ is uniformly integrable — and here $ES_n^2 = 1$ for all $n$ — convergence in distribution does imply $Ef(S_n) \to Ef(Z)$.