Convergence of a series a.e.


In the following exercise, it is assumed that $(X_n)_n$ is a sequence of independent and identically distributed random variables.

If we suppose that the distribution is degenerate, then there exists $c \in \mathbb{R}$ such that $P(X_1=c)=1$. Taking the sequence of constants $(c_n)_n$ defined by $c_n=c-\frac{1}{2^n}$, the series becomes $\sum_n (X_n-c_n)=\sum_n\frac{1}{2^n}$ a.e., which converges.
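The computation in the degenerate direction can be written out term by term: on the event $\{X_n=c \text{ for all } n\}$, which has probability $1$,

$$X_n-c_n \;=\; c-\Big(c-\frac{1}{2^n}\Big) \;=\; \frac{1}{2^n}, \qquad \sum_{n\ge 1}(X_n-c_n)\;=\;\sum_{n\ge 1}\frac{1}{2^n}\;=\;1.$$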

On the other hand, if we suppose that there exists a sequence of constants $(c_n)_n$ such that $\sum_n (X_n-c_n)$ converges a.e., then $X_n-c_n$ converges a.e. to $0$, and so $\forall x \in \mathbb{R},\ \lim_n|e^{-ixc_n}\varphi_{X_n}(x)|=|\varphi_{X_1}(x)|=1$, which means that the distribution of $X_1$ is degenerate.
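The characteristic-function step can be spelled out. Almost sure convergence of $X_n-c_n$ to $0$ implies convergence in distribution, and $\varphi_{X_n}=\varphi_{X_1}$ because the $X_n$ are identically distributed, so by Lévy's continuity theorem

$$\varphi_{X_n-c_n}(x)\;=\;e^{-ixc_n}\varphi_{X_1}(x)\;\xrightarrow[n\to\infty]{}\;1 \qquad \text{for every } x\in\mathbb{R}.$$

Taking moduli, $|\varphi_{X_1}(x)|=\lim_n|e^{-ixc_n}\varphi_{X_1}(x)|=1$ for all $x$ (the modulus does not depend on $n$), and a characteristic function of modulus identically $1$ is that of a degenerate distribution.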

So my question is: why did he assume that $(X_n)_n$ is independent, and can we remove that assumption? I did not need it, unless I made a mistake in the solution above.

(Note that this is Exercise 23, page 184, from the book A Course in Probability Theory.)

BEST ANSWER

You prove that if $(X_n)$ is a sequence of identically distributed random variables such that there exists a sequence $(c_n)$ for which $X_n-c_n\to 0$ in distribution, then $X_n$ (hence $X_1$) is degenerate. This is stronger than what was asked, since

  • we do not need to assume independence;
  • the requirement that $X_n-c_n\to 0$ in distribution is weaker than the almost sure convergence of $\sum_n (X_n-c_n)$.
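The second point follows from the standard chain of implications for the terms of a convergent series:

$$\sum_n (X_n-c_n) \text{ converges a.s.} \;\Longrightarrow\; X_n-c_n \xrightarrow{\text{a.s.}} 0 \;\Longrightarrow\; X_n-c_n \xrightarrow{d} 0,$$

and only the last, weakest statement is used in the proof, which also never invokes independence.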