Can the partial sums of independent random variables with no normalization converge in distribution to a constant?


If $\{X_n , n\ge 1 \}$ is a sequence of independent random variables and $X_n$ is nondegenerate for at least one $n\ge1$, can there exist a finite constant c such that $S_n = \sum_{j=1}^n X_j \stackrel{d}{\rightarrow}c$ ?

(Here we are not dividing $S_n$ by any sequence of normalizing constants.)

I am struggling to find an example where this can hold. Could someone please provide an appropriate example, or a proof that no such constant $c$ can exist?

Best answer

The answer is negative.

Note that convergence in distribution to a constant implies convergence in probability to that constant, so it suffices to show that $S_n$ cannot converge in probability to a constant. Without loss of generality we may take $c=0$ and assume $X_1$ is non-degenerate. Since $S_n \to 0$ in probability, the sums $\sum_{j=2}^n X_j = S_n - X_1$ converge in probability as well; set $Y = -\lim_{n\to\infty}\sum_{j=2}^n X_j$, the limit being taken in probability.
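For completeness, here is the standard argument for the first reduction (convergence in distribution to a constant implies convergence in probability), written out in the notation above:

```latex
Let $F_n$ be the distribution function of $S_n$ and let $F$ be that of the
constant $c$, i.e.\ $F(t) = 0$ for $t < c$ and $F(t) = 1$ for $t \ge c$.
Every $t \neq c$ is a continuity point of $F$, so $F_n(t) \to F(t)$ for all
$t \neq c$. Hence, for any $\varepsilon > 0$,
\[
  P\bigl(|S_n - c| > \varepsilon\bigr)
  \;\le\; P(S_n \le c - \varepsilon) + P(S_n > c + \varepsilon)
  \;=\; F_n(c - \varepsilon) + \bigl(1 - F_n(c + \varepsilon)\bigr)
  \;\longrightarrow\; 0 + 0 = 0,
\]
which is exactly $S_n \to c$ in probability.
```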

Then the question is (more or less) equivalent to asking whether $Y = X_1$ can hold; note that $Y$, being a limit in probability of functions of $X_2, X_3, \dots$ only, is independent of $X_1$. So we are asking whether a sequence of random variables can converge in probability to a non-degenerate random variable that is independent of each of the summands.

Now, convergence in probability implies that a subsequence converges almost surely, so by grouping consecutive $X_i$ into blocks (which preserves independence and does not change the limit) we may suppose that the convergence to $Y$ is a.s. Then $Y = X_1$ a.s. for two independent random variables $Y$ and $X_1$ with $X_1$ non-degenerate, which is impossible: a random variable independent of itself must be a.s. constant.
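In case the last step is not obvious, here is the standard argument that a random variable independent of itself must be a.s. constant:

```latex
Suppose $X$ is independent of $X$. Then for every $t \in \mathbb{R}$,
\[
  P(X \le t) \;=\; P(X \le t,\; X \le t) \;=\; P(X \le t)^2,
\]
so $F(t) := P(X \le t)$ satisfies $F(t) \in \{0, 1\}$ for all $t$. Since $F$
is non-decreasing and right-continuous with $F(-\infty) = 0$ and
$F(+\infty) = 1$, there is a unique $a$ with $F(t) = 0$ for $t < a$ and
$F(t) = 1$ for $t \ge a$, i.e.\ $X = a$ almost surely. Applying this with
$X = X_1 = Y$ (where $X_1 \perp Y$) forces $X_1$ to be degenerate,
contradicting the hypothesis.
```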