The meaning of $X_n$ in the definition of convergence in distribution


I really have trouble understanding the definition of convergence in distribution. Here are my points of confusion.

1). What does it mean to say "$\{X_n\}$ is a sequence of random variables"? Does it mean there exists a sample of size $n$ consisting of $n$ random variables $X_1, X_2, \cdots, X_{n-1}, X_n$?

2). In the definition, when we say $X_n$ converges in distribution to $X$, do we mean that the CDF of the $n$-th random variable converges to the CDF of $X$ as $n$ goes to infinity, or that the CDF of the $n$ random variables together converges to the CDF of $X$?

3). Do those random variables have the same distribution, since they come from the same population?

4). Does convergence in distribution only happen when the sample size $n$ is large?

5). One example I was given is $X_n = \dfrac{1}{n} \sim F_n$. Does this mean the first rv $X_1 = 1$, the second rv $X_2 = \dfrac{1}{2}$, and the third rv $X_3 = \dfrac{1}{3}$? Then what is $F_n$ when $n = 1, 2, 3, \cdots$?

Answer:


1). $\{ X_n \}$ is a whole sequence of random variables $X_1, X_2, \dots$ continuing forever. (This meaning of the word "sequence" is the usual one in mathematics.)

2). It is the first one: the CDF $F_n$ of $X_n$ converges to the CDF $F$ of $X$ (pointwise, at every point where $F$ is continuous).
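To make the pointwise convergence of CDFs concrete, here is a small sketch (my own illustrative example, not part of the original answer) with $X_n \sim N(0,\, 1 + 1/n)$, whose CDF converges to the standard normal CDF as $n \to \infty$:

```python
from statistics import NormalDist

def F_n(x, n):
    # CDF of the illustrative X_n ~ Normal(mean 0, variance 1 + 1/n).
    return NormalDist(mu=0.0, sigma=(1 + 1 / n) ** 0.5).cdf(x)

# Limit CDF: X ~ Normal(0, 1).
F = NormalDist().cdf

# F_n(x) approaches F(x) at every x as n grows.
for n in (1, 10, 1000):
    print(n, F_n(1.0, n), F(1.0))
```

Here each $F_n$ is a perfectly ordinary CDF; convergence in distribution is a statement about the sequence of functions $F_n$, not about any one of them.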

3). In general the $\{ X_n \}$ are just abstract random variables; there is no assumption that they come from the same distribution or any such thing.

4). Again, there is not really any "sampling" going on in general, so this question does not quite make sense. That said, there are cases where we talk about convergence in distribution for quantities computed from a large sample, as in the central limit theorem.
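For instance, a quick simulation of the central limit theorem (an illustrative sketch; the uniform population and the sample size are my own choices): the empirical CDF of the standardized sample mean approaches the standard normal CDF.

```python
import random
from statistics import NormalDist

random.seed(0)

def standardized_mean(n):
    # Mean of n Uniform(0, 1) draws; Uniform(0, 1) has mean 1/2 and variance 1/12.
    xs = [random.random() for _ in range(n)]
    mean = sum(xs) / n
    return (mean - 0.5) / ((1 / 12 / n) ** 0.5)

def empirical_cdf(x, n, reps=20000):
    # Fraction of simulated standardized means that are <= x.
    return sum(standardized_mean(n) <= x for _ in range(reps)) / reps

x = 0.5
print(empirical_cdf(x, n=30), NormalDist().cdf(x))
```

Here $X_n$ is the standardized mean of a sample of size $n$, so the index of the sequence is tied to a sample size; but that is a feature of this particular example, not of the definition.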

5). If $X_n$ is just the constant random variable $1/n$, then $F_n(x)=\begin{cases} 0 & x<1/n \\ 1 & x \geq 1/n, \end{cases}$ and $F_n(x) \to F(x)$ for every $x \neq 0$, where $F$ is the CDF of the constant random variable $0$.
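That formula can be checked numerically; a minimal sketch (the function names are my own):

```python
def F_n(x, n):
    # CDF of the constant random variable X_n = 1/n: a step at 1/n.
    return 1.0 if x >= 1 / n else 0.0

def F(x):
    # CDF of the limit, the constant random variable X = 0: a step at 0.
    return 1.0 if x >= 0 else 0.0

# Pointwise convergence holds at every continuity point of F (every x != 0):
for x in (-0.3, 0.2):
    print(x, [F_n(x, n) for n in (1, 5, 100)], F(x))

# At x = 0 itself, F_n(0) = 0 for all n while F(0) = 1; but x = 0 is a
# discontinuity point of F, so the definition excludes it.
```

This is why the definition only requires $F_n(x) \to F(x)$ at continuity points of $F$: without that exclusion, even this simple sequence would fail to converge in distribution.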