I am struggling with intuitively understanding convergence in probability, and I think my difficulty lies in understanding the notation of the definition. I was unable to find other questions/answers that address the notation specifically.
From the following definition:
Let $\{X_n\}_{n\geq 1}$ be a sequence of RVs. We say that $X_n$ converges in probability to an RV $X$ if:
$\lim_{n\to\infty} P(|X_n - X|<\epsilon)=1$ for every $\epsilon>0$
In this definition, what exactly does $X_n$ refer to? Is it the whole sequence (i.e. the originally defined $\{X_n\}_{n\geq1}$), the $\text{n}^{\text{th}}$ term of the sequence, or something else?
If it refers to the $\text{n}^{\text{th}}$ term, and each $X_i$ in the sequence is identically distributed, what is special about $X_n$ compared to, say, $X_3$?
If it refers to the whole sequence, how can you "subtract" $X$ from a whole sequence? The result $X_n - X = (X_1 - X,\; X_2 - X,\; X_3 - X, \ldots)$ is still a sequence, not a single random variable.
I understand how the sample mean $\frac{1}{n}\sum_{i=1}^{n} X_i$ can converge to a constant, but there the $X_i$ terms are not treated as a sequence; they are summed.
This seems simple, but I am having trouble wrapping my head around it. Thanks!
$X_n$ appears in multiple places in this definition, and the notation is doing double duty. In the phrase "$X_n$ converges in probability to $X$", it is shorthand for the whole sequence $\{X_n\}_{n\geq 1}$. Inside the limit, however, $X_n$ is the $\text{n}^{\text{th}}$ term: for each fixed $n$, $|X_n - X|$ is a single random variable, so $P(|X_n - X| < \epsilon)$ is a single number, say $a_n$. The definition then asks for ordinary convergence of the real-number sequence $a_1, a_2, a_3, \ldots$ to $1$. Note also that even when the $X_i$ are identically distributed, the differences $|X_i - X|$ need not be: that distribution depends on the *joint* distribution of $X_i$ and $X$, so $a_n$ can genuinely change with $n$.
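A quick simulation can make this concrete. The sketch below (a minimal illustration, assuming NumPy; the choice of Uniform(0,1) draws and $\epsilon = 0.1$ is mine) takes $X_n$ to be the sample mean of $n$ iid Uniform(0,1) draws and $X$ to be the constant $0.5$, then Monte-Carlo-estimates the single number $a_n = P(|X_n - X| < \epsilon)$ for a few values of $n$:

```python
import numpy as np

rng = np.random.default_rng(0)
eps = 0.1          # the epsilon in the definition
n_trials = 10_000  # Monte Carlo repetitions used to estimate each probability

# X_n = sample mean of n iid Uniform(0,1) draws; X = 0.5 (a constant RV).
# For each fixed n, P(|X_n - X| < eps) is one number a_n; convergence in
# probability says the number sequence a_1, a_2, ... tends to 1.
results = {}
for n in [1, 10, 100, 1000]:
    # n_trials independent realizations of the single random variable X_n
    x_n = rng.random((n_trials, n)).mean(axis=1)
    results[n] = np.mean(np.abs(x_n - 0.5) < eps)  # estimate of a_n
    print(f"n = {n:>4}:  estimated a_n = {results[n]:.4f}")
```

Each row of output is an estimate of one term $a_n$ of the real-number sequence; you should see the estimates climb toward $1$ as $n$ grows, which is exactly what the limit in the definition asserts.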