Convergence in Probability in terms of elementary outcomes of an experiment


Convergence in probability involves a sequence of random variables $X_n$ and a random variable $X$, all associated with the same experiment:

$\lim_{n \rightarrow \infty} P\big(|X_n-X| \geq \epsilon \big)=0, \qquad \textrm{ for all }\epsilon>0.$

I'm not quite able to understand what $P( \vert X_n - X \vert > \epsilon)$ means in terms of elementary outcomes of the experiment.

I can understand that $P(X_n > a)$ means the sum of the probabilities of all outcomes $o_i$ such that $X_n(o_i) > a$.

Similarly for an expression like $P(X > b)$.

But an expression like $P( \vert X_n - X \vert > \epsilon)$ involves two random variables. How do we interpret this expression in terms of elementary outcomes of the experiment?
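The single-variable case described above can be sketched concretely. Here is a minimal Python sketch for a hypothetical finite experiment (a fair die roll; the experiment and the variable are illustrations, not part of the original question):

```python
# Sketch: P(X > a) as a sum over elementary outcomes of a finite experiment.
# The experiment (a fair die roll) is a hypothetical illustration.

outcomes = [1, 2, 3, 4, 5, 6]          # elementary outcomes o_i
prob = {o: 1 / 6 for o in outcomes}    # P(o_i) for a fair die

def X(o):
    return o  # random variable: the face value shown

a = 4
# P(X > a) = sum of P(o_i) over all outcomes with X(o_i) > a
p = sum(prob[o] for o in outcomes if X(o) > a)
print(p)  # outcomes 5 and 6 qualify, so p = 2/6
```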

Best answer:

It is best to think of $X_n$ and $X$ as functions of members $\omega$ of a sample space $\Omega.$ I.e., $X : \Omega \rightarrow {\mathbb R}$ and $X_n:\Omega \rightarrow {\mathbb R}.$ $P$ is a probability measure on $\Omega$ and

$$ P(|X_n - X| > \epsilon) $$ is shorthand for

$$ P(\{ \omega: |X_n(\omega) - X(\omega)| > \epsilon \}), $$ i.e., the probability of the set of samples on which the random variables $X_n$ and $X$ differ by more than $\epsilon.$

And $X_n \rightarrow X$ in probability means that, for every $\epsilon > 0$, the probability of the set of samples on which $X_n$ and $X$ differ by more than $\epsilon$ goes to $0$ as $n \rightarrow \infty$.
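The two-variable event can be made concrete the same way: evaluate both functions at each sample point $\omega$ and sum the probabilities of the points where they differ by more than $\epsilon$. A minimal Python sketch, with a hypothetical finite sample space and hypothetical random variables chosen so that $X_n \rightarrow X$:

```python
# Sketch: P(|X_n - X| > eps) as the probability of the set
# { omega : |X_n(omega) - X(omega)| > eps }.
# The sample space and random variables below are hypothetical illustrations.

omegas = list(range(1, 11))            # sample space Omega = {1, ..., 10}
prob = {w: 1 / 10 for w in omegas}     # uniform probability measure on Omega

def X(w):
    return w

def X_n(w, n):
    return w + w / n                   # differs from X by w/n, which shrinks with n

def p_diff(n, eps):
    # probability of the set of samples where X_n and X differ by more than eps
    event = [w for w in omegas if abs(X_n(w, n) - X(w)) > eps]
    return sum(prob[w] for w in event)

eps = 0.5
for n in [1, 5, 20, 100]:
    print(n, p_diff(n, eps))          # the probability shrinks to 0 as n grows
```

For $\epsilon = 0.5$ the event is $\{\omega : \omega/n > 0.5\}$, so its probability drops from $1$ at $n = 1$ to $0$ once $n \geq 20$, matching the definition of convergence in probability on this toy space.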