Let $X$ be a random variable with cumulative distribution function $F(x)$. Then how to rigorously prove the following two limit statements?
$\lim_{x \to - \infty} F(x) = 0$.
$\lim_{x \to + \infty} F(x) = 1$.
The second limit (2) just expresses that $\Pr(X\in\mathbb{R}) = 1$.
For the first limit (1), define $G(x) = 1 - F(-x)$ and note that $G$ is, up to right-continuity at the discontinuity points of $F$, the cumulative distribution function of $-X$; in particular $G$ is nondecreasing and $\lim_{y\rightarrow\infty}G(y)=1$ by (2). Then $$\lim_{x\rightarrow -\infty}F(x)= \lim_{y\rightarrow\infty}F(-y)=1-\lim_{y\rightarrow\infty}\bigl(1-F(-y)\bigr)=1-\lim_{y\rightarrow\infty}G(y)=1-1=0$$
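For completeness, here is a sketch of how limit (2) itself can be made rigorous via continuity of the probability measure (a step the answer above takes for granted):

```latex
% Since F is nondecreasing and bounded above by 1, the limit as
% x -> infinity exists and equals the limit along the integers.
% The events {X <= n} increase to Omega, so continuity from below gives:
\begin{align*}
\lim_{x \to \infty} F(x)
  &= \lim_{n \to \infty} F(n)
     && \text{($F$ nondecreasing and bounded)} \\
  &= \lim_{n \to \infty} \Pr(X \le n)
     && \text{(definition of $F$)} \\
  &= \Pr\Bigl(\,\bigcup_{n=1}^{\infty} \{X \le n\}\Bigr)
     && \text{(continuity from below)} \\
  &= \Pr(X \in \mathbb{R}) = 1.
\end{align*}
```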
Recall the following result:
Theorem. Let $P$ be a probability on $(\Omega,\mathcal{A})$ and let $(A_n)$ be a converging sequence of sets in $\mathcal{A}$ with limit $A \in \mathcal{A}$. Then $\lim_{n \to \infty}P(A_n) = P(A)$.
In particular, if $A_n \to \Omega$ then $P(A_n) \to 1$, and if $A_n \to \emptyset$ then $P(A_n) \to 0$.
The CDF of a probability measure $P^X$ is defined by $F(x) := P^X((-\infty,x])$.
Remember also that $P^X(A) = P(X^{-1}(A))$, so it is clear that $P^X(\mathbb{R}) = P(\Omega) = 1$ and $P^X(\emptyset) = P(\emptyset) = 0$.
As $x \to \infty$ the sets $(-\infty,x]$ converge to $\mathbb{R}$, and as $x \to -\infty$ they converge to $\emptyset$. So, by the previous result, $\lim_{x \to \infty} F(x) = P^X(\mathbb{R}) = 1$ and $\lim_{x \to -\infty} F(x) = P^X(\emptyset) = 0$.
Incidentally, this result is valid for the CDF of any probability measure on the real line, not only for those induced by a real random variable.
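As a numerical illustration (not a proof) of both limits, here is a short Python sketch using the standard normal distribution, whose CDF can be written with `math.erf`; the tail values chosen ($\pm 8$) are arbitrary:

```python
import math

def normal_cdf(x: float) -> float:
    """CDF of the standard normal distribution, Phi(x) = 0.5*(1 + erf(x/sqrt(2)))."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

# F(x) approaches 0 as x -> -infinity and 1 as x -> +infinity.
for x in (-8.0, -4.0, 0.0, 4.0, 8.0):
    print(f"F({x:+.0f}) = {normal_cdf(x):.10f}")
```

Running this shows the CDF values climbing monotonically from (numerically) 0 toward 1, with $F(0) = 0.5$ exactly by symmetry.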
EDIT: These are the main steps of the proof; if you need more details, just ask!