Proving that a cumulative distribution function is right-continuous and has limits 0 and 1


Suppose we require the following for a probability measure

1) $0 \leq \mathbb P(A) \leq 1$

2) $\mathbb P\left(\displaystyle\bigcup_{k=1}^\infty A_k\right) = \displaystyle\sum_{k=1}^\infty \mathbb P(A_k)$

3) $\mathbb P(\Omega) = 1$

for an arbitrary event $A$ and an arbitrary sequence $\left\{A_k\right\}_{k=1}^\infty$ of pairwise disjoint sets, i.e. $\left\{A_k\right\}_{k=1}^\infty$ satisfies $A_i \cap A_j = \varnothing \; \forall i \neq j$.

Let's now define the function $F(x) := \mathbb P(\mathbb X \leq x)$, which we know exists by assumption. Now, proving it's increasing (non-decreasing) is easy since for every $\varepsilon > 0$ and for every fixed $x_0 \in \mathbb R$

$F(x_0 + \varepsilon) = \mathbb P(\mathbb X \leq x_0 + \varepsilon) = \mathbb P(\mathbb X \leq x_0) + \mathbb P(x_0 < \mathbb X \leq x_0 + \varepsilon) = F(x_0) + \mathbb P(x_0 < \mathbb X \leq x_0 + \varepsilon) \geq F(x_0)$

But right-continuity and the limits $\begin{cases}\displaystyle\lim_{x\to-\infty} F(x) = 0 \\ \displaystyle\lim_{x\to\infty} F(x) = 1\end{cases}$ seem much harder to prove. It's easy to see that the limit $\displaystyle\lim_{\varepsilon \to 0_+} F(x_0 + \varepsilon) = F(x_0)$ is equivalent to $\displaystyle\lim_{\varepsilon \to 0_+} \mathbb P(x_0 < \mathbb X \leq x_0 + \varepsilon) = 0$, but I am not getting any further than that.

I'm assuming I want to use axiom (2) somehow – my original idea was to use (2) together with (1) to show that you can "subtract" infinitely many intervals from $[x_0, x_0+\varepsilon]$ (thus making $\varepsilon$ smaller and smaller), and ultimately reach the limit "at $\varepsilon = 0$" since $\mathbb P(\varnothing) = 0$ – but I'm not sure I can do that.

After all, for a uniform distribution I can remove infinitely many intervals without reaching the limit, using the fact that $\displaystyle\sum_{n=1}^\infty \frac{1}{n^2} = \frac{\pi^2}6$.
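To make that worry concrete (my own numbers, purely for illustration): under the uniform distribution on $[0,1]$, remove from an interval of length $\varepsilon$ disjoint subintervals $I_n$ of length $\frac{\varepsilon}{2n^2}$. The total length removed is

$$\sum_{n=1}^\infty \frac{\varepsilon}{2n^2} = \frac{\varepsilon}{2} \cdot \frac{\pi^2}{6} = \frac{\pi^2 \varepsilon}{12} < \varepsilon,$$

so what remains still has probability $\varepsilon\left(1 - \frac{\pi^2}{12}\right) > 0$: countably many subtractions alone do not force the probability down to $0$.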

You will need the intermediate lemma

$$\mathbb{P} \left ( \bigcap_{n=1}^\infty A_n \right ) = \lim_{n \to \infty} \mathbb{P}(A_n)$$

for a decreasing sequence of events $A_n$, i.e. $A_{n+1} \subset A_n$. This requires both axiom (2) and one of the other axioms to prove (importantly, it does not hold for infinite measures, even though they satisfy axiom (2)).
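For completeness, here is one standard way to prove that lemma (a sketch in the question's notation; it also uses the fact that axiom (2) yields finite additivity, by padding a finite union with copies of $\varnothing$). Because the $A_n$ are decreasing, $A_1$ splits into pairwise disjoint pieces:

$$A_1 = \left(\bigcap_{n=1}^\infty A_n\right) \cup \bigcup_{k=1}^\infty \left(A_k \setminus A_{k+1}\right).$$

Axiom (2) applied to this decomposition, together with the telescoping partial sums $\displaystyle\sum_{k=1}^{n-1} \mathbb P(A_k \setminus A_{k+1}) = \mathbb P(A_1) - \mathbb P(A_n)$, gives

$$\mathbb P(A_1) = \mathbb P\left(\bigcap_{n=1}^\infty A_n\right) + \lim_{n\to\infty}\bigl(\mathbb P(A_1) - \mathbb P(A_n)\bigr).$$

Since $\mathbb P(A_1) \leq 1 < \infty$ by axiom (1), it can be cancelled from both sides – this is exactly where finiteness of the measure is needed – leaving $\displaystyle\lim_{n \to \infty} \mathbb P(A_n) = \mathbb P\left(\bigcap_{n=1}^\infty A_n\right)$.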

That lemma will get you the right-continuity and the limit at $-\infty$. The limit at $+\infty$ is easier: $\mathbb{P} \left ( \bigcup_{n=1}^\infty \{\mathbb{X} \leq n\} \right ) = \mathbb{P}(\Omega) = 1$ for a finite-valued random variable.
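To spell out how the lemma yields right-continuity (a sketch, with $x_0$ fixed and $\mathbb X$ as in the question): take the decreasing events

$$A_n = \left\{x_0 < \mathbb X \leq x_0 + \tfrac1n\right\}, \qquad A_{n+1} \subset A_n, \qquad \bigcap_{n=1}^\infty A_n = \varnothing,$$

since no outcome can satisfy $x_0 < \mathbb X \leq x_0 + \frac1n$ for every $n$. The lemma then gives $\displaystyle\lim_{n\to\infty} \mathbb P\left(x_0 < \mathbb X \leq x_0 + \tfrac1n\right) = \mathbb P(\varnothing) = 0$, i.e. $F\left(x_0 + \frac1n\right) \to F(x_0)$, and monotonicity of $F$ upgrades this sequential limit to $\displaystyle\lim_{\varepsilon \to 0_+} F(x_0 + \varepsilon) = F(x_0)$. The limit at $-\infty$ works the same way with $A_n = \{\mathbb X \leq -n\}$, whose intersection is again empty for a finite-valued $\mathbb X$.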