Convergence in probability of i.i.d. exponential random variables


I'm a bit lost here, the question goes as follows:

Suppose that $X_k$ are i.i.d. and follow an exponential distribution with parameter $\lambda$.

Define $F_n(x) := \frac{1}{n}\sum_{k=1}^n\mathbb{1}(X_{k}\leq x)$ for $x \geq 0$.

Question: Show that $F_n(x)$ converges in probability to $1-e^{-\lambda x}$. Does it also converge in $L^{1}$ norm?

I think that to show $F_n(x)$ converges in probability we need to show that, for every $\varepsilon>0$,

$\lim_{n\to\infty}P\left(\left|F_n(x)-F(x)\right|>\varepsilon\right)=0$

Therefore we can state that:

$F_n(x)$$\overset{p}{\to} 1−e ^{−\lambda x} $

if

$$\lim_{n\to\infty}P \left(\left| \frac{1}{n}\sum_{k=1}^n\mathbb{1}(X_{k}\leq x)- \left(1-e^{-\lambda x}\right)\right|>\varepsilon\right)=0.$$

But here is where I get stuck in figuring out the correct approach. Could I use the CLT?
I was under the impression that the CLT gives convergence in distribution, whereas convergence in probability is stronger (except when the limit is a constant, which is not the case here).

Alternatively, I thought about using the weak law of large numbers and Chebyshev's inequality, but then I still need to prove $L^p$ convergence.

I already know that convergence in probability does not in general imply convergence in $L^p$, but again, how would I show it here?
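To get some intuition (not a proof), I also ran a small NumPy simulation estimating the probability above for growing $n$; the parameter values $\lambda = 1$, $x = 1$, $\varepsilon = 0.05$ and the function name are my own choices:

```python
import numpy as np

# Monte Carlo estimate of P(|F_n(x) - F(x)| > eps) for exponential samples.
# A sanity check only; lam, x, eps are arbitrary assumed values.
def exceed_prob(n, lam=1.0, x=1.0, eps=0.05, reps=2_000, seed=0):
    rng = np.random.default_rng(seed)
    F = 1 - np.exp(-lam * x)                 # limiting value 1 - e^{-lambda x}
    samples = rng.exponential(1 / lam, size=(reps, n))
    Fn = (samples <= x).mean(axis=1)         # reps independent realisations of F_n(x)
    return (np.abs(Fn - F) > eps).mean()

for n in (10, 100, 1_000, 10_000):
    print(n, exceed_prob(n))                 # the estimate shrinks toward 0
```

The estimated exceedance probability drops toward $0$ as $n$ grows, which is at least consistent with convergence in probability.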

Kind regards




If you know the weak law of large numbers, then it is immediate.

Otherwise, we can argue directly by proving that $$ \lim_{n \to + \infty}\mathbb E \left[ \left(F_n(x)-F(x)\right)^2\right]=0,$$ where $F(x)=1-e^{-\lambda x}$.

Indeed, note that $$\left(F_n(x)-F(x)\right)^2 = \frac 1 {n^2 } \sum_{j=1}^n \left(\mathbf 1 \left\{X_j \leqslant x \right\} -F(x)\right)^2+ \frac 2{n^2 } \sum_{1 \leqslant i \lt j \leqslant n }\left(\mathbf 1 \left\{X_i \leqslant x \right\}-F(x) \right)\left(\mathbf 1 \left\{X_j \leqslant x \right\} -F(x)\right).$$

The expectation of the second sum is $0$, because the random variables $\mathbf 1 \left\{X_i \leqslant x \right\}-F(x)$ and $\mathbf 1 \left\{X_j \leqslant x \right\}-F(x)$ are centered and independent. Since $X_j$ has the same distribution as $X_1$, $\left(\mathbf 1 \left\{X_j \leqslant x \right\}-F(x) \right)^2$ has the same distribution as $\left(\mathbf 1 \left\{X_1\leqslant x \right\}-F(x) \right)^2$; in particular, these random variables share the same expectation. We thus have $$ \mathbb E \left[\left(F_n(x)-F(x)\right)^2 \right] = \frac 1 {n } \mathbb E\left[\left(\mathbf 1 \left\{X_1 \leqslant x \right\} -F(x)\right)^2\right],$$ which shows that $F_n(x)\to F(x)$ in $\mathbb L^2$ for any $x \in\mathbb R$. Since $\mathbb L^2$ convergence implies both convergence in probability and $\mathbb L^1$ convergence, this answers both parts of the question.
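The identity $\mathbb E\left[\left(F_n(x)-F(x)\right)^2\right] = \frac{1}{n}\operatorname{Var}\left(\mathbf 1\{X_1 \leqslant x\}\right)$ can also be checked numerically. A small NumPy sketch, with $\lambda = 1$, $x = 1$, $n = 50$ as assumed values (the indicator is Bernoulli with success probability $F(x)$, so its variance is $F(x)(1-F(x))$):

```python
import numpy as np

# Numerical check of E[(F_n(x) - F(x))^2] = Var(1{X_1 <= x}) / n.
# Parameter values lam = 1, x = 1, n = 50 are assumptions for illustration.
rng = np.random.default_rng(1)
lam, x, n, reps = 1.0, 1.0, 50, 200_000
F = 1 - np.exp(-lam * x)          # F(x) = 1 - e^{-lambda x}
theory = F * (1 - F) / n          # Bernoulli(F) variance divided by n

samples = rng.exponential(1 / lam, size=(reps, n))
Fn = (samples <= x).mean(axis=1)  # reps independent copies of F_n(x)
empirical = np.mean((Fn - F) ** 2)
print(theory, empirical)          # the two values should agree closely
```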


As stated above, to show convergence in probability we can simply apply the weak-law-of-large-numbers argument via Chebyshev's inequality:

$P\left(\left| \frac{1}{n}\sum_{k=1}^n\mathbb{1}(X_{k}\leq x)- \left(1-e^{-\lambda x}\right)\right|>\varepsilon\right)\leq\frac{\operatorname{Var}\left(F_n(x)\right)}{\varepsilon^2},$

where $\operatorname{Var}\left(F_n(x)\right)=\frac{F(x)\left(1-F(x)\right)}{n}\leq\frac{1}{4n}$, so the right-hand side is at most $\frac{1}{4n\varepsilon^2}\rightarrow0$ as $n\to\infty$.
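The Chebyshev bound can be compared with a simulated exceedance probability; a short NumPy sketch, with $\lambda = 1$, $x = 1$, $\varepsilon = 0.1$, $n = 100$ as assumed values:

```python
import numpy as np

# Compare the simulated P(|F_n(x) - F(x)| > eps) with the Chebyshev bound
# Var(F_n(x)) / eps^2 = F(x)(1 - F(x)) / (n eps^2). Parameter values assumed.
rng = np.random.default_rng(2)
lam, x, eps, n, reps = 1.0, 1.0, 0.1, 100, 100_000
F = 1 - np.exp(-lam * x)
bound = F * (1 - F) / (n * eps**2)

samples = rng.exponential(1 / lam, size=(reps, n))
Fn = (samples <= x).mean(axis=1)
prob = (np.abs(Fn - F) > eps).mean()
print(prob, bound)                # simulated probability sits below the bound
```

As expected, Chebyshev is conservative: the simulated probability is well below the bound.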