Weak convergence (i.e. convergence in distribution) of first order statistics: Problem 1.1 (Ch. 6) of "An Intermediate Course in Probability" by Allan Gut


For each $n = 1, 2, \ldots$, suppose that $X_n$ is a continuous random variable with density $$\hspace{10mm}\mathrm{f}(x) = \begin{cases} \frac{1}{2}(1+x)e^{-x}, & \text{if $x \ge 0$,} \\[2ex] 0, & \text{if $x < 0$.} \end{cases}$$ Set $Y_n = \min\{X_1, X_2, \ldots, X_n\}$. Does $nY_n$ converge in distribution as $n \to \infty$? If so, what is the limiting distribution of $nY_n$?

Attempt: I was trying to find the distribution of $nY_n$, but it became too complicated. How should I proceed here? Any help would be much appreciated.
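As a quick sanity check (not part of the exercise), the stated $f$ can be verified numerically to be a valid density; the grid step and the cutoff at $x = 60$ below are arbitrary choices:

```python
import numpy as np

# f(x) = (1/2)(1 + x) e^{-x} on [0, inf); the tail beyond x = 60
# contributes less than (1/2)(62)e^{-60}, which is negligible.
x = np.linspace(0.0, 60.0, 600_001)
f = 0.5 * (1.0 + x) * np.exp(-x)

# Trapezoidal rule by hand, to avoid version-specific numpy helpers.
dx = x[1] - x[0]
total = float(np.sum((f[:-1] + f[1:]) / 2.0) * dx)

print(f"integral of f over [0, 60]: {total:.6f}")  # close to 1
```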


Let's try to find the distribution function of $nY_n$.

$P(nY_n \leq t) = 1 - P(\min \{ X_1, \ldots, X_n \} > \frac{t}{n}) = 1 - P(X_1 > \frac{t}{n}) \cdots P(X_n > \frac{t}{n})$

We can calculate the tail of the random variable $X_1$; the same tail works for every $X_i$, since I assume they are i.i.d.

$P(X_1 > \frac{t}{n}) = \int_{\frac{t}{n}}^{\infty} \frac{1}{2}(1+x) \exp (-x) dx$.

Now, try to do this integral by parts.
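If I carry the integration by parts through, I get $\int_a^\infty \frac{1}{2}(1+x)e^{-x}\,dx = \frac{1}{2}(2+a)e^{-a}$, so with $a = \frac{t}{n}$ the distribution function becomes $P(nY_n \leq t) = 1 - \left[\left(1 + \frac{t}{2n}\right)e^{-t/n}\right]^n \to 1 - e^{-t/2}$, which would make the limit Exp with rate $\frac{1}{2}$. A Monte Carlo sketch to check this endpoint (the sample sizes are arbitrary; the sampler uses the fact that $f$ is an equal mixture of the Exp(1) and Gamma(2, 1) densities):

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_f(shape):
    """Draw from f(x) = (1/2)(1+x)e^{-x}, x >= 0, written as an
    equal mixture of Exp(1) and Gamma(shape=2, scale=1)."""
    use_gamma = rng.random(shape) < 0.5
    out = rng.exponential(1.0, shape)
    out[use_gamma] = rng.gamma(2.0, 1.0, size=int(use_gamma.sum()))
    return out

n, reps = 500, 10_000
# n * Y_n, where Y_n = min(X_1, ..., X_n), replicated `reps` times
samples = n * sample_f((reps, n)).min(axis=1)

for t in (0.5, 1.0, 2.0):
    empirical = float((samples <= t).mean())
    limit = 1.0 - np.exp(-t / 2.0)  # conjectured Exp(1/2) CDF
    print(f"t={t}: empirical {empirical:.3f}, limit {limit:.3f}")
```

The empirical CDF of $nY_n$ should track $1 - e^{-t/2}$ up to Monte Carlo noise, consistent with the computation above.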