Proof of convergence in distribution of a given sequence of random variables


I'm having difficulty with the following problem on convergence in distribution.


Let $X_1, \dots, X_n$ be i.i.d. random variables with common cdf $F$, and let $X_{(n)} = \max_{1 \le i \le n} X_i$. Suppose $\lim_{x \to \infty} x^{\alpha}\left[1-F(x)\right]=\lambda$ for some $\alpha>0$ and $\lambda>0$. Show that $$(\lambda n)^{-1/\alpha}X_{(n)} \stackrel{d}{\to} Y,$$ where the cdf of $Y$ is given by $$F_Y(y)=\begin{cases}\exp\left\{-y^{-\alpha}\right\}, & y>0,\\ 0, & \text{otherwise.}\end{cases}$$


I progressed in the following way, using that the $X_i$ are i.i.d.: $$P((\lambda n)^{-1/\alpha}X_{(n)} \leq y)=P(X_{(n)} \leq y\,(\lambda n)^{1/\alpha})=\{P(X_1 \leq y\,(\lambda n)^{1/\alpha})\}^n=\{F(y\,(\lambda n)^{1/\alpha})\}^n$$

I do not understand how to use the condition $\lim_{x \to \infty} x^{\alpha}\left[1-F(x)\right]=\lambda$ here to reach the desired limit distribution. Substituting the limit for $\lambda$ has not given me anything useful. Any help would be much appreciated. Thank you.

Accepted answer:

Substitute $x = y(\lambda n)^{1/\alpha}$ (for fixed $y > 0$) into the condition: since $x^\alpha = y^\alpha \lambda n$, the hypothesis $x^\alpha[1-F(x)] \to \lambda$ gives $$y^\alpha \lambda n\,[1 - F(y(\lambda n)^{1/\alpha})] \to \lambda,$$ i.e. $$n [1 - F(y(\lambda n)^{1/\alpha})] \to y^{-\alpha}$$ as $n \to \infty$.

With $u := 1 - F(y(\lambda n)^{1/\alpha})$, note that $u \to 0$ because $nu \to y^{-\alpha}$. The Taylor expansion $\log(1-u) = - u + O(u^2)$ as $u \to 0$ then implies $$n \log F(y(\lambda n)^{1/\alpha}) \to -y^{-\alpha}.$$ Exponentiating both sides (using continuity of $\exp$) yields the claim for $y > 0$.
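Written out in full, the chain of limits behind this step, with the substitution $u = 1 - F(y(\lambda n)^{1/\alpha})$ as above, is:

```latex
n \log F\bigl(y(\lambda n)^{1/\alpha}\bigr)
  = n \log(1 - u)
  = -nu + n\,O(u^2)
  \longrightarrow -y^{-\alpha},
\qquad\text{hence}\qquad
\bigl\{F\bigl(y(\lambda n)^{1/\alpha}\bigr)\bigr\}^n
  = e^{\,n \log F(y(\lambda n)^{1/\alpha})}
  \longrightarrow e^{-y^{-\alpha}},
```

which is exactly $F_Y(y)$ for $y > 0$.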

[I've omitted the verification that the remainder term $n O(u^2)$ vanishes.]


The remainder term is (up to a constant) $n [1 - F(y (\lambda n)^{1/\alpha})]^2$, and you want to show this vanishes as $n \to \infty$. Write it as $\big(n[1 - F(y (\lambda n)^{1/\alpha})]\big) \cdot [1 - F(y (\lambda n)^{1/\alpha})]$: by the condition $\lim_{x \to \infty} x^\alpha [1 - F(x)] = \lambda$, the first factor tends to $y^{-\alpha}$ (as computed above) while the second factor tends to $0$, so the product vanishes.
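As a numerical sanity check (not part of the proof), here is a small Monte Carlo sketch. It assumes the specific Pareto cdf $F(x) = 1 - x^{-\alpha}$ for $x \ge 1$, for which $x^\alpha[1-F(x)] = 1$, so $\lambda = 1$; the empirical cdf of the scaled maximum should then be close to the Fréchet limit $\exp(-y^{-\alpha})$.

```python
import math
import random

random.seed(0)

alpha = 2.0   # tail index in the problem statement
lam = 1.0     # for Pareto F(x) = 1 - x^(-alpha), x >= 1: x^alpha * [1 - F(x)] = 1
n = 2000      # sample size behind each maximum X_(n)
reps = 1000   # number of simulated maxima

def pareto_draw():
    """Inverse-cdf sampling: U ~ Uniform(0,1) gives X = (1 - U)^(-1/alpha)."""
    return (1.0 - random.random()) ** (-1.0 / alpha)

# Simulate the scaled maximum (lam * n)^(-1/alpha) * X_(n), reps times.
scale = (lam * n) ** (-1.0 / alpha)
maxima = [scale * max(pareto_draw() for _ in range(n)) for _ in range(reps)]

# Compare the empirical cdf with the Frechet limit exp(-y^(-alpha)).
for y in (0.5, 1.0, 2.0):
    empirical = sum(m <= y for m in maxima) / reps
    limit = math.exp(-y ** (-alpha))
    print(f"y = {y}: empirical {empirical:.3f} vs limit {limit:.3f}")
```

With $n = 2000$ the agreement is already close, since for the Pareto case the exact cdf of the scaled maximum is $(1 - y^{-\alpha}/n)^n$, which converges to $\exp(-y^{-\alpha})$ at rate $O(1/n)$.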