Scaling sequences from i.i.d random variables

Consider i.i.d. random variables $(X_i)$ drawn from a uniform distribution on $[0, 1].$

In the following, find scaling sequences $a_n$, $b_n$ such that $a_n(M_n - b_n)$ converges in distribution to a non-degenerate limit distribution $G$.

(a) $Y_i = X_i$, and $M_n = \max(Y_1, \dots, Y_n)$;

(b) $U_i = \frac{1}{X_i}$, and $M_n = \max(U_1, \dots, U_n)$.

For each case, I would first like to find the probability distribution function $P(M_n \le u/a_n + b_n)$ as a function of $u_n = u/a_n + b_n$. Then I would like to find suitable scaling sequences $a_n$, $b_n$ so that a non-trivial limit $G(u)$ is obtained as $n \to \infty$.
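In case it helps orient the calculation, here is a sketch of case (a); this is my own working (the candidate scaling $a_n = n$, $b_n = 1$ is an assumption, not part of the problem statement), so please check it. Since $P(X_i \le u) = u$ on $[0,1]$ and the $X_i$ are independent, $P(M_n \le u_n) = u_n^n$ for $u_n \in [0,1]$. With $u_n = 1 + u/n$:

```latex
P\bigl(n(M_n - 1) \le u\bigr)
  = P\!\left(M_n \le 1 + \frac{u}{n}\right)
  = \left(1 + \frac{u}{n}\right)^{\!n}
  \;\longrightarrow\; e^{u}, \qquad u \le 0,
```

which is a (reversed) Weibull-type limit, with $G(u) = 1$ for $u > 0$. For case (b), one can check that $P(U_i \le u) = P(X_i \ge 1/u) = 1 - 1/u$ for $u \ge 1$, so the analogous attempt $a_n = 1/n$, $b_n = 0$ gives $P(M_n/n \le u) = \left(1 - \frac{1}{nu}\right)^n \to e^{-1/u}$ for $u > 0$, a Fréchet-type limit.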

I'd be very grateful and would really appreciate some help with this.
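For anyone wanting a quick numerical sanity check, the following is a minimal Monte Carlo sketch for case (a), assuming the candidate scaling $a_n = n$, $b_n = 1$ and the candidate limit $G(u) = e^u$ for $u \le 0$ (these candidates are my assumption, not given in the problem):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000        # number of uniforms behind each maximum
trials = 100_000  # number of simulated maxima

# The maximum of n i.i.d. Uniform(0,1) variables has CDF u^n on [0,1],
# so it can be sampled directly by inverse transform: M_n = U^(1/n).
M = rng.random(trials) ** (1.0 / n)

# Candidate scaling for case (a): a_n = n, b_n = 1, so the rescaled
# maximum Z = n*(M_n - 1) should have CDF close to G(u) = e^u for u <= 0.
Z = n * (M - 1.0)

u = -1.0
empirical = (Z <= u).mean()
print(f"P(Z <= {u}) ~ {empirical:.4f},  candidate limit e^u = {np.exp(u):.4f}")
```

The empirical probability should land within Monte Carlo error of $e^{-1} \approx 0.3679$; trying other values of $u \le 0$ gives the same agreement with $e^u$.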