I have the following exercise, but I'm a bit lost, I'll share it with you:
Let $(U_{n})_{n\in\mathbb{N}}$ be a sequence of random variables i.i.d. with uniform distribution in the interval $[0, 1]$. We define for each $n\in \mathbb{N}$ the random variables: $$M_{n} = \max\{U_{1} ,...,U_{n}\}$$ $$m_{n} = \min\{U_{1} ,...,U_{n}\}$$
a) Prove that $M_{n} - m_{n}$ has distribution $\beta (n - 1, 2)$
b) Show that $M_{n} - m_{n} \xrightarrow{d} 1$
c) Calculate the limit in distribution of the sequence $(n(1 - M_{n} + m_{n}))_{n \in \mathbb{N}}$. Is the limit a known distribution?
So far I have:
$$P(\min\{U_{1} ,...,U_{n}\}>a)=P(U_{1}>a)P(U_{2}>a)\dotsi P(U_{n}>a)=(1-a)^n$$ $$\rightarrow F_{m_{n}}(a)=1-(1-a)^n \rightarrow f_{m_{n}}(a)=n(1-a)^{n-1}$$
$$P(\max\{U_{1} ,...,U_{n}\}\leq a)=P(U_{1} \leq a)P(U_{2} \leq a)\dotsi P(U_{n} \leq a)=a^n$$ $$\rightarrow F_{M_{n}}(a)=a^n \rightarrow f_{M_{n}}(a)=na^{n-1}$$
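As a quick sanity check of these two formulas, a small Monte Carlo comparison (the choices $n = 5$ and $a = 0.3$ are arbitrary; only Python's standard `random` module is used):

```python
import random

# Monte Carlo check of P(min > a) = (1 - a)^n and P(max <= a) = a^n
# for n i.i.d. Uniform(0, 1) samples; n = 5 and a = 0.3 are arbitrary.
random.seed(0)
n, trials, a = 5, 200_000, 0.3

min_hits = 0
max_hits = 0
for _ in range(trials):
    u = [random.random() for _ in range(n)]
    if min(u) > a:
        min_hits += 1
    if max(u) <= a:
        max_hits += 1

print(min_hits / trials, (1 - a) ** n)  # both close to 0.168
print(max_hits / trials, a ** n)        # both close to 0.0024
```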
I don't know how to continue, could you help me?
First, calculate the joint pdf of $(m_n, M_n)$. For any $0 \leq m \leq M \leq 1$, the event $\{m < m_n,\ M_n \leq M\}$ occurs exactly when $m < U_i \leq M$ for all $i = 1, \ldots, n$. Thus, by independence,
$$ P(m < m_n,\ M_n \leq M) = \prod_{i = 1}^n P(m < U_i \leq M) = (M - m)^n. $$
Also, as you already noted yourself, $P(M_n \leq M) = M^n$. Taking the complement with respect to the first component (i.e. keeping $M_n \leq M$ fixed) gives
$$ P(m_n \leq m,\ M_n \leq M) = P(M_n \leq M) - P(m < m_n,\ M_n \leq M) = M^n - (M - m)^n. $$
Taking the partial derivative with respect to $m$ and then $M$ yields the density of $(m_n, M_n)$ over $0 \leq m \leq M \leq 1$, and it is not hard to see that all other combinations of $(m, M)$ give density $0$. So
$$ f_{m_n, M_n}(m, M) = \begin{cases} n(n - 1)(M - m)^{n - 2}\ &0 \leq m \leq M \leq 1 \\ 0 &\text{elsewhere.} \end{cases} $$

Now apply the transformation $T : (m, M) \mapsto (m, M - m) =: (\mu, \delta)$. It maps the support $$\{(m, M) \in \mathbb{R}^2 : 0 \leq m \leq M \leq 1\}$$ of $(m_n, M_n)$ onto the support $$\{(\mu, \delta) \in \mathbb{R}^2 : 0 \leq \mu \leq 1 - \delta,\ 0 \leq \delta \leq 1\}$$ of $(m_n, M_n - m_n)$. The inverse transformation is $T^{-1} : (\mu, \delta) \mapsto (\mu, \mu + \delta)$, with Jacobian matrix $J(\mu, \delta) = \begin{bmatrix} 1 &0 \\ 1 &1 \end{bmatrix}$ and $|\det J(\mu, \delta)| = 1$. Thus the transformed pdf is
\begin{gather*} f_{m_n, M_n - m_n}(\mu, \delta) = f_{m_n, M_n}(T^{-1}(\mu, \delta))\,|\det J(\mu, \delta)| \\ = \begin{cases} n(n - 1)\delta^{n - 2}\ &0 \leq \mu \leq 1 - \delta,\ 0 \leq \delta \leq 1 \\ 0 &\text{elsewhere.} \end{cases} \end{gather*}

Finally, integrating out $\mu$ gives the marginal density of $M_n - m_n$:
$$ f_{M_n - m_n}(\delta) = \int_0^{1 - \delta} n(n - 1)\delta^{n - 2}\ d\mu = n(n - 1)(1 - \delta)\delta^{n - 2}, \quad \text{for } 0 \leq \delta \leq 1, $$
and $0$ elsewhere. This is precisely the pdf of $\beta(n - 1, 2)$.
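To make the conclusion concrete, here is a small Monte Carlo sketch (using only Python's standard `random` module; $n = 10$ and the sample count are arbitrary choices) comparing the empirical mean of $M_n - m_n$ with the $\beta(n - 1, 2)$ mean $(n-1)/(n+1)$:

```python
import random

# Empirical check that M_n - m_n behaves like Beta(n - 1, 2):
# its mean should be (n - 1)/(n + 1); n = 10 is an arbitrary choice.
random.seed(0)
n, trials = 10, 100_000

total = 0.0
for _ in range(trials):
    u = [random.random() for _ in range(n)]
    total += max(u) - min(u)

emp_mean = total / trials
beta_mean = (n - 1) / (n + 1)  # mean of Beta(n - 1, 2)
print(emp_mean, beta_mean)     # both close to 0.818
```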