I've got a problem with its solution in my introduction to mathematical statistics book, and I just don't see how they arrived at it. There are follow-up questions, so I'd like some insight into how they got to the answer.
The problem: Let $X_1,\ldots,X_n$ be independent random variables with the uniform distribution on the interval $[0,1]$. Determine the expectation and variance of $Y=\max(X_1,\ldots,X_n)$. Hint: Deduce the density of $Y$ from the distribution function $P(Y \leq y)$ of $Y$, which can be determined using the distribution functions of $X_1, \ldots, X_n$.
The solution: $\operatorname E[Y]=\frac{n}{n+1}$, $\operatorname{var}[Y]=\frac{n}{n+2}-\left(\frac{n}{n+1}\right)^2$.
Now I know that a uniformly distributed random variable on $[0,1]$ has the following distribution function, expectation, and variance:
$F(x)=x$ for $x \in [0,1]$, $\operatorname E[X]=\frac{1}{2}$ and $\operatorname{var}[X]=\frac{1}{12}$.
Here is an alternative approach which doesn't require any knowledge of the Beta distribution.
Since
$$Y=\max\{X_1,\ldots,X_n\} \leq y \iff \forall j=1,\ldots,n: X_j \leq y$$
it follows from the independence of the random variables $(X_j)_{j \leq n}$ that
$$\mathbb{P}(Y \leq y) = \mathbb{P} \left( \bigcap_{j=1}^n \{X_j \leq y\} \right) = \prod_{j=1}^n \mathbb{P}(X_j \leq y) = y^n \tag{1}$$
for $y \in (0,1)$. Thus the distribution function $F$ of $Y$ satisfies
$$F(y) = \begin{cases} 0, & y \leq 0, \\ y^n, & y \in (0,1), \\ 1, & y \geq 1. \end{cases}$$
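As a quick sanity check (not part of the derivation), the identity $\mathbb{P}(Y \leq y) = y^n$ can be verified empirically by simulating the maximum of $n$ standard uniforms; the values of $n$, $y$, and the trial count below are arbitrary illustrative choices:

```python
import random

random.seed(0)

n = 5           # number of uniform variables (illustrative choice)
y = 0.8         # point at which to evaluate the CDF
trials = 100_000

# Empirical estimate of P(max(X_1, ..., X_n) <= y)
hits = sum(
    max(random.random() for _ in range(n)) <= y
    for _ in range(trials)
)
empirical = hits / trials
exact = y ** n  # the closed form y^n derived above

print(empirical, exact)  # the two values should be close
```

With $10^5$ trials the empirical frequency typically lands within a few thousandths of $0.8^5 = 0.32768$.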
Approach 1: Differentiating $F$, we find that $Y$ has a density $p$ with respect to Lebesgue measure, $$p(y) = n y^{n-1} 1_{(0,1)}(y).$$ Thus $$\mathbb{E}(Y)= \int y p(y) \, dy = n \int_0^1 y^n \, dy = \frac{n}{n+1}$$ and $$\mathbb{E}(Y^2) = \int y^2 p(y) \, dy = n \int_0^1 y^{n+1} \, dy = \frac{n}{n+2}.$$ Hence $$\begin{align*} \text{var}(Y) = \mathbb{E}(Y^2)-(\mathbb{E}(Y))^2 = \frac{n}{n+2} - \frac{n^2}{(n+1)^2} &= \frac{n(n+1)^2 - n^2 (n+2)}{(n+1)^2 (n+2)} \\ &= \frac{n}{(n+1)^2 (n+2)}. \end{align*}$$
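The two moment integrals against the density can also be checked numerically. Here is a small sketch using a midpoint rule (the choice of $n = 5$ and the number of subintervals are arbitrary):

```python
def midpoint_integral(f, a, b, steps=10_000):
    """Midpoint-rule approximation of the integral of f over [a, b]."""
    h = (b - a) / steps
    return sum(f(a + (k + 0.5) * h) for k in range(steps)) * h

n = 5  # illustrative choice

# Density of Y = max(X_1, ..., X_n): p(y) = n * y^(n-1) on (0, 1)
density = lambda y: n * y ** (n - 1)

mean = midpoint_integral(lambda y: y * density(y), 0.0, 1.0)
second_moment = midpoint_integral(lambda y: y ** 2 * density(y), 0.0, 1.0)

print(mean, n / (n + 1))            # both approximately 5/6
print(second_moment, n / (n + 2))   # both approximately 5/7
```

With $10^4$ subintervals the midpoint rule agrees with the closed forms to well beyond six decimal places here.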
Approach 2: For any non-negative random variable $Z$ it holds that
$$\mathbb{E}(Z) = \int_0^{\infty} \mathbb{P}(Z > z) \, dz = \int_0^{\infty} (1-\mathbb{P}(Z \leq z)) \, dz, \tag{2}$$
see e.g. this question for details. For $Z:=Y$ we obtain from $(1)$ that
$$\mathbb{E}(Y) = \int_0^{1} (1-y^n) \, dy= 1 - \frac{1}{n+1} = \frac{n}{n+1}.$$
Similarly,
$$\mathbb{E}(Y^2) \stackrel{(2)}{=} \int_0^{\infty} (1-\mathbb{P}(Y^2 \leq y)) \, dy = \int_0^{\infty} (1-\mathbb{P}(Y \leq \sqrt{y})) \, dy,$$
and so by $(1)$
$$\mathbb{E}(Y^2) = \int_0^1 (1- y^{n/2}) \, dy = 1- \frac{1}{\frac{n}{2}+1} = \frac{n}{n+2}.$$
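The two tail-formula integrals can likewise be checked numerically; this sketch reuses a midpoint rule with an arbitrary choice of $n = 5$:

```python
def midpoint_integral(f, a, b, steps=10_000):
    """Midpoint-rule approximation of the integral of f over [a, b]."""
    h = (b - a) / steps
    return sum(f(a + (k + 0.5) * h) for k in range(steps)) * h

n = 5  # illustrative choice

# E(Y) via the tail formula: integral of 1 - y^n over [0, 1]
mean = midpoint_integral(lambda y: 1 - y ** n, 0.0, 1.0)

# E(Y^2) via the tail formula: integral of 1 - y^(n/2) over [0, 1]
second_moment = midpoint_integral(lambda y: 1 - y ** (n / 2), 0.0, 1.0)

print(mean, n / (n + 1))            # both approximately 5/6
print(second_moment, n / (n + 2))   # both approximately 5/7
```

Both integrals match the closed forms $\frac{n}{n+1}$ and $\frac{n}{n+2}$ to high accuracy.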
As in the first approach, we thus find
$$\begin{align*} \text{var}(Y) = \mathbb{E}(Y^2)-(\mathbb{E}(Y))^2 = \frac{n}{n+2} - \frac{n^2}{(n+1)^2} = \frac{n}{(n+1)^2 (n+2)}. \end{align*}$$
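The algebraic simplification in the last step can be double-checked exactly with rational arithmetic, avoiding any floating-point error:

```python
from fractions import Fraction

# Verify n/(n+2) - n^2/(n+1)^2 == n/((n+1)^2 (n+2)) exactly for several n
for n in range(1, 21):
    lhs = Fraction(n, n + 2) - Fraction(n * n, (n + 1) ** 2)
    rhs = Fraction(n, (n + 1) ** 2 * (n + 2))
    assert lhs == rhs, n

print("identity holds for n = 1..20")
```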