Let $X_1, X_2, \ldots, X_n$ be independent random variables having the common density function $f(x)$. We have $$f(x) = \begin{cases} 1 & \text{for } 0 < x < 1, \\ 0 & \text{otherwise} \end{cases}$$ I want to find $E[\max(X_1,X_2,\ldots,X_n)]$.
I found this problem in Sheldon Ross's book Introduction to Probability and Statistics for Engineers and Scientists, and I am not able to get started. (I’ve already spent two hours trying.)
Could I please have a suggestion on how to approach the problem?
Hint: Useful facts to know:
(1) If events A and B are independent, then $P(A\cap B)=P(A)P(B)$
(2) $E(Y)=\int_{-\infty}^{\infty}yf_{Y}(y)\,dy$
(3) If $Y$ has pdf $f_{Y}(y)=1$ for $y\in(0,1)$ and $0$ otherwise, then $Y\sim \mathrm{Unif}(0,1)$
(4) Letting $F(y)$ be the CDF and $f_{Y}(y)$ the pdf, $\frac{d}{dy}F(y)=f_{Y}(y)$
With these facts, we see that to find the expected value we need the pdf of $\max(X_{1},\ldots,X_{n})$ (by #2). The easiest way to derive that pdf is through the CDF (by #4), so you want to find
$$F_{\max(X_{1},\ldots,X_{n})}(t)=P(\max(X_{1},\ldots,X_{n})\le t)$$
Well, if the max is at most $t$, what does that say about each $X_{i}$? Once you figure this out, use (#1) and (#3) to compute the CDF, then (#4) to derive the pdf, and finally (#2) to find the expected value.
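Once you have a closed-form answer, a quick Monte Carlo check is a nice sanity test. A minimal sketch (the helper name `max_of_uniforms_mean` is mine, not from the book):

```python
import random

def max_of_uniforms_mean(n, trials=200_000, seed=0):
    """Estimate E[max(X_1, ..., X_n)] for i.i.d. Uniform(0,1) samples."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        # Draw n independent Uniform(0,1) variates and take their maximum.
        total += max(rng.random() for _ in range(n))
    return total / trials

for n in (1, 2, 5, 10):
    print(n, round(max_of_uniforms_mean(n), 3))
```

Compare the printed estimates for a few values of $n$ against whatever formula your derivation produces; they should agree to a couple of decimal places.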