Let $X_k$ ($1 \leqslant k \leqslant n$) be independent random variables with finite means.
I know that $\frac{1}{n}\sum_{k=1}^n E(X_k)=\frac{E(X_1)+ \cdots + E(X_n)}{n} \leqslant E(X_1)+ \cdots + E(X_n)$,
and $\max(X_1, \ldots, X_n) \leqslant X_1+ \cdots + X_n \Longrightarrow E[\max(X_1, \ldots, X_n)] \leqslant E(X_1+ \cdots + X_n) = E(X_1)+ \cdots + E(X_n)$.
How does it follow that $\frac{1}{n}\sum_{k=1}^n E(X_k) \leqslant E[\max(X_1, \ldots, X_n)]$?
$\quad E[\max(X_1, \ldots, X_n)]$
$\quad \geq \max\left(E[X_1],\ldots, E[X_n]\right)$, since $\max(X_1, \ldots, X_n) \geq X_j$ pointwise, so taking expectations gives $E[\max(X_1, \ldots, X_n)] \geq E[X_j]$ for every $j$
$\quad = \frac{1}{n}\left(n \cdot \max\left(E[X_1],\ldots, E[X_n]\right)\right)$
$\quad = \frac{1}{n}\sum\limits_{k=1}^{n}\max\left(E[X_1],\ldots, E[X_n]\right)$
$\quad \geq \frac{1}{n}\sum\limits_{k=1}^{n}E[X_k]$, since $\max\left(E[X_1],\ldots, E[X_n]\right) \geq E[X_k]$, $\forall{k}$
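As a quick numerical sanity check of the resulting inequality $\frac{1}{n}\sum_{k=1}^n E[X_k] \leqslant E[\max(X_1, \ldots, X_n)]$, here is a Monte Carlo sketch in Python/NumPy. The choice of exponential variables with means $1, \ldots, n$ is just an illustrative assumption, not part of the argument:

```python
import numpy as np

rng = np.random.default_rng(0)
n, trials = 5, 200_000

# Independent X_k with finite means: exponentials with means 1, ..., n
# (an arbitrary choice for illustration)
means = np.arange(1.0, n + 1.0)
samples = rng.exponential(means, size=(trials, n))  # shape (trials, n)

# Monte Carlo estimate of (1/n) * sum_k E[X_k]
lhs = samples.mean()

# Monte Carlo estimate of E[max(X_1, ..., X_n)]
rhs = samples.max(axis=1).mean()

print(lhs, rhs)  # lhs should be noticeably smaller than rhs
```

Here `lhs` estimates the average of the means (about $3$) while `rhs` estimates $E[\max]$, which by the derivation above is at least $\max_k E[X_k] = 5$, so the inequality is comfortably satisfied.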