I have $n$ i.i.d. random variables $X_1, \ldots, X_n$ following some arbitrary distribution. Based on experiments in Python with various distributions, it seems that $\mathbb{E}(\max(X_1,\ldots,X_n))$ is a linear (or nearly linear) function of $\mathbb{E}(X_i)$. It is indeed linear in some examples where a closed-form solution, or a good approximation, for $\mathbb{E}(\max(X_1,\ldots,X_n))$ is available:
- Expected value of $\max\{X_1,\ldots,X_n\}$ where $X_i$ are iid uniform.
- Expectation of the maximum of i.i.d. geometric random variables
I wonder if this is the case more generally? Is there some way to prove it?
This question is related to something called order statistics in probability theory. You can read more about them here. For $n$ i.i.d. variables $X_1, \ldots, X_n$ with cumulative distribution function $F$ and density function $f$, the maximum satisfies $P(\max_i X_i \le x) = F(x)^n$ (the maximum is at most $x$ exactly when every $X_i$ is), so differentiating gives its density:
$$f_{max}(x) = nf(x)F(x)^{n-1}$$
Then this implies the expected value would be:
$$E[X_{max}] = \int_{-\infty}^{\infty} nxf(x)F(x)^{n-1} dx$$
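As a sanity check on this formula (a sketch of my own; the uniform case is my choice, not the asker's), for $X_i \sim \mathrm{Uniform}(0,1)$ we have $f(x)=1$ and $F(x)=x$, so the integral reduces to $\int_0^1 n x^n\,dx = n/(n+1)$, which we can verify both by numerical integration and by simulation:

```python
import random

def expected_max_uniform_integral(n, steps=100_000):
    # E[X_max] = \int x * n f(x) F(x)^{n-1} dx; for Uniform(0,1),
    # f(x) = 1 and F(x) = x, so the integrand is n * x^n on [0, 1].
    # Midpoint-rule numerical integration:
    h = 1.0 / steps
    return sum(n * ((i + 0.5) * h) ** n * h for i in range(steps))

def expected_max_uniform_mc(n, trials=200_000, seed=0):
    # Monte Carlo estimate of E[max(X_1, ..., X_n)] for Uniform(0,1) draws.
    rng = random.Random(seed)
    return sum(max(rng.random() for _ in range(n))
               for _ in range(trials)) / trials

# Closed form for Uniform(0,1): E[X_max] = n / (n + 1)
```

For $n = 5$ both estimates land close to the closed-form value $5/6 \approx 0.833$.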
From this formula I don't see any linear relationship, in general, between $E[X_{max}]$ and $E[X]$.
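As a concrete illustration (my own example, not from the question): for $X_i \sim \mathrm{Bernoulli}(p)$ we have $E[X_i] = p$, while the maximum is $1$ unless all $n$ draws are $0$, so $E[\max] = 1 - (1-p)^n$, which is clearly nonlinear in $p$:

```python
def expected_max_bernoulli(n, p):
    # max of n iid Bernoulli(p) is 1 unless all n draws are 0,
    # so E[max] = 1 - (1 - p)^n.
    return 1 - (1 - p) ** n

# With n = 3: doubling E[X] = p from 0.1 to 0.2 takes E[max]
# from 0.271 to 0.488 -- less than double, so not linear in p.
```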