Question: Fix a real number $\beta \geq 1$ and an integer $n \geq 1$. Let $X_1, \cdots, X_n$ be independent samples from Beta$(\beta,\beta)$. What is $$f(n,\beta) := \mathbb{E}\left[ \max\{X_1,\cdots,X_n\} \right]~~~?$$
First, some basic facts: clearly $1/2 = \mathbb{E}[X_1] \leq f(n,\beta) \leq 1$. Also $f(n,1)=n/(n+1)$, since Beta$(1,1)$ is the uniform distribution. Moreover, $f(n,\beta)$ is increasing in $n$ and decreasing in $\beta$, with $\lim_{n\to\infty}f(n,\beta)=1$ for all $\beta$ and $\lim_{\beta\to\infty}f(n,\beta)=1/2$ for all $n$.
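(For anyone who wants to experiment numerically: here is a small Monte Carlo sketch using Python's standard-library `random.betavariate`. The function name `f_mc` and the trial count are my own choices, and the estimate of course carries sampling noise.)

```python
import random

def f_mc(n, beta, trials=200_000, seed=0):
    """Monte Carlo estimate of E[max of n i.i.d. Beta(beta, beta) samples]."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        # max of n independent Beta(beta, beta) draws
        total += max(rng.betavariate(beta, beta) for _ in range(n))
    return total / trials

# Sanity check against the closed form f(n, 1) = n/(n+1):
print(f_mc(5, 1.0))  # should be close to 5/6
```

This is handy for eyeballing how tight a candidate bound is in a given $(n,\beta)$ regime.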
I suspect that $f(n,\beta)$ does not have a clean closed form. I would be happy with reasonably tight bounds on it.
I'm particularly interested in the parameter regime where increasing $n$ and increasing $\beta$ "cancel out": I want $f(n,\beta)=1/2+\alpha$ to be held fixed, for some constant $\alpha \in (0,1/4)$. Given $n$, what value of $\beta$ achieves this? I believe $\beta \approx \log(n)/\alpha^2$ is the correct answer.
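For what it's worth, inverting the Jensen upper bound below supports this scaling in one direction: setting $\sqrt{\log n/(4\beta+1)} = \alpha$ and solving for $\beta$ gives $$\beta = \frac{1}{4}\left(\frac{\log n}{\alpha^2} - 1\right),$$ so $\beta$ of order $\log(n)/\alpha^2$ suffices to keep $f(n,\beta) \leq 1/2+\alpha$. A matching lower bound would be needed to show that this order of $\beta$ is also necessary.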
If anyone knows how to approach this problem to get clean bounds, I would greatly appreciate their suggestions.
Here is an upper bound that I do have. By Jensen's inequality and the union-type bound $e^{t\max_i x_i} \leq \sum_i e^{t x_i}$ (valid for $t>0$), $$e^{t\mathbb{E}\left[\max\{ X_1, \cdots, X_n \}-1/2\right]} \leq \mathbb{E}\left[ e^{t \cdot \left(\max\{ X_1, \cdots, X_n \}-1/2\right)} \right] \leq \sum_{i=1}^n \mathbb{E}\left[ e^{t(X_i-1/2)} \right].$$ Since the beta distribution is subgaussian, $\mathbb{E}\left[ e^{t(X_i-1/2)} \right] \leq e^{t^2/(16\beta+4)}$. Thus $$f(n,\beta) = \mathbb{E}[\max\{ X_1, \cdots, X_n \}] \leq \frac12 + \inf_{t>0} ~ \frac{1}{t} \log \left( n \cdot e^{t^2/(16\beta+4)} \right) = \frac12 + \sqrt{\frac{\log n}{4 \beta + 1}}.$$ I believe this upper bound is fairly tight in the parameter regime of interest. (Although, if $\beta+1/4\leq \log n$, the bound exceeds $1$ and is obviously not tight.) I would like to see a matching lower bound.
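For completeness, the infimum above is elementary to evaluate: for $t>0$, $$\frac{1}{t} \log \left( n \cdot e^{t^2/(16\beta+4)} \right) = \frac{\log n}{t} + \frac{t}{16\beta+4},$$ which is minimized at $t^* = \sqrt{(16\beta+4)\log n}$, where both terms equal $\sqrt{\log n/(16\beta+4)}$; the minimum value is therefore $2\sqrt{\log n/(16\beta+4)} = \sqrt{\log n/(4\beta+1)}$, as claimed.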