Let $\alpha_1,\dots,\alpha_k > 0$, and let $1\leq s\leq k$ be an integer. Suppose $S\subseteq[k]$ is a random subset chosen uniformly among all subsets of $[k]$ of size $s$. Is anything known about the quantity $$ \mathbb{E}_S \frac{\max_{i\in S} \alpha_i}{\sum_{i\in S}\alpha_i} $$ as a function of $s$ and, say, the various $\ell_p$ norms of the vector $\alpha$?
Equivalently, if one renormalizes $\alpha$ to get a probability distribution $p$ over $[k]=\{1,2,\dots,k\}$ and denotes by $p_S$ the conditional distribution induced by $p$ on $S$, this is asking about $\mathbb{E}_S \lVert p_S\rVert_\infty$.
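For concreteness, the quantity can be evaluated by brute-force enumeration over all size-$s$ subsets (a quick Python sketch; the function name `expected_ratio` is my own):

```python
# Brute-force evaluation of E_S[ max_{i in S} alpha_i / sum_{i in S} alpha_i ]
# over all size-s subsets (a sanity-check sketch, not an efficient method).
from itertools import combinations

def expected_ratio(alpha, s):
    """Average of max(alpha_S)/sum(alpha_S) over all subsets S of size s."""
    subsets = list(combinations(alpha, s))
    return sum(max(sub) / sum(sub) for sub in subsets) / len(subsets)

# The ratio is scale-invariant, so alpha need not be normalized:
print(expected_ratio([1, 1, 2, 4], 2))
```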
I have tried to analyze this, but the denominator makes every attempt either go haywire or seem way too contrived to continue. I feel it should either be known or have an elegant solution, however.
A positive answer (depending, of course, on the exact form of the answer) could greatly simplify a proof I am working on.
Assuming (without loss of generality) that $\sum_{i=1}^k \alpha_i=1$, you can define $V= Y/X$ with $Y = \max_{i \in S} \alpha_i$ and $X = \sum_{i=1}^k \alpha_i I_i$, where $I_i$ is the indicator variable that is $1$ if $i \in S$ and $0$ otherwise. Then $1\geq V\geq 1/s$ always (the maximum of $s$ numbers is at least their average), and by Jensen's inequality for the concave function $\log(\cdot)$, applied once to $V$ and once to $X$, together with $E[I_i]=s/k$: \begin{align}\log(E[V]) &\geq E[\log(V)] \\ &= E[\log(Y)] - E\left[\log\left(\sum_{i=1}^k \alpha_i I_i\right)\right]\\ &\geq E[\log(Y)] - \log\left(\sum_{i=1}^k \alpha_i E[I_i]\right) \\ &= E[\log(Y)] - \log(s/k) \end{align} So $$1\geq E[V] \geq \max\left\{(k/s)\exp(E[\log(Y)]),\; 1/s\right\}$$
For a simple example with $k=4, s=2$: $$ \{\alpha_i\} = \{1/8, 1/8, 1/4, 1/2\}$$ $$E[\log(Y)] = (1/6)\log(1/8) + (2/6)\log(1/4) + (3/6)\log(1/2) $$ So $$ E[V]\geq \max\left\{(k/s)\exp(E[\log(Y)]), \underbrace{1/s}_{0.5}\right\} = 2^{-2/3} \approx 0.62996$$ But the exact answer is \begin{align} E[V] &= \frac{1}{6}\left[\frac{1/8}{1/8+1/8} + \frac{1/4}{1/8+1/4} + \frac{1/2}{1/8+1/2}\right] \\ &\quad +\frac{1}{6}\left[ \frac{1/4}{1/8+1/4}+\frac{1/2}{1/8+1/2}+\frac{1/2}{1/4+1/2}\right]\\ &=41/60 \\ &\approx 0.68333 \end{align}
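The bound and the exact value above can be checked numerically (a verification sketch in Python, not part of the argument):

```python
# Numerical check of the Jensen lower bound against the exact value for the
# example alpha = (1/8, 1/8, 1/4, 1/2), k = 4, s = 2.
from itertools import combinations
from math import exp, log

alpha, k, s = [1/8, 1/8, 1/4, 1/2], 4, 2
subsets = list(combinations(alpha, s))

exact = sum(max(S) / sum(S) for S in subsets) / len(subsets)   # 41/60
E_logY = sum(log(max(S)) for S in subsets) / len(subsets)
bound = max((k / s) * exp(E_logY), 1 / s)                      # 2**(-2/3)

print(bound, exact)  # approx 0.62996 <= 0.68333
```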
Assuming $\alpha_1 \leq \alpha_2 \leq \dots \leq \alpha_k$, the maximum equals $\alpha_i$ exactly when $i$ is the largest index in $S$, which happens for $\binom{i-1}{s-1}$ of the $\binom{k}{s}$ subsets, so $$ E[\log(Y)]= \frac{1}{\binom{k}{s}}\sum_{i=s}^k \binom{i-1}{s-1}\log(\alpha_i) $$ and so $$ \exp(E[\log(Y)]) = \prod_{i=s}^k \alpha_i^{\binom{i-1}{s-1}/\binom{k}{s}}$$
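A quick sanity check of this closed form against direct enumeration (Python sketch):

```python
# Check the closed form
#   E[log Y] = (1 / C(k,s)) * sum_{i=s..k} C(i-1, s-1) * log(alpha_i)
# (alpha sorted increasingly) against brute-force enumeration.
from itertools import combinations
from math import comb, log

alpha = sorted([1/8, 1/8, 1/4, 1/2])
k, s = len(alpha), 2

brute = sum(log(max(S)) for S in combinations(alpha, s)) / comb(k, s)
closed = sum(comb(i - 1, s - 1) * log(alpha[i - 1])
             for i in range(s, k + 1)) / comb(k, s)
print(abs(brute - closed) < 1e-12)  # True
```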
Other simple bounds: assume $\alpha_1\leq \alpha_2\leq\dots\leq \alpha_k$ and define $z = \alpha_1 + \dots + \alpha_{s-1}$. Conditioned on the maximum being $\alpha_i$, the remaining $s-1$ selected entries lie among $\alpha_1,\dots,\alpha_{i-1}$, so the denominator is at most $\alpha_{i-s+1}+\dots+\alpha_i$ and at least $\alpha_i+z$. Then: $$ \frac{1}{\binom{k}{s}}\sum_{i=s}^k \binom{i-1}{s-1}\left(\frac{\alpha_i}{\sum_{j=i-s+1}^i \alpha_j}\right) \leq E[V] \leq \frac{1}{\binom{k}{s}}\sum_{i=s}^k \binom{i-1}{s-1}\left(\frac{\alpha_i}{\alpha_i+z}\right) $$ For the example $\{1/8, 1/8, 1/4, 1/2\}$ these bounds give $$ \frac{23}{36} \approx 0.6389 \leq E[V] \leq \frac{127}{180} \approx 0.7056 $$
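And a numerical check of the sandwich bounds on the same example (Python sketch; variable names are my own):

```python
# Check of the sandwich bounds (alpha sorted increasingly,
# z = alpha_1 + ... + alpha_{s-1}) against the exact value.
from itertools import combinations
from math import comb

alpha = sorted([1/8, 1/8, 1/4, 1/2])
k, s = len(alpha), 2
z = sum(alpha[:s - 1])

exact = sum(max(S) / sum(S) for S in combinations(alpha, s)) / comb(k, s)
# Lower bound: denominator uses the s-1 largest entries below alpha_i.
lower = sum(comb(i - 1, s - 1) * alpha[i - 1] / sum(alpha[i - s:i])
            for i in range(s, k + 1)) / comb(k, s)
# Upper bound: denominator uses the s-1 smallest entries overall.
upper = sum(comb(i - 1, s - 1) * alpha[i - 1] / (alpha[i - 1] + z)
            for i in range(s, k + 1)) / comb(k, s)

assert lower <= exact <= upper
print(lower, exact, upper)  # 23/36, 41/60, 127/180
```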