The Gaussian width of a set $T\subset \mathbb{R}^n$ is defined as $$ G(T) = E\left[\sup_{\theta \in T} \sum_{i=1}^n \theta_i W_i\right], $$ where $\mathbf{W}=(W_1,\ldots,W_n)$ is a vector of i.i.d. $N(0,1)$ random variables. I am interested in finding the value of $G(T)$ for $$ T(s) \equiv \{\theta\in\mathbb{R}^n: {\|\theta\|}_0 \leq s,\ {\|\theta\|}_2\leq 1\}, $$ the set of all $s$-sparse vectors within the unit ball, with $s\in\{1,\ldots,n\}$. This is an exercise in Wainwright's book on high-dimensional statistics. I have been able to show
$$ G(T(s)) = E\max_{|S|=s} {\|\mathbf{W}_S\|}_2, $$ where the maximum is over subsets $S\subseteq\{1,\ldots,n\}$ of cardinality $|S| = s$, and the subscript $S$ denotes the subvector of $\mathbf{W}$ with components indexed by $S$.
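As a sanity check on this reduction, here is a small numerical verification (a Python sketch; the brute-force loop over all $\binom{n}{s}$ supports is only feasible for tiny $n$). The supremum over $s$-sparse unit vectors is attained by putting all mass on the $s$ coordinates of largest magnitude:

```python
import itertools
import numpy as np

rng = np.random.default_rng(0)
n, s = 8, 3
w = rng.standard_normal(n)

# For a fixed support S, the sup of <theta, w> over unit vectors supported
# on S is ||w_S||_2 (attained at theta = w_S / ||w_S||_2).  Brute-force the
# max over all s-subsets:
brute = max(np.linalg.norm(w[list(S)])
            for S in itertools.combinations(range(n), s))

# Closed form: the maximizing S collects the s largest |w_i|.
closed = float(np.sqrt(np.sort(w**2)[-s:].sum()))

assert np.isclose(brute, closed)
```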
Then, using the Gaussian concentration inequality for Lipschitz functions (together with $E{\|\mathbf{W}_S\|}_2 \leq \sqrt{E{\|\mathbf{W}_S\|}_2^2} = \sqrt{s}$) and the union bound, I can get $$ P\left(\max_{|S|=s}{\|\mathbf{W}_S\|}_2 \geq \sqrt{s} + t\right)\leq \binom{n}{s} \exp\{-t^2/2\},\quad \text{for all $t>0$.} $$ I can also use the bound $$ \binom{n}{s}\leq {\left(ne/s\right)}^s,\quad \text{for all $s=1,\ldots,n$.} $$ Finally, I need to integrate to obtain a bound on the expectation. I am unable to do this in a way that yields the desired upper bound (of the order) $$ K\sqrt{s\log(en/s)},\quad \text{where $K$ is a universal constant.} $$ Any ideas would be helpful!
After some effort I obtained the following. Please comment.
Let $M = \max_{|S|=s}{\|\mathbf{W}_S\|}_2$ (I write $M$ rather than $T$ to avoid a clash with the set $T(s)$). Then $P(M>\sqrt{s}+t)\leq {(ne/s)}^s \exp\{-t^2/2\}$ for all $t>0$. For some $a>\sqrt{s}$ (to be chosen later), \begin{eqnarray*} E(M) & = & \int_0^a P(M\geq t)~dt + \int_a^\infty P(M\geq t)~dt \\ & \leq & a + {(ne/s)}^s \int_a^\infty \exp\{-{(t-\sqrt{s})}^2/2\}~dt \\ & = & a + {(ne/s)}^s \sqrt{2\pi}\cdot P(Z > a-\sqrt{s}),\ \text{where $Z\sim N(0,1)$,}\\ & \leq & a + {(ne/s)}^s \sqrt{2\pi}\cdot \exp\{-{(a-\sqrt{s})}^2/2\},\ \text{using the sub-Gaussian tail bound $P(Z>x)\leq e^{-x^2/2}$,}\\ & = & a + \exp\left\{s\log{(en/s)}+\log{\sqrt{2\pi}}-\frac{1}{2}{(a-\sqrt{s})}^2\right\}. \end{eqnarray*} Rather than trying to balance the two terms exactly (equating them requires solving $\log a = s\log(en/s)+\log\sqrt{2\pi}-\frac{1}{2}(a-\sqrt{s})^2$, which gets messy), it is enough to choose $a$ so that the exponent vanishes: $$ a = \sqrt{s} + \sqrt{2s\log{(en/s)}+2\log{\sqrt{2\pi}}}. $$ With this choice the second term equals $1$, so $$ E(M) \leq \sqrt{s} + \sqrt{2s\log{(en/s)}+2\log{\sqrt{2\pi}}} + 1 \leq K\sqrt{s\log{(en/s)}} $$ for an absolute constant $K$, since $\log(en/s)\geq 1$ gives both $\sqrt{s}\leq \sqrt{s\log(en/s)}$ and $1\leq \sqrt{s\log(en/s)}$. This proves the claim.
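To see numerically that the rate is right, one can estimate $E\max_{|S|=s}{\|\mathbf{W}_S\|}_2$ by Monte Carlo and compare it with $\sqrt{s\log(en/s)}$; the ratio should stay bounded by a modest constant across $s$. A Python sketch (the values of $n$, the replication count, and the grid of $s$ are arbitrary choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
n, reps = 200, 2000

# max_{|S|=s} ||W_S||_2 is the l2 norm of the s largest |W_i|,
# so one sort per sample handles every s at once.
sq = np.sort(rng.standard_normal((reps, n)) ** 2, axis=1)

ratios = {}
for s in (1, 5, 20, 100):
    est = np.mean(np.sqrt(sq[:, -s:].sum(axis=1)))  # Monte Carlo E[M]
    rate = np.sqrt(s * np.log(np.e * n / s))        # target rate
    ratios[s] = est / rate
    print(f"s={s:3d}  E[M]~{est:6.2f}  ratio={ratios[s]:.2f}")
```

In my runs the ratio hovers around $1$ for every $s$, consistent with the claimed order.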
Earlier I had used the sharper Mills-ratio tail bound instead of the sub-Gaussian bound, and ended up with a very complicated expression.