What is the expected $L^\infty$ norm of a random affine subspace?


I am faced with the following problem. Consider $v$ random in $\mathbb{R}^n$, say $v \sim \mathcal{N}(0,I_n)$. Then I know that with high probability (for $n$ large), $||v||_\infty = O(\sqrt{\log n})$, since $||v||_\infty$ is the maximum of the absolute values of $n$ standard Gaussians. Now consider picking an orthonormal basis $B = (\frac{v}{||v||}, e_2, \dots, e_n)$ of $\mathbb{R}^n$ and choosing $C \subseteq B$ by placing each of the vectors $e_2, \dots, e_n$ independently with probability $1/2$ in $C$. What is the expected value $$ \mathbb{E} \min_{w \in \operatorname{span}(C)} || v + w ||_\infty \,?$$ I would be interested in showing that this is $O(\sqrt{n})$, but I'm not certain this holds. Any ideas?
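Not an answer, but one way to probe the conjecture numerically: for a fixed realization of $v$ and $C$, the quantity $\min_{w \in \text{span}(C)} ||v+w||_\infty$ is a Chebyshev-norm projection, which can be solved exactly as a small linear program (minimize $t$ subject to $-t \le (v + Ax)_i \le t$, where the columns of $A$ are the chosen basis vectors). Below is a sketch of one sample, assuming NumPy/SciPy; the basis $e_2, \dots, e_n$ is not specified in the problem, so here it is completed arbitrarily via a QR decomposition, which is just one valid choice. The function name `sample_min_inf_norm` is my own.

```python
import numpy as np
from scipy.linalg import qr
from scipy.optimize import linprog

def sample_min_inf_norm(n, rng):
    """Draw v ~ N(0, I_n), a random subset C of a completed orthonormal
    basis, and return min over w in span(C) of ||v + w||_inf (via an LP)."""
    v = rng.standard_normal(n)
    # Complete v/||v|| to an orthonormal basis by QR of a random matrix
    # whose first column is v/||v|| (one arbitrary choice of e_2..e_n).
    M = np.column_stack([v / np.linalg.norm(v), rng.standard_normal((n, n - 1))])
    Q, _ = qr(M)  # columns of Q: +-v/||v||, then an orthonormal completion
    # Keep each of e_2..e_n independently with probability 1/2.
    mask = rng.random(n - 1) < 0.5
    A = Q[:, 1:][:, mask]
    k = A.shape[1]
    if k == 0:
        return np.abs(v).max()  # C is empty, so w = 0 is forced
    # LP in variables (x in R^k, t >= 0): minimize t
    # subject to  A x - t 1 <= -v  and  -A x - t 1 <= v,
    # i.e. |v + A x|_i <= t for every coordinate i.
    c = np.concatenate([np.zeros(k), [1.0]])
    ones = np.ones((n, 1))
    A_ub = np.vstack([np.hstack([A, -ones]), np.hstack([-A, -ones])])
    b_ub = np.concatenate([-v, v])
    res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(None, None)] * k + [(0, None)])
    return res.fun

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n = 64
    samples = [sample_min_inf_norm(n, rng) for _ in range(20)]
    print(f"n={n}: empirical mean = {np.mean(samples):.4f}")
```

Averaging this over many samples for growing $n$ would at least suggest the right growth rate, though of course it proves nothing.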