Upper bound for $\frac{\|x\|_1}{\|x\|_2}$ if each entry of $x\in R^d$ is i.i.d. sampled from Gaussian distribution $N(0,1)$


In the question, $\|x\|_1=\sum_{i=1}^d|x_i|$ with $|\cdot|$ being the absolute value, and $\|x\|_2=\sqrt{\sum_{i=1}^d x_i^2}$.

In general, $\frac{\|x\|_1}{\|x\|_2}\leq \sqrt{d}$ always holds for arbitrary $x\in R^d$.

However, if each entry of $x\in R^d$ is independently and identically sampled from the Gaussian distribution $N(0,1)$, then I am interested in whether we can get a tighter upper bound for $\frac{\|x\|_1}{\|x\|_2}$, smaller than $\sqrt{d}$, that holds with a certain probability, e.g., $P[\frac{\|x\|_1}{\|x\|_2}\leq t]\geq 0.9$ with $t<\sqrt{d}$.
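A quick Monte Carlo sketch can suggest what such a $t$ looks like (the choices of `d` and the sample count below are arbitrary, and this is an empirical estimate, not a proof):

```python
import numpy as np

# Estimate a t with P[||x||_1 / ||x||_2 <= t] >= 0.9 by taking the
# empirical 0.9-quantile of the ratio over many Gaussian samples.
# d and n_samples are arbitrary illustrative choices.
rng = np.random.default_rng(42)
d, n_samples = 1000, 5000
x = rng.standard_normal((n_samples, d))          # rows are i.i.d. N(0,1) vectors
ratios = np.abs(x).sum(axis=1) / np.sqrt((x ** 2).sum(axis=1))
t = np.quantile(ratios, 0.9)
print(t, np.sqrt(d))  # empirically t is well below sqrt(d) ~ 31.6
```

In such experiments the empirical quantile lands near $\sqrt{2/\pi}\,\sqrt{d}\approx 0.8\sqrt{d}$, noticeably below the worst-case bound $\sqrt{d}$.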

Hope the link helps. https://en.wikipedia.org/wiki/Concentration_inequality


This is really about bounding the $L_1$ norm of a Gaussian vector. Indeed, the $L_2$ norm is $\Theta(\sqrt{d})$ with probability $1-o(1)$. The problem thus reduces to concentration of the $L_1$ norm of a Gaussian random vector, for which there are numerous sources showing that $\|x\|_1 = \Theta(d)$ with probability $1-o(1)$; in fact, since $E|x_i|=\sqrt{2/\pi}$ for $x_i\sim N(0,1)$, the law of large numbers gives $\|x\|_1 \approx \sqrt{2/\pi}\,d$. From this, you can conclude that $\dfrac{\|x\|_1}{\|x\|_2} = \dfrac{\Theta(d)}{\Theta(\sqrt{d})}=\Theta(\sqrt{d})$ with probability $1-o(1)$.
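A short numerical check of this argument (a sketch with an arbitrary choice of `d`, not part of the proof): the ratio, rescaled by $\sqrt{d}$, should concentrate near the constant $\sqrt{2/\pi}\approx 0.798$.

```python
import numpy as np

# Check that ||x||_1 / ||x||_2 concentrates near sqrt(2/pi) * sqrt(d):
# ||x||_1 ~ sqrt(2/pi) * d and ||x||_2 ~ sqrt(d) for large d.
# d and the number of trials are arbitrary illustrative choices.
rng = np.random.default_rng(0)
d = 10_000
x = rng.standard_normal((200, d))                # 200 independent trials
ratios = np.abs(x).sum(axis=1) / np.sqrt((x ** 2).sum(axis=1))
print(ratios.mean() / np.sqrt(d), np.sqrt(2 / np.pi))  # both ~ 0.798
```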

In other words, the generic bound is tight, up to a constant factor, for Gaussian random vectors!

N.B.: A good reference on such matters is Vershynin's book High-dimensional Probability (pdf version available online).