We know from basic linear algebra that $\forall x \neq 0, \frac{||x||_2}{||x||_{\infty}} \leq \sqrt{n}$ (where $n$ is the dimension). We also know that equality occurs if and only if all coordinates have the same absolute value.
When, on the contrary, all coordinates are $0$ except one, then $\frac{||x||_2}{||x||_{\infty}} = 1$.
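Both extreme cases can be checked numerically; here is a small Python sketch (the helper `ratio` is just for illustration):

```python
import math

def ratio(x):
    """Compute ||x||_2 / ||x||_inf for a non-zero vector x."""
    return math.sqrt(sum(v * v for v in x)) / max(abs(v) for v in x)

n = 5
print(ratio([3.0] * n))                 # all coordinates equal: sqrt(n)
print(ratio([0.0] * (n - 1) + [7.0]))   # a single non-zero coordinate: 1
```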
It appears that the more distant the coordinates are, the smaller this ratio.
I am looking for an (in)equality linking $\frac{||x||_2}{||x||_{\infty}}$ with $\sigma(x)$ the standard deviation of the $x_i$, or another measure of how distant the coordinates are.
"It appears that the more distant the coordinates are, the smaller this ratio."
That is not really correct: consider the almost identical case where all coordinates are equal and non-zero except one, which is zero. You will get $\frac{||x||_2}{||x||_{\infty}} = \sqrt{n-1}$, close to the maximum even though the coordinates are spread out.
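This counterexample is easy to verify directly (a quick Python check):

```python
import math

n = 5
x = [1.0] * (n - 1) + [0.0]   # all coordinates equal except one, which is zero
r = math.sqrt(sum(v * v for v in x)) / max(abs(v) for v in x)
print(r, math.sqrt(n - 1))    # the ratio matches sqrt(n - 1)
```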
In terms of connection with standard deviation, let $\mu=\frac{\sum\limits_ix_i}{n}$; then $\sigma^2=\frac{||x||_2^2}{n}-\mu^2$, i.e. $||x||_2^2=n(\sigma^2+\mu^2)$. Dividing by $||x||_{\infty}^2$, you get for example:
$\frac{||x||_2^2}{||x||_{\infty}^2}=n\left(\frac{\sigma^2}{||x||_{\infty}^2}+\frac{\mu^2}{||x||_{\infty}^2}\right)\geq n\frac{\sigma^2}{||x||_{\infty}^2}$,
since $\mu^2\geq 0$, with equality exactly when $\mu=0$. (Note also that $\mu^2\leq \frac{||x||_2^2}{n}$ (https://en.wikipedia.org/wiki/Generalized_mean), which is consistent with $\sigma^2\geq 0$ in the identity above.) Taking square roots:
$\frac{||x||_2}{||x||_{\infty}}\geq \sqrt{n}\,\frac{\sigma}{||x||_{\infty}}$
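The identity $\sigma^2=\frac{||x||_2^2}{n}-\mu^2$ and the resulting lower bound can be sanity-checked on random vectors; a small Python sketch using the standard library's population variance:

```python
import math
import random
import statistics

random.seed(0)
for _ in range(1000):
    n = random.randint(2, 20)
    x = [random.uniform(-10, 10) for _ in range(n)]
    norm2_sq = sum(v * v for v in x)
    norm_inf_sq = max(abs(v) for v in x) ** 2
    mu = sum(x) / n
    var = statistics.pvariance(x)  # population variance: sum((xi - mu)^2) / n
    # identity: sigma^2 = ||x||_2^2 / n - mu^2
    assert math.isclose(var, norm2_sq / n - mu * mu, rel_tol=1e-9, abs_tol=1e-9)
    # bound: ||x||_2^2 / ||x||_inf^2 >= n * sigma^2 / ||x||_inf^2
    assert norm2_sq / norm_inf_sq >= n * var / norm_inf_sq - 1e-12
print("all checks passed")
```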