Show that $\frac{1}{\sqrt{n}}\lVert x\rVert_1\le\lVert x\rVert_2\le \lVert x\rVert_1$


$\frac{1}{\sqrt{n}}\lVert x\rVert_1\le\lVert x\rVert_2\le \lVert x\rVert_1$

My idea is to show first that $\lVert \vec x\rVert_\infty \le \lVert \vec x\rVert_2 \le \sqrt{n}\lVert \vec x\rVert_\infty$, and then I could use $\lVert \vec x\rVert_\infty \le \lVert \vec x\rVert_1 \le n\lVert \vec x\rVert_\infty$.

Or maybe, in order to use the Cauchy-Schwarz inequality, do I need to work with a particular vector $\vec x \in \Bbb R^n$? I'm very lost here and don't know where to start. Any help on how to begin this proof would be appreciated. Thanks!
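Before proving the inequalities, a quick numerical sanity check can build confidence that both bounds are stated correctly. This is a sketch in plain Python (not part of any proof), testing random vectors of varying dimension:

```python
import math
import random

def norm1(x):
    return sum(abs(t) for t in x)

def norm2(x):
    return math.sqrt(sum(t * t for t in x))

# Check (1/sqrt(n)) * ||x||_1 <= ||x||_2 <= ||x||_1 on random vectors.
random.seed(0)
for _ in range(1000):
    n = random.randint(1, 10)
    x = [random.uniform(-5.0, 5.0) for _ in range(n)]
    assert norm1(x) / math.sqrt(n) <= norm2(x) + 1e-12
    assert norm2(x) <= norm1(x) + 1e-12
```

The `1e-12` slack only guards against floating-point rounding; the inequalities themselves are exact.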


There are 2 answers below.


You could show that $\|x\|_\infty\leqslant \|x\|_2\leqslant \sqrt{n}\|x\|_\infty$ as a first step, but then $\|x\|_\infty\leqslant \|x\|_1\leqslant n\|x\|_\infty$ doesn't get you to the conclusion you want, because you'll get at least one factor of $n$, which isn't the factor of $\sqrt{n}$ that you want.

Cauchy-Schwarz (or Hölder) would be good for one direction. Use $$\sum_{i=1}^n |x_i|= \sum_{i=1}^n 1\cdot |x_i|.$$
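Spelling out that hint: applying Cauchy-Schwarz to the all-ones vector $(1,\dots,1)$ and $(|x_1|,\dots,|x_n|)$ gives $$\|x\|_1=\sum_{i=1}^n 1\cdot |x_i|\leqslant \left(\sum_{i=1}^n 1^2\right)^{1/2}\left(\sum_{i=1}^n |x_i|^2\right)^{1/2}=\sqrt{n}\,\|x\|_2,$$ which is exactly the left-hand inequality $\frac{1}{\sqrt{n}}\|x\|_1\leqslant \|x\|_2$.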

For the other direction, by homogeneity, it is sufficient to show that if $\|x\|_1=1$, then $\|x\|_2\leqslant 1$. But if $\|x\|_1\leqslant 1$, then $|x_i|\leqslant 1$ for all $i$, so we can say something about how big $|x_i|^2$ is compared to $|x_i|$, and then sum over $i$.
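Concretely: if $\|x\|_1=1$, then each $|x_i|\leqslant 1$, so $|x_i|^2\leqslant |x_i|$, and summing over $i$ gives $$\|x\|_2^2=\sum_{i=1}^n |x_i|^2\leqslant \sum_{i=1}^n |x_i|=\|x\|_1=1,$$ i.e. $\|x\|_2\leqslant 1$, as required.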

EDIT: The paragraphs below should be considered completely optional.

If you could show that $$\|x\|_\infty\leqslant \|x\|_2\leqslant \sqrt{n}\|x\|_\infty,$$ then you could actually use that $$\|x\|_1=\sup_{\|y\|_\infty=1}\sum_{i=1}^n |x_iy_i|$$ and $$\|x\|_2=\sup_{\|y\|_2=1}\sum_{i=1}^n |x_iy_i|.$$ Essentially what's happening here is we're showing that if $I:\ell_2^n \to \ell_\infty^n$ is an isomorphism with $\|I\|\leqslant 1$ and $\|I^{-1}\|\leqslant \sqrt{n}$, then its adjoint $I^*:\ell_1^n\to \ell_2^n$, which goes between the dual spaces $\ell_1^n=(\ell_\infty^n)^*$ and $\ell_2^n=(\ell_2^n)^*$, is also an isomorphism which satisfies $\|I^*\|=\|I\|\leqslant 1$ and $\|(I^*)^{-1}\|=\|(I^{-1})^*\|=\|I^{-1}\|\leqslant \sqrt{n}$.

Just to reword the last paragraph into terms from linear algebra: Let's note that $$\|x\|_2=\max\{ x\cdot y:\|y\|_2\leqslant 1\},$$ which follows from combining Cauchy-Schwarz (to get $\leqslant$) with taking $y=x/\|x\|_2$ (to get $\geqslant $), with the convention that $0/0=0$. Later you'll see this relationship summarized by saying that the dual norm to $\|\cdot\|_2$ is $\|\cdot\|_2$.

It is also true (and easier to check without needing something like Cauchy-Schwarz) that $$\|x\|_1=\max\{x\cdot y:\|y\|_\infty\leqslant 1\}$$ and $$\|x\|_\infty=\max\{x\cdot y:\|y\|_1\leqslant 1\}.$$
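One can sanity-check the first of these formulas numerically: on the cube $\|y\|_\infty\leqslant 1$, the linear function $y\mapsto x\cdot y$ attains its maximum at a vertex, namely $y_i=\operatorname{sign}(x_i)$. A brute-force sketch over all vertices, with an arbitrarily chosen test vector $x$:

```python
import itertools

def norm1(x):
    return sum(abs(t) for t in x)

# Maximize x·y over the vertices {-1, 1}^n of the cube ||y||_inf <= 1;
# by linearity the max over the whole cube is attained at a vertex.
x = [3.0, -1.5, 2.0, 0.5]
best = max(
    sum(xi * yi for xi, yi in zip(x, y))
    for y in itertools.product([-1.0, 1.0], repeat=len(x))
)
assert abs(best - norm1(x)) < 1e-12  # maximum equals ||x||_1 = 7.0
```

The maximizing vertex here is $y=(1,-1,1,1)$, i.e. $y_i=\operatorname{sign}(x_i)$, matching the claim $\|x\|_1=\max\{x\cdot y:\|y\|_\infty\leqslant 1\}$.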

If we know that $\|y\|_2\leqslant \sqrt{n}\|y\|_\infty$, then we know that $$\{y/\sqrt{n}:\|y\|_\infty\leqslant 1\}\subset \{y:\|y\|_2\leqslant 1\},$$ since if $\|y\|_\infty\leqslant 1$, $$\|y/\sqrt{n}\|_2 =\frac{\|y\|_2}{\sqrt{n}}\leqslant \frac{\sqrt{n}\|y\|_\infty}{\sqrt{n}}=\|y\|_\infty\leqslant 1.$$ Then for any $x$, \begin{align*} \|x\|_1 & =\max\{x\cdot y:\|y\|_\infty\leqslant 1\}=\sqrt{n}\max\{x\cdot (y/\sqrt{n}):\|y\|_\infty\leqslant 1\} \\ &\leqslant \sqrt{n}\max\{x\cdot y:\|y\|_2\leqslant 1\}=\sqrt{n}\|x\|_2.\end{align*}

We can play a similar game (without the factor of $\sqrt{n}$) to get that $\|x\|_2\leqslant \|x\|_1$ from $\|y\|_\infty\leqslant \|y\|_2$.


If you want to be ad hoc about it, consider the set $B(\lVert \cdot \rVert_1)$ of points where $\lVert x \rVert_1 \le 1$, and find the maximum value of $\lVert \cdot \rVert_2$ on $B(\lVert \cdot \rVert_1)$ (if it isn't clear why we can reduce to $\lVert x \rVert_1 \le 1$, you can define $R = \lVert x \rVert_1$ and work with that). I claim that the maximum occurs where all but one of the components are zero.

Otherwise, suppose that $x_1, x_2$ are non-zero. If we can show that $x_1^2 + x_2^2$ can be made bigger while keeping $|x_1| + |x_2|$ the same, then we've kept $\lVert x \rVert_1$ the same and made $\lVert x \rVert_2$ bigger. You can do this by geometry or Lagrange multipliers, or just by analyzing algebraically what happens if you set $y_1 = |x_1| + |x_2|$ and $y_2 = 0$.
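For the algebraic option, the computation is short. Writing $a = |x_1|$ and $b = |x_2|$, both positive, the replacement $y_1 = a + b$, $y_2 = 0$ gives $$y_1^2 + y_2^2 = (a+b)^2 = a^2 + b^2 + 2ab > a^2 + b^2 = x_1^2 + x_2^2,$$ while $|y_1| + |y_2| = a + b = |x_1| + |x_2|$, so the 1-norm is unchanged and the 2-norm strictly increases.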

Having done that, we know that the maximum value of $\lVert \cdot \rVert_2$ on $B(\lVert \cdot \rVert_1)$ is $1 = \lVert (1,0,0, \dots, 0) \rVert_2$.

Then you can do a similar analysis, maximizing the value of $\lVert \cdot \rVert_1$ subject to $\lVert x \rVert_2 \le 1$, and this time show that the maximum occurs where $|x_1| = |x_2| = \dots = |x_n| = \frac1{\sqrt n}$. Here you might want to show that we can assume that $x_i \ge 0$. The maximum value is then $\frac1{\sqrt n} + \dots + \frac1{\sqrt n} = \frac{n}{\sqrt n} = \sqrt{n}$, which gives $\lVert x \rVert_1 \le \sqrt{n}\lVert x \rVert_2$.

Again, assume that $x_1 \ne x_2$ and show that you can replace $x_1, x_2$ by a common value $y$ satisfying $x_1^2 + x_2^2 = y^2 + y^2$ (so that the 2-norm is kept the same) and then show that the 1-norm is bigger. You can also try geometry and Lagrange multipliers here.
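The equalization step is just the QM-AM inequality: $2y = \sqrt{2(x_1^2+x_2^2)} \geqslant |x_1|+|x_2|$. A quick numerical sketch confirming that replacing two coordinates by their quadratic mean never decreases the 1-norm:

```python
import math
import random

# Replace x1, x2 by a common value y with x1^2 + x2^2 = y^2 + y^2
# (so the 2-norm is preserved) and check the 1-norm does not decrease.
random.seed(1)
for _ in range(1000):
    x1 = random.uniform(-3.0, 3.0)
    x2 = random.uniform(-3.0, 3.0)
    y = math.sqrt((x1 * x1 + x2 * x2) / 2.0)
    assert 2.0 * y >= abs(x1) + abs(x2) - 1e-12  # QM-AM inequality
```

Equality holds exactly when $|x_1| = |x_2|$, which is why the maximizer has all coordinates equal in absolute value.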

Note that all the intuition here comes from the $n = 2$ case, e.g. the fact that $x^2 + y^2$ is maximal on the set $|x| + |y| = 1$ when $x = 0$ or $y = 0$. The minimum occurs when $|x| = |y|$, but you can also phrase that as maximizing $|x| + |y|$ while keeping $x^2 + y^2 = 1$.