Prove $\|x_1 + \cdots + x_m\|_p \leq m^{\max\{0, 1/p - 1\}}(\|x_1\|_p + \cdots + \|x_m\|_p)$ for $p \in (0, \infty), x_i \in \mathbb{C}^n$.


This is a fact presented in some of the compressed sensing literature, where the $\ell^p$ quasinorm $(p \in (0,1])$ is of particular interest for sparse signal reconstruction. To address the problem, I suspect the following previously established results will be needed or useful:

(1). For $0 < p < 1$, $\|x_1 + x_2\|_p^p \leq \|x_1\|_p^p + \|x_2\|_p^p$, $x_1, x_2 \in \mathbb{C}^n$.

(2). For $0 < q < p \leq \infty$, $\|x\|_p \leq \|x\|_q \leq n^{1/q - 1/p}\|x\|_p$, $x \in \mathbb{C}^n$.
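Both facts can be sanity-checked numerically. The following is a quick sketch in Python (using NumPy, with a small tolerance added for floating-point error; the helper `lp` is my own naming, not from any library):

```python
import numpy as np

rng = np.random.default_rng(0)

def lp(x, p):
    """The l^p quasinorm of a complex vector (a genuine norm only for p >= 1)."""
    return float((np.abs(x) ** p).sum() ** (1.0 / p))

n = 5
x1 = rng.standard_normal(n) + 1j * rng.standard_normal(n)
x2 = rng.standard_normal(n) + 1j * rng.standard_normal(n)

# Fact (1): for 0 < p < 1, ||x1 + x2||_p^p <= ||x1||_p^p + ||x2||_p^p
p = 0.5
assert lp(x1 + x2, p) ** p <= lp(x1, p) ** p + lp(x2, p) ** p + 1e-12

# Fact (2): for 0 < q < p, ||x||_p <= ||x||_q <= n^(1/q - 1/p) ||x||_p
q, p = 0.3, 0.7
assert lp(x1, p) <= lp(x1, q) <= n ** (1 / q - 1 / p) * lp(x1, p) + 1e-12
```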

First off, for $p \geq 1$, $\max\{0, 1/p - 1\} = 0$, and the result reduces to the standard triangle inequality of a norm. For the more interesting part, suppose $p \in (0,1)$. Applying fact (2) with $q := p$ and $p := 1$ (valid since $0 < p < 1$) gives

$$\|x\|_p \leq n^{1/p-1}\|x\|_1$$

which looks encouraging. I see another useful fact in that, for an appropriate choice of vector, the sum $\sum_{i=1}^m \|x_i\|_p^p$ can be regarded as an $\ell^1$ norm. For example, if $m = n$, $e = (1, 1, \ldots, 1) \in \mathbb{C}^n$, and the $e_i$ are the standard basis vectors, then $\|e\|_1 = \sum_{i=1}^n \|e_i\|_p^p$. Of course, this example can be extended beyond just $m = n$.

For example, if I suppose $m < n$ and choose $x_1 = e \in \mathbb{C}^m$, then I get $\|x_1\|_p \leq m^{1/p - 1}\|x_1\|_1$, which looks suspiciously close to the result I want. Furthermore, I could now consider $x_2 = (x_1, 0, \ldots, 0) \in \mathbb{C}^n$. If I choose $y_i = (e_i, 0, \ldots, 0) \in \mathbb{C}^n$, where $e_i$ is still the $i^{\text{th}}$ standard basis vector in $\mathbb{C}^m$, then

$$\|y_1 + \cdots + y_m\|_p \leq m^{1/p-1}\|y_1 + \cdots + y_m\|_1$$

which looks even closer to the result I want given $y_i \in \mathbb{C}^n$.

Somehow I just can't figure out how these facts piece together to give the final result. If anyone has any hints or solutions it would be great to see how it all comes together!

Accepted answer:

This follows from the subadditivity in (1) together with an application of Hölder's inequality with exponent $1/p$. Indeed, if $0 < p < 1$, then repeatedly applying (1) gives $$ \lVert x_1 + \dots + x_m \rVert_p^p \leq \lVert x_1 \rVert_p^p + \dots + \lVert x_m \rVert_p^p. $$ Applying Hölder's inequality to the right-hand side with the conjugate exponents $1/p$ and $1/(1-p)$, we get $$ \sum_{i=1}^m\lVert x_i \rVert_p^p \leq \left(\sum_{i=1}^m\left(\lVert x_i \rVert_p^p\right)^{\frac1p} \right)^p \left(\sum_{i=1}^m 1^{\frac1{1-p}}\right)^{1-p} = m^{1-p} \left(\sum_{i=1}^m\lVert x_i \rVert_p \right)^p, $$ so raising both sides to the power $1/p$ gives $$ \lVert x_1 + \dots + x_m \rVert_p \leq m^{\frac1p -1} \left( \lVert x_1 \rVert_p + \dots + \lVert x_m \rVert_p \right),$$ as required.