Prove that the $L_{2,1}$ norm of a matrix is the sum of the $L_2$ norms of its column vectors.


Definition: The norm $\Vert\cdot\Vert_{2,1}$ of a matrix $A \in \mathbb{R}^{m \times n}$ is defined by $$\Vert A \Vert_{2,1} = \max\left\{\Vert Ax \Vert_1 : x \in \mathbb{R}^n,\ \Vert x \Vert_2 = 1\right\}$$

Problem: Given a matrix $A \in \mathbb{R}^{m \times n}$, prove that $\Vert \cdot \Vert_{2,1}$ can also be calculated as $$\Vert A \Vert_{2,1} = \sum_{i=1}^{n}\Vert A_i\Vert_2,$$ where the $A_i$ are the column vectors of $A$.

My attempt: Let $x\in \mathbb{R}^n$ be such that $\Vert x \Vert_2=1$. First, we have $$Ax = \begin{bmatrix}\langle R_1,x\rangle\\ \vdots\\ \langle R_m,x\rangle \end{bmatrix},$$ where the $R_i$ are the row vectors of $A$. Hence $$\Vert Ax \Vert_1 = \vert \langle R_1,x\rangle \vert + \ldots + \vert \langle R_m,x\rangle\vert.$$ Applying the Cauchy–Schwarz inequality to each term, we get $$\Vert Ax \Vert_1 \le \Vert R_1\Vert_2\Vert x \Vert_2 + \ldots + \Vert R_m\Vert_2 \Vert x \Vert_2 = \Vert R_1\Vert_2 + \ldots + \Vert R_m \Vert_2.$$ So the $(2,1)$-norm seems to be bounded by the sum of the 2-norms of the row vectors, not the column vectors. Can anyone show where my proof goes wrong, or give a hint on how to prove the claim?
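The row-norm bound derived above can be sanity-checked numerically. The snippet below (a sketch using NumPy and a randomly generated matrix, neither of which appears in the original post) samples unit vectors $x$ and verifies $\Vert Ax \Vert_1 \le \sum_i \Vert R_i \Vert_2$:

```python
import numpy as np

# Hypothetical example matrix: random 4x3, not taken from the post.
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 3))

# Sum of the 2-norms of the rows R_1, ..., R_m.
row_norm_sum = sum(np.linalg.norm(A[i]) for i in range(A.shape[0]))

# Sample unit vectors x and check ||Ax||_1 <= sum_i ||R_i||_2.
for _ in range(1000):
    x = rng.standard_normal(3)
    x /= np.linalg.norm(x)
    assert np.linalg.norm(A @ x, 1) <= row_norm_sum + 1e-9

print("bound holds on all samples")
```

This only confirms the inequality on random samples; it says nothing about whether the bound is attained, which is exactly the gap in the attempted proof.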


I think the claim is wrong:

Let $A=I_n$ be the identity matrix, so that $Ax = x$. The maximum in $\max \{\|x\|_1 : \ \|x\|_2=1\}$ is attained at $x=(\frac1{\sqrt n}, \dots, \frac1{\sqrt n})$, which gives $\|A\|_{2,1}=\|x\|_1=\sqrt n$.

But $\sum_{i=1}^n \|A_i\|_2=n$. (Here it does not matter whether we take the norms of the columns or of the rows of $A$.)
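The counterexample can also be checked numerically. Since $\|Ax\|_1 = \max_{s\in\{\pm1\}^m} s^\top Ax$, maximizing over unit $x$ gives $\max_{\|x\|_2=1}\|Ax\|_1 = \max_{s\in\{\pm1\}^m}\|A^\top s\|_2$, so the induced norm can be computed exactly by enumerating sign vectors. The sketch below (using NumPy; the function names are my own, not from the answer) compares both quantities for $A = I_3$:

```python
import itertools
import math
import numpy as np

def induced_2_1_norm(A):
    # max_{||x||_2=1} ||Ax||_1 = max over sign vectors s in {-1,1}^m
    # of ||A^T s||_2, since ||Ax||_1 = max_s s^T (Ax) and the maximum
    # of (A^T s)^T x over unit x is ||A^T s||_2.
    m = A.shape[0]
    return max(np.linalg.norm(A.T @ np.array(s))
               for s in itertools.product([-1.0, 1.0], repeat=m))

def sum_of_column_norms(A):
    return sum(np.linalg.norm(A[:, j]) for j in range(A.shape[1]))

A = np.eye(3)
print(induced_2_1_norm(A))     # sqrt(3): the induced (2,1) norm
print(sum_of_column_norms(A))  # 3.0: the sum of column 2-norms
```

For $n=3$ the two values are $\sqrt 3 \approx 1.732$ and $3$, confirming that the induced-norm definition and the sum-of-column-norms formula disagree.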