Is there an intuitive meaning for the spectral norm of a matrix? Why would an algorithm calculate the relative recovery in spectral norm between two images (i.e. one before the algorithm and the other after)? Thanks
Meaning of the spectral norm of a matrix
Let us consider the singular value decomposition (SVD) of a matrix $X = U S V^T$, where $U$ and $V$ are matrices containing the left and right singular vectors of $X$ in their columns, and $S$ is a diagonal matrix containing the singular values. An intuitive way to think of the norm of $X$ is in terms of the norm of the vector of singular values on the diagonal of $S$. This is because the singular values measure the energy of the matrix in its various principal directions.
One can now extend the $p$-norm for a finite-dimensional vector to an $m \times n$ matrix by applying it to this singular value vector:
\begin{align} \|X\|_p &= \left( \sum_{i=1}^{\text{min}(m,n)} \sigma_i^p \right)^{1/p} \end{align}
This is called the Schatten norm of $X$. Specific choices of $p$ yield commonly used matrix norms:
- $p=0$ (as a limiting convention): Gives the rank of the matrix (the number of non-zero singular values).
- $p=1$: Gives the nuclear norm (the sum of the singular values, which are nonnegative). This is the tightest convex relaxation of the rank on the unit ball of the spectral norm.
- $p=2$: Gives the Frobenius norm (square root of the sum of squares of singular values).
- $p=\infty$: Gives the spectral norm (max. singular value).
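These special cases can be checked numerically. The sketch below (using a small example matrix of my own choosing) computes the Schatten $p$-norm directly from the singular values and compares it against NumPy's built-in nuclear, Frobenius, and spectral norms:

```python
import numpy as np

# A small example matrix (any real m x n matrix works).
X = np.array([[3.0, 1.0, 0.0],
              [1.0, 2.0, 1.0]])

# Singular values of X.
s = np.linalg.svd(X, compute_uv=False)

def schatten(X, p):
    """Schatten p-norm: the vector p-norm of the singular values."""
    sv = np.linalg.svd(X, compute_uv=False)
    return np.sum(sv**p) ** (1.0 / p)

# p = 1 matches the nuclear norm, p = 2 the Frobenius norm,
# and the largest singular value matches the spectral norm.
print(schatten(X, 1), np.linalg.norm(X, 'nuc'))
print(schatten(X, 2), np.linalg.norm(X, 'fro'))
print(s.max(),        np.linalg.norm(X, 2))
```

Each printed pair should agree to floating-point precision.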
The spectral norm (also known as the induced 2-norm) is the maximum singular value of a matrix. Intuitively, you can think of it as the maximum 'scale' by which the matrix can 'stretch' a vector.
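The 'maximum stretch' interpretation can be illustrated by sampling unit vectors: no unit vector is stretched by more than the spectral norm, and the best sampled direction approaches it. A minimal sketch, with a randomly generated matrix assumed as the example:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((4, 3))

# Spectral norm = largest singular value.
spec = np.linalg.norm(X, 2)

# Sample many random unit vectors and measure how much X stretches each.
vs = rng.standard_normal((3, 10000))
vs /= np.linalg.norm(vs, axis=0)          # normalize columns to unit length
stretch = np.linalg.norm(X @ vs, axis=0)  # ||X v|| for each unit v

# ||X v|| <= ||X||_2 for every unit vector v, with equality attained
# by the top right singular vector.
print(spec, stretch.max())
```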
The maximum singular value of $X$ is the square root of the maximum eigenvalue of $X^T X$; if $X$ is symmetric/Hermitian, it equals the maximum absolute value of the eigenvalues of $X$.
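This eigenvalue relation can be verified directly (again with randomly generated matrices as assumed examples):

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.standard_normal((4, 3))

# General case: max singular value = sqrt of the max eigenvalue of X^T X.
smax = np.linalg.svd(X, compute_uv=False).max()
lmax = np.linalg.eigvalsh(X.T @ X).max()
print(smax, np.sqrt(lmax))

# Symmetric case: max singular value = max absolute eigenvalue of A itself.
A = rng.standard_normal((4, 4))
A = (A + A.T) / 2                          # symmetrize
sA = np.linalg.svd(A, compute_uv=False).max()
lA = np.abs(np.linalg.eigvalsh(A)).max()
print(sA, lA)
```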