SVD decomposition of a matrix


I have a question regarding the singular value decomposition of a matrix that I have not been able to answer: why are the matrices in the SVD of a matrix rank-one matrices?



> Why are the matrices in the SVD decomposition of a matrix rank-one matrices?

I think you mean to ask why the SVD expresses a matrix as a sum of rank-one matrices. That is, if $A = U \Sigma V^{T}$, then

$$ A = \sum_{i=1}^{n} \sigma_{i} u_{i} v_{i}^{T}$$

You can view $u_{i} v_{i}^{T}$ as the matrix generated by the outer product of the vectors $u_{i}$ and $v_{i}$, which are the left and right singular vectors respectively. An outer product of two non-zero vectors always has rank one, since every column is a multiple of $u_{i}$. The singular value $\sigma_{i}$ then acts as a scaling factor along these new axes.
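The rank-one expansion above can be checked numerically. The following is a minimal sketch using NumPy (the matrix here is an arbitrary random example): each term $\sigma_i u_i v_i^T$ is an outer product and hence has rank one, and summing all the terms reconstructs $A$.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 3))  # arbitrary example matrix

# Thin SVD: U is 4x3, s holds the singular values, Vt is 3x3.
U, s, Vt = np.linalg.svd(A, full_matrices=False)

# Each term sigma_i * u_i v_i^T is an outer product, hence rank one.
terms = [s[i] * np.outer(U[:, i], Vt[i, :]) for i in range(len(s))]
assert all(np.linalg.matrix_rank(t) == 1 for t in terms)

# Summing the rank-one terms reconstructs A.
assert np.allclose(sum(terms), A)
```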


Further to @copper.hat's comment, the matrices appearing in the SVD are not necessarily rank 1. In fact, the rank of the matrix can be anywhere between $1$ and $n$. Recall the definition of the singular value decomposition:

A matrix $A \in \mathbb{C}^{m \times n}$ has a singular value decomposition $A=U\Sigma V^*$, where $U\in \mathbb{C}^{m \times m}, V \in \mathbb{C}^{n \times n}$ are unitary and $\Sigma = \text{diag}(\sigma_1, \dots , \sigma_p)\in \mathbb{R}^{m \times n}, p = \min (m,n),$ where $\sigma_1 \ge \sigma_2 \ge \dots \ge \sigma_p \ge 0$.

Since the matrix $\Sigma$ is defined by its leading diagonal, the rank of $A$ (and $\Sigma$) is equal to the number of non-zero $\sigma_i$ values ($1\le i \le p$).
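This rank characterisation is easy to test numerically. The sketch below (an illustrative example, not from the answer) constructs a $4 \times 3$ matrix of rank 2 by multiplying a $4 \times 2$ and a $2 \times 3$ factor, then counts the singular values above a small numerical tolerance.

```python
import numpy as np

rng = np.random.default_rng(1)
# Product of a 4x2 and a 2x3 matrix has rank at most 2 (and
# generically exactly 2 for random factors).
A = rng.standard_normal((4, 2)) @ rng.standard_normal((2, 3))

# Singular values only; no need for U and V here.
s = np.linalg.svd(A, compute_uv=False)

# Count singular values above a tolerance scaled by the largest one,
# since floating point never gives exact zeros.
tol = max(A.shape) * np.finfo(A.dtype).eps * s[0]
numerical_rank = int(np.sum(s > tol))

assert numerical_rank == 2
assert numerical_rank == np.linalg.matrix_rank(A)
```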

The extreme case is when $A$ is a square matrix with $\text{rank}(A)=n$. If, in addition, $A$ is a normal matrix ($A^* A=AA^*$), we recover a special case of Schur's theorem.

Schur's Theorem is:

Let $A\in \mathbb{C}^{n \times n}$. Then there exists a unitary matrix $U$ and an upper triangular matrix $T$ such that $T = U^* A U$.

The fact that $A$ is normal in this case means that it is diagonalisable and $T$ is diagonal with values equal to the eigenvalues of $A$.
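As a numerical illustration of this normal case (my own example, using a real symmetric matrix, which is automatically normal): the eigenvalues of $A$ are real, and the singular values of $A$ are exactly their absolute values.

```python
import numpy as np

rng = np.random.default_rng(2)
B = rng.standard_normal((4, 4))
A = B + B.T  # symmetric, hence normal: A* A = A A*

# Eigenvalues of a symmetric matrix are real.
eigvals = np.linalg.eigvalsh(A)
# Singular values are returned in descending order.
svals = np.linalg.svd(A, compute_uv=False)

# For a normal matrix, singular values = |eigenvalues|.
assert np.allclose(svals, np.sort(np.abs(eigvals))[::-1])
```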

In summary, the SVD can be seen as a generalisation of a similarity transformation to rectangular matrices. It is possibly the most important decomposition of a matrix, as it reveals the key characteristics of a matrix: its singular values and its left and right singular vectors. If you are interested in this further, I'd recommend Gilbert Strang's description of the SVD and its power in the new MIT course Learning from Data - lecture available here: https://www.youtube.com/watch?v=rYz83XPxiZo