Does this matrix decomposition have a generalization to arbitrary dimensions?


Suppose $A$ is a square $n\times n$ matrix with eigenvalues $\lambda_1,\dotsc,\lambda_r$ and corresponding eigenspaces $\chi_1,\dotsc,\chi_r$. Let $P_j$ be the projection with range $\chi_j$ and kernel $\operatorname{span}(\chi_1\cup\dotsb\cup\chi_{j-1}\cup\chi_{j+1}\cup\dotsb\cup\chi_r)$. It seems like we can represent $A$ by \begin{align} A=\lambda_1P_1+\cdots+\lambda_r P_r \end{align} and that this is a very intuitive representation of $A$ in terms of its eigenvalues and eigenspaces.

Does this representation have a generalization to arbitrary $m\times n$ matrices? I am asking because I have always thought about the spectral theorem for symmetric matrices in terms of the decomposition above, where the $P_j$ would be orthogonal projections. Now I am learning about the singular value decomposition as a generalization of that spectral theorem, and I am wondering whether there is a similar interpretation involving projections onto and along eigenspaces (no explanation of the SVD I have come across has mentioned projections of any kind).
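As a concrete sanity check, here is a small numerical sketch of the decomposition above. The matrix is a made-up example with distinct eigenvalues, so it is diagonalizable and each eigenspace is one-dimensional; the $P_j$ are built as $v_j w_j^T$, where the $w_j$ are the rows of the inverse eigenvector matrix (the dual basis).

```python
import numpy as np

# Hypothetical example: a diagonalizable matrix with distinct eigenvalues,
# so each eigenspace is one-dimensional.
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

lam, V = np.linalg.eig(A)   # columns of V are eigenvectors
W = np.linalg.inv(V)        # rows of W are the dual basis

# P_j = v_j w_j^T projects onto span{v_j} along the other eigenvectors.
P = [np.outer(V[:, j], W[j, :]) for j in range(len(lam))]

# The P_j are idempotent, annihilate each other, and reassemble A.
assert np.allclose(P[0] @ P[0], P[0])   # idempotent
assert np.allclose(P[0] @ P[1], 0)      # complementary ranges/kernels
assert np.allclose(sum(l * Pj for l, Pj in zip(lam, P)), A)
```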

Best Answer

As written, this is incorrect: your desired formula holds if and only if $A$ is diagonalizable, i.e. if and only if the eigenspaces $\chi_1, \dotsc, \chi_r$ together span all of $\mathbb{C}^n$. In general the behavior of $A$ is more complicated; see Jordan normal form for the details.
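A minimal numerical illustration of the failure, using a hypothetical $2\times 2$ Jordan block: its only eigenvalue is $1$, the eigenspace is just $\operatorname{span}\{e_1\}$, and the sum $\lambda P$ cannot reproduce the matrix.

```python
import numpy as np

# A 2x2 Jordan block: eigenvalue 1 with a one-dimensional eigenspace.
J = np.array([[1.0, 1.0],
              [0.0, 1.0]])

# The only eigenspace is span{e1}; the projection onto it is:
P = np.array([[1.0, 0.0],
              [0.0, 0.0]])

# The candidate decomposition lam * P misses the nilpotent part of J.
assert not np.allclose(1.0 * P, J)
```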

The analogue of this decomposition for an $m \times n$ matrix is the following. Let $M$ be an $m \times n$ real or complex matrix, with singular values $\sigma_i$ and left and right singular vectors $u_i, v_i$. In particular, the $u_i$ form an orthonormal basis, the $v_i$ also form an orthonormal basis, and

$$M v_i = \sigma_i u_i.$$

Then we can write

$$M = \sum \sigma_i u_i v_i^{\ast}$$

where $v_i^{\ast}$ is the linear functional $\langle v_i, - \rangle$. If $M$ is square and Hermitian, then $u_i = \pm v_i$ (the sign being that of the corresponding eigenvalue), so we recover the decomposition into orthogonal projections given by the spectral theorem, as expected.
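The rank-one sum $M = \sum_i \sigma_i u_i v_i^{\ast}$ can be checked numerically. A sketch with a made-up rectangular matrix, using the reduced SVD so the index runs over $\min(m, n)$ terms:

```python
import numpy as np

rng = np.random.default_rng(0)
M = rng.standard_normal((3, 2))   # arbitrary rectangular example

# Reduced SVD: U is 3x2, s has 2 entries, Vh is 2x2.
U, s, Vh = np.linalg.svd(M, full_matrices=False)

# Rebuild M as a sum of rank-one terms sigma_i * u_i * v_i^*.
M_rebuilt = sum(s[i] * np.outer(U[:, i], Vh[i, :]) for i in range(len(s)))

assert np.allclose(M_rebuilt, M)
assert np.allclose(U.T @ U, np.eye(2))    # left singular vectors orthonormal
assert np.allclose(Vh @ Vh.T, np.eye(2))  # right singular vectors orthonormal
```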