Derivation of why a diagonalizable matrix can be written as a sum of outer products $\Sigma=\sum_{i=1}^n \lambda_i v_i v_i^T$


Let's say we have a symmetric positive semi-definite $n\times n$ matrix $\Sigma$, which therefore has a diagonalization $\Sigma=V\Lambda V^T$, where $V$ is an orthogonal matrix (whose columns are the eigenvectors $v_1, \ldots, v_n$ of $\Sigma$) and $\Lambda$ is a diagonal matrix (whose diagonal entries are the eigenvalues $\lambda_1, \ldots, \lambda_n$ of $\Sigma$).

Could you show me how the following sum of outer products is derived?

$$\Sigma=\sum_{i=1}^n \lambda_iv_iv_i^T$$

Best answer:

It helps to have internalized the following ways of looking at matrix multiplication:

  • $Ax = a_1 x_1 + \cdots + a_n x_n$, where $a_i$ is the $i$th column of the matrix $A$ and $x_i$ is the $i$th component of the vector $x$.
  • $AB = A \begin{bmatrix} b_1 & \cdots & b_n \end{bmatrix} = \begin{bmatrix} A b_1 & \cdots & A b_n \end{bmatrix}$, where $b_i$ is the $i$th column of the matrix $B$.
  • $\begin{bmatrix} a_1 & \cdots & a_n \end{bmatrix} \begin{bmatrix} b_1^T \\ \vdots \\ b_n^T \end{bmatrix} = a_1 b_1^T + \cdots + a_n b_n^T$. Here the vector $a_i$ is the $i$th column of $A$ and the row vector $b_i^T$ is the $i$th row of $B$.

You can use rules 2 and 1 to compute $V \Lambda = \begin{bmatrix} \lambda_1 v_1 & \cdots & \lambda_n v_n \end{bmatrix}$ (the $i$th column of $\Lambda$ has a single nonzero entry, $\lambda_i$, so by rule 1 the $i$th column of $V\Lambda$ is $\lambda_i v_i$). Then rule 3 applied to $(V\Lambda) V^T$ gives $\sum_{i=1}^n \lambda_i v_i v_i^T$, since the $i$th row of $V^T$ is $v_i^T$.
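The identity is easy to check numerically. Here is a minimal sketch using NumPy (variable names and the random test matrix are ours): we build a symmetric PSD matrix, diagonalize it with `np.linalg.eigh`, and compare it against the sum of rank-one outer products $\lambda_i v_i v_i^T$.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
Sigma = A @ A.T                      # symmetric PSD by construction

# eigh returns eigenvalues and an orthogonal matrix whose columns
# are the corresponding eigenvectors: Sigma = V @ diag(eigvals) @ V.T
eigvals, V = np.linalg.eigh(Sigma)

# Sum of rank-one outer products lambda_i * v_i v_i^T
reconstruction = sum(lam * np.outer(v, v) for lam, v in zip(eigvals, V.T))

assert np.allclose(Sigma, reconstruction)
```

Note that iterating over `V.T` yields the columns of $V$, i.e. the eigenvectors $v_i$.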

Alternatively, you can think of it like this. The matrix $v_i v_i^T$ projects a vector $x$ onto the span of $\{ v_i \}$. Since $V$ is orthogonal, the $v_i$ form an orthonormal basis, so any $x$ is the sum of its projections: \begin{equation} x = \sum_{i=1}^n v_i v_i^T x. \end{equation} Applying $\Sigma$ and using $\Sigma v_i = \lambda_i v_i$, \begin{align} \Sigma x &= \sum_{i=1}^n \Sigma v_i v_i^T x \\ &= \sum_{i=1}^n \lambda_i v_i v_i^T x. \end{align} Since this holds for every $x$, the matrices agree: $\Sigma = \sum_{i=1}^n \lambda_i v_i v_i^T$.
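The projection argument above can also be checked directly. A small sketch (again with an arbitrary random PSD matrix of our choosing): we verify that $x$ equals the sum of its projections onto the orthonormal eigenvectors, and that applying $\Sigma$ scales each projection by its eigenvalue.

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3))
Sigma = A @ A.T                      # symmetric PSD test matrix
eigvals, V = np.linalg.eigh(Sigma)

x = rng.standard_normal(3)

# Orthonormal eigenvectors: x is the sum of its projections v_i v_i^T x
projections = sum(np.outer(v, v) @ x for v in V.T)
assert np.allclose(x, projections)

# Applying Sigma scales each projection by the corresponding eigenvalue
Sigma_x = sum(lam * np.outer(v, v) @ x for lam, v in zip(eigvals, V.T))
assert np.allclose(Sigma @ x, Sigma_x)
```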