Are there many ways to decompose a matrix, ${\bf{A}} \in \mathbb{R}^{n \times n}$, into a sum of rank-$1$ matrices?
I ask because I know that if a matrix is real symmetric, and hence orthogonally diagonalizable, $\bf{A}=\Phi\Lambda\Phi^{T}$, then you can use the eigendecomposition to form a sum of rank-$1$ matrices:
$$\bf{A} = \sum_{i=1}^{n}\lambda_i \phi_i\phi_i^T\,.$$
It is also true that you can always use the singular value decomposition (SVD) to do this, as well:
$$\bf{A} = \sum_{i=1}^n \sigma_i \bf{u}_i\bf{v}_i^T\,.$$
So, unless these two decompositions happen to coincide, which they do not in general, I have already found two ways. Are there infinitely many? What is an example of another decomposition of this form?
Yes, there are infinitely many such decompositions as a sum of rank-$1$ matrices.
It suffices to use any factorisation of $A$ as a product of two matrices:
$$A=MN=\sum_{i=1}^n C_i L_i \tag{1}$$
where $C_i$ is the $i$th column of matrix $M$ and $L_i$ is the $i$th row of matrix $N$.
Why are there infinitely many ways to write (1)? It suffices to take any invertible matrix $M$ and to set $N:=M^{-1}A$.
Remark: in order to understand (1), let us consider the $2 \times 2$ case:
$$\begin{pmatrix}a&b\\c&d\end{pmatrix}\begin{pmatrix}e&f\\g&h\end{pmatrix}=\begin{pmatrix}a\\ c\end{pmatrix}\begin{pmatrix}e&f\end{pmatrix}+\begin{pmatrix}b\\d\end{pmatrix}\begin{pmatrix}g&h\end{pmatrix}$$
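This is easy to verify numerically. Here is a small NumPy sketch (the matrix size and the random choice of $M$ are arbitrary assumptions for illustration): it picks a random invertible $M$, sets $N = M^{-1}A$, and rebuilds $A$ as the sum of the $n$ outer products of the columns of $M$ with the rows of $N$.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
A = rng.standard_normal((n, n))

# Any invertible M works; a random Gaussian matrix is invertible
# with probability 1. Then set N = M^{-1} A, so that A = M N.
M = rng.standard_normal((n, n))
N = np.linalg.solve(M, A)

# Rebuild A as in (1): the sum over i of (column i of M) (row i of N),
# each term being a rank-1 outer product.
A_rebuilt = sum(np.outer(M[:, i], N[i, :]) for i in range(n))

print(np.allclose(A, A_rebuilt))  # True (up to floating-point error)
```

Since each fresh random $M$ gives a different set of rank-$1$ terms summing to the same $A$, running this with different seeds exhibits as many distinct decompositions as you like.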