Given an $N\times N$ matrix $A$, I came across the following formula in the literature, stated without any further requirements:
$$A= \sum_{i}\lambda_i R^T_i L_i $$ where the $\lambda_i$ are the eigenvalues and $R_i$ and $L_i$ are the right and left eigenvectors associated with $\lambda_i$.
I'm confused about this formula. First, I never saw it in my linear algebra course. Second, it seems ambiguous: what about a non-diagonalizable $A$, i.e. one with fewer independent eigenvectors than eigenvalues (counted with multiplicity)? And when some $\lambda_i$ is degenerate, how do we pair $R_i$ with $L_i$?
So where can I find a complete statement of this formula? What is the name of this decomposition?
As stated, the formula makes no sense (at least for an arbitrary matrix), but it strongly resembles the Singular Value Decomposition.
There is another oddity in the formula you wrote: it seems to treat vectors as row matrices. This is really odd, because it then forces you to write the usual "matrix times vector" as $Ax^T$.
The Singular Value Decomposition is $A=UDV$, where $U,V$ are unitaries (orthogonal if your matrix is real) and $D$ is diagonal with the singular values on the diagonal. If $e_1,\ldots,e_n$ is the canonical basis, we can write $$ D=\sum_{j=1}^n\sigma_j\,e_je_j^T. $$ Thus $$ A=\sum_{j=1}^n\sigma_j\,Ue_j\, (V^Te_j)^T=\sum_{j=1}^n\sigma_j\, x_jy_j^T, $$ where $x_j=Ue_j$ (the columns of $U$) and $y_j=V^Te_j$ (the rows of $V$) form two orthonormal bases $x_1,\ldots,x_n$ and $y_1,\ldots,y_n$.
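A quick numerical sanity check of this rank-one expansion, sketched with NumPy (note that `np.linalg.svd` returns $A = U\,\mathrm{diag}(\sigma)\,V_h$, so `Vh` plays the role of $V$ above):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))

# NumPy's convention: A = U @ diag(s) @ Vh, so Vh is the "V" above.
U, s, Vh = np.linalg.svd(A)

# Rebuild A as a sum of rank-one terms sigma_j * x_j y_j^T,
# with x_j the columns of U and y_j the rows of Vh.
A_rebuilt = sum(s[j] * np.outer(U[:, j], Vh[j, :]) for j in range(4))

print(np.allclose(A, A_rebuilt))  # True
```

The same check fails, in general, if one pairs eigenvectors instead of singular vectors, which is exactly the ambiguity raised in the question.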