I'm working through a new area of linear algebra problems involving the eigendecomposition.
It is given that $A$ has an eigendecomposition: $$A = \sum_{i=1}^n{\lambda_i x_i y_i^T} = \sum_{i=1}^n{\lambda_i E_i},$$ with the normalization $$y_i^T x_i = 1,$$ where $x_i$ is a right (column) eigenvector and $y_i^T$ is a left (row) eigenvector.
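To make sure I understand the setup, I checked it numerically (this is just my own sanity check, not part of the given problem): for a generic matrix, if the columns of $V$ are the right eigenvectors, then the rows of $V^{-1}$ are left eigenvectors already normalized so that $y_i^T x_i = 1$, and the rank-one sum reproduces $A$.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
A = rng.standard_normal((n, n))    # a generic matrix is diagonalizable

lam, V = np.linalg.eig(A)          # columns of V are right eigenvectors x_i
Y = np.linalg.inv(V)               # rows of Y are left eigenvectors y_i^T,
                                   # automatically normalized so y_i^T x_i = 1

# biorthogonality: Y @ V should be the identity
assert np.allclose(Y @ V, np.eye(n))

# A = sum_i lambda_i * x_i y_i^T  (E_i = x_i y_i^T is rank one)
A_rebuilt = sum(lam[i] * np.outer(V[:, i], Y[i]) for i in range(n))
assert np.allclose(A, A_rebuilt)
```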
I also need to prove that for every scalar $x$ that is not an eigenvalue of $A$: $$ (xI-A)^{-1} = \sum_{i=1}^n{\frac{1}{x- \lambda_i}E_i}.$$ But I get the feeling that the second part somehow depends on the first one.
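Again purely as a numerical sanity check of the identity I'm asked to prove (assuming a random test matrix and an arbitrary scalar $x$ away from the spectrum):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4
A = rng.standard_normal((n, n))

lam, V = np.linalg.eig(A)
Y = np.linalg.inv(V)                              # rows are left eigenvectors
E = [np.outer(V[:, i], Y[i]) for i in range(n)]   # E_i = x_i y_i^T

x = 2.5                            # some scalar that is not an eigenvalue
lhs = np.linalg.inv(x * np.eye(n) - A)
rhs = sum(E[i] / (x - lam[i]) for i in range(n))
assert np.allclose(lhs, rhs)       # the resolvent identity holds
```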
I know that if $A$ has an eigendecomposition, it has $n$ linearly independent eigenvectors and is similar to a diagonal matrix $D$. From the way the question is asked, I get the feeling $A$ itself is a diagonal matrix.
I tried multiplying by $y_i^T$ from the left or by $x_i$ from the right to see how the fact that $y_i^T x_i = 1$ helps me, but I got nowhere.
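What those multiplications did show me numerically (for a random matrix with distinct eigenvalues, where left and right eigenvectors of different eigenvalues are automatically biorthogonal) is that the $E_i$ behave like projections:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 4
A = rng.standard_normal((n, n))

lam, V = np.linalg.eig(A)
Y = np.linalg.inv(V)
E = [np.outer(V[:, i], Y[i]) for i in range(n)]

# y_j^T x_i = delta_ij gives E_i E_j = delta_ij E_i ...
for i in range(n):
    for j in range(n):
        expected = E[i] if i == j else np.zeros((n, n))
        assert np.allclose(E[i] @ E[j], expected)

# ... and the E_i sum to the identity
assert np.allclose(sum(E), np.eye(n))
```

I can see these properties hold, but I don't yet see how to turn them into the proof.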
I really want to solve it myself, but this topic clearly hasn't fully sunk in yet, so I'd appreciate some guidance toward the solution.