In "Linear Algebra and Learning from Data", Gilbert Strang asks:

> Why do $A$ and $A^+$ have the same rank? If $A$ is square, do $A$ and $A^+$ have the same eigenvectors? What are the eigenvalues of $A^+$?
I've managed to answer the first two questions:
- Using the SVD: if $A = U\Sigma V^T$ has $r$ positive singular values $\sigma_1,\dots,\sigma_r$, then $A^+ = V\Sigma^+ U^T$ has the $r$ positive singular values $1/\sigma_1,\dots,1/\sigma_r$, so both matrices have rank $r$.
- $A=\begin{bmatrix}1 & 2 \\ 1 & 2 \end{bmatrix}$ has eigenvectors $[1,1]^T$ (for $\lambda = 3$) and $[2,-1]^T$ (for $\lambda = 0$), but $A^+=\begin{bmatrix} 1/10 & 1/10 \\ 1/5 & 1/5 \end{bmatrix}$ has neither of these as an eigenvector. (Both bullets are checked numerically below.)
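Here is a minimal NumPy sketch checking both bullets; the commented values are approximate:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [1.0, 2.0]])
A_pinv = np.linalg.pinv(A)                      # ~ [[0.1, 0.1], [0.2, 0.2]]

# Same rank r = 1: the positive singular value of A inverts to that of A^+.
print(np.linalg.svd(A, compute_uv=False))       # ~ [3.1623, 0]  (sqrt(10))
print(np.linalg.svd(A_pinv, compute_uv=False))  # ~ [0.3162, 0]  (1/sqrt(10))

# Different eigenvectors -- and different nonzero eigenvalues, too.
wA, VA = np.linalg.eig(A)       # eigenvalues ~ [3, 0]
wP, VP = np.linalg.eig(A_pinv)  # eigenvalues ~ [0.3, 0] -- note: not 1/3
print(VA)                       # columns ~ [1, 1] and [2, -1], normalized
print(VP)                       # different eigenvector directions
```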
Regarding the eigenvalues of $A^+$: I tried playing around with some particular matrices, but I couldn't see a simple relationship with the eigenvalues of $A$. I suspect there is no such relationship in general. Is that the case?
Thanks.
I might have a partial answer. Please don't downvote; just mark the mistake if there is one.
Let $v$ be an eigenvector of $A$ and suppose its corresponding eigenvalue $\lambda$ is not zero. Then $v = A(\frac{1}{\lambda}v)$ lies in the column space of $A$. If $A$ is normal (for example symmetric), the column space equals the row space, so $v$ is in the row space of $A$ ($Row(A) = Null(A)^{\perp}$). For a general square $A$ this step can fail: in the question's example, $A$ has eigenvalue $3$ while $A^+$ has eigenvalue $3/10$, not $1/3$. Page 125 of the book states "$A^{+}Ax=x$ exactly when $x$ is in the row space". Hence $A^{+}Av=v$. But also $A^{+}Av=A^{+}\lambda v = \lambda A^{+}v$, so $A^{+}v = \frac{1}{\lambda}v$, which means $\frac{1}{\lambda}$ is an eigenvalue of $A^+$. Note that $A^{+}v$ makes sense because we are referring to a square matrix $A$ (in order to talk about eigenvalues).
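A small NumPy sketch of this step, using a symmetric (hence normal) singular matrix so that the eigenvector really does lie in the row space:

```python
import numpy as np

A = np.array([[1.0, 1.0],
              [1.0, 1.0]])        # symmetric, eigenvalues 2 and 0
v = np.array([1.0, 1.0])          # eigenvector for lam = 2, lies in Row(A)
lam = 2.0

A_pinv = np.linalg.pinv(A)        # ~ [[0.25, 0.25], [0.25, 0.25]]
print(A_pinv @ A @ v)             # ~ [1, 1] = v: A^+A is the identity on Row(A)
print(A_pinv @ v, v / lam)        # both ~ [0.5, 0.5]: A^+ v = (1/lam) v
```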
On the other hand, if $\lambda = 0$ is an eigenvalue of $A$ (recall $v$ cannot equal the zero vector), then $A$ is singular, so $rank(A^{+}) = rank(A) < n$ and $A^{+}$ is singular too; hence $\lambda = 0$ is also an eigenvalue of $A^{+}$. (Note that $Av = 0$ by itself does not make $v$ an eigenvector of $A^{+}$, because $Null(A^{+}) = Null(A^{T})$, which can differ from $Null(A)$.)
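The same example from the question shows the subtlety: the null vector of $A$ is not itself an eigenvector of $A^+$, yet $0$ is still an eigenvalue of $A^+$ because of the rank drop.

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [1.0, 2.0]])
A_pinv = np.linalg.pinv(A)
v = np.array([2.0, -1.0])             # A v = 0

print(A @ v)                          # ~ [0, 0]
print(A_pinv @ v)                     # ~ [0.1, 0.2] -- NOT a multiple of v
print(np.linalg.matrix_rank(A_pinv))  # 1 < 2, so 0 is an eigenvalue of A^+
```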
Thus, when $A$ is normal, the eigenvalues of $A^{+}$ are the reciprocals of the nonzero eigenvalues of $A$, together with zero when $A$ is singular. I believe these are the only eigenvalues, thanks to the definition of the pseudoinverse: diagonalize $A = Q\Lambda Q^{*}$ with $Q$ unitary, and let $\Lambda^{+}$ invert the nonzero diagonal entries while keeping the zeros; then $Q\Lambda^{+}Q^{*}$ satisfies the four Moore-Penrose conditions, so $A^{+} = Q\Lambda^{+}Q^{*}$ and its eigenvalues are exactly the diagonal entries of $\Lambda^{+}$.
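As a sanity check of that diagonalization argument, a short NumPy sketch on a random singular symmetric matrix (the $10^{-10}$ cutoff for "numerically zero" is my own choice):

```python
import numpy as np

rng = np.random.default_rng(0)
B = rng.standard_normal((4, 2))
A = B @ B.T                           # 4x4 symmetric, rank 2

w = np.linalg.eigvalsh(A)             # two ~zero and two positive eigenvalues
w_pinv = np.linalg.eigvalsh(np.linalg.pinv(A))

# Nonzero eigenvalues of A^+ should be reciprocals; zeros should match.
recip = np.array([1/x if abs(x) > 1e-10 else 0.0 for x in w])
print(np.allclose(np.sort(w_pinv), np.sort(recip)))  # True
```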