Question:
Let $\lambda$ be an eigenvalue of $AB$, where $B$ is invertible. Is $\lambda$ also an eigenvalue of $A$?
My thoughts:
I think this can happen iff $B$ is the identity matrix.
If $B$ is the identity, it's clear that $Av = AIv = ABv = \lambda v$.
For the other direction: suppose that $ABv = \lambda v$ implies $Av = \lambda v$.
Then $BABv = \lambda Bv$; writing $w = Bv$, this becomes $BAw = \lambda w$, i.e. $Aw = \lambda B^{-1}w$, and then $w = B^{-1}w$ because we assume $\lambda$ is an eigenvalue of $A$. So $B^{-1}$ must be the identity.
Is this right?
No. If $A \,:\, \mathbb{R} \to \mathbb{R}\,:\, x \mapsto 2x$ and $B \,:\, \mathbb{R} \to \mathbb{R}\,:\, x \mapsto \frac{1}{2}x$, then $AB$ is the identity mapping and thus has eigenvalue $1$, yet $A$ has eigenvalue $2$ and $B$ has eigenvalue $\frac{1}{2}$.
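In matrix form this counterexample is just $1\times 1$ matrices; a quick numerical check (a sketch using numpy, with the names $A$ and $B$ taken from the example above):

```python
import numpy as np

# 1x1 matrices realizing the maps x -> 2x and x -> x/2.
A = np.array([[2.0]])
B = np.array([[0.5]])

AB = A @ B  # the identity map on R

print(np.linalg.eigvals(AB))  # eigenvalue 1
print(np.linalg.eigvals(A))   # eigenvalue 2
print(np.linalg.eigvals(B))   # eigenvalue 1/2
```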
Now, if $B$ is the identity matrix, then $AB = A$, and so yes, $AB$ and $A$ have the same eigenvalues in this case. Heck, how could they not - they represent the same function...
The converse, i.e. that if $AB$ and $A$ have the same eigenvalues then $B$ must be the identity, is wrong. Consider $A\,:\, (x,y) \to (x,x+y)$ and $B\,:\, (x,y) \to (x,y-x)$. Then $A$ has $1$ as its only eigenvalue (for eigenvector $(0,1)$), $B$ similarly has $1$ as its only eigenvalue, and $AB$ is the identity and thus also has $1$ as its only eigenvalue. So the sets of eigenvalues of $AB$ and $A$ agree, yet $B$ isn't the identity.
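This counterexample can also be checked numerically (a sketch assuming the standard basis, so the maps become the matrices below):

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [1.0, 1.0]])   # (x, y) -> (x, x + y)
B = np.array([[1.0, 0.0],
              [-1.0, 1.0]])  # (x, y) -> (x, y - x)

print(A @ B)                 # the 2x2 identity matrix
print(np.linalg.eigvals(A))  # both eigenvalues equal 1
print(np.linalg.eigvals(B))  # both eigenvalues equal 1
```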
In that converse, note that you can't in general build a basis from just eigenvectors of a mapping: an $n\times n$ matrix can have fewer than $n$ linearly independent eigenvectors. Your proof seems to assume that, though...
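A standard example of this is a shear matrix, whose single eigenvalue has only a one-dimensional eigenspace (a sketch; the matrix $M$ here is my own illustrative choice, not from the question):

```python
import numpy as np

# Shear: eigenvalue 1 with algebraic multiplicity 2 ...
M = np.array([[1.0, 1.0],
              [0.0, 1.0]])
print(np.linalg.eigvals(M))  # [1., 1.]

# ... but the eigenspace for eigenvalue 1 is only one-dimensional,
# so no basis of eigenvectors exists.
eigenspace_dim = 2 - np.linalg.matrix_rank(M - np.eye(2))
print(eigenspace_dim)  # 1
```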