I have a proof to do and I am stuck on proving the following: if a matrix $A$ has eigenvalue $\lambda$ and a matrix $B$ has eigenvalue $\mu$, and
$A = (B+I)(B-I)^{-1}$, then $\lambda=\frac{\mu+1}{\mu-1}$.
Thank you for the help.
We know that if a matrix $\;T\;$ is invertible, then $\;Tu=\alpha u\implies T^{-1}u=\alpha^{-1}u\;$, so if $\;Bu=\mu u\;$, then $\;(B\pm I)u=(\mu\pm 1)u\;$. Since $\;B-I\;$ is invertible, no eigenvalue of $\;B\;$ equals $1$ (so $\;\mu\neq 1\;$), and from here:
$$(B-I)^{-1}u=(\mu-1)^{-1}u$$
so, since any eigenvector $u$ of $B$ is automatically an eigenvector of $A$ (the following computation shows this directly), we get
$$\lambda u=Au=(B+I)(B-I)^{-1}u=(B+I)((\mu-1)^{-1}u)=(\mu-1)^{-1}(B+I)u=\frac{\mu+1}{\mu-1}u$$
Deduce now the final line of the proof.
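As a sanity check of the derivation above, here is a small numerical sketch. The matrix $B$ below is my own example (symmetric, with eigenvalues $4$ and $2$, so $B-I$ is invertible); the code verifies that the eigenvalues of $A=(B+I)(B-I)^{-1}$ match the claimed map $\lambda=\frac{\mu+1}{\mu-1}$.

```python
import numpy as np

# Example matrix B (symmetric, eigenvalues 4 and 2, so no
# eigenvalue equals 1 and B - I is invertible).
B = np.array([[3.0, 1.0],
              [1.0, 3.0]])
I = np.eye(2)

# The transform from the question: A = (B + I)(B - I)^{-1}.
A = (B + I) @ np.linalg.inv(B - I)

mu = np.linalg.eigvals(B)        # eigenvalues of B
lam = np.linalg.eigvals(A)       # eigenvalues of A

# Claimed map: lambda = (mu + 1)/(mu - 1).
predicted = (mu + 1) / (mu - 1)

# Compare the two spectra (sorted, since eigvals gives no fixed order).
print(np.sort(lam))
print(np.sort(predicted))
```

For $\mu=2$ this gives $\lambda=3$, and for $\mu=4$ it gives $\lambda=5/3$, matching the formula.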