I have a problem here; I could solve the first half, but for the second half I don't know whether my solution is sufficient, or even whether it is correct. The second part of the problem is:
Let $B$ and $U$ be two $n\times n$ matrices, where $U$ is invertible and all of the eigenvalues of $B$ are distinct. Define the matrix $B'=U^{-1}BU$. Given that the eigenvalues of $B$ are $\lambda_i$, and the eigenvectors of $B$ and $B'$ are $\vec{v}_i$ and $\vec{v}_i'$ respectively, show the relationship between the eigenvectors.
What I did was write out the definition of the eigenvectors, using the fact that the eigenvalues of $B$ and $B'$ are equal:
$(B-\lambda _iI)\vec{v}_i=0$
$(U^{-1}BU-\lambda _iI)\vec{v}_i'=0$
Therefore I suppose I can set the two equal to each other: $(B-\lambda _iI)\vec{v}_i=(U^{-1}BU-\lambda _iI)\vec{v}_i'$. But I don't know what else to do from here, since I cannot move a matrix to the other side: neither $(B-\lambda_i I)$ nor $(U^{-1}BU-\lambda_i I)$ is invertible (by definition their determinants are $0$). Is this step legitimate? And if it is, do you think it would be enough for a solution?
Every invertible matrix is a change of basis. So, if $B' = U^{-1}BU$, then $B$ and $B'$ represent the same linear transformation, only in different bases. To be more precise, let $\alpha$ and $\beta$ be bases of your space, and take $U$ to be the change of basis from $\alpha$ to $\beta$, so that $U^{-1}$ is the change of basis from $\beta$ to $\alpha$.
Let $v$ be an eigenvector of $B'$ with eigenvalue $\lambda$. Write $[v]_\alpha$ for this vector expressed as a linear combination of the vectors in $\alpha$, and $[v]_\beta$ for the same vector expressed in terms of $\beta$. Then $B'[v]_\alpha = \lambda [v]_\alpha$, so $$U^{-1}BU[v]_\alpha = \lambda [v]_\alpha.$$ This last equation yields $BU[v]_\alpha = \lambda U[v]_\alpha$. But $U[v]_\alpha = [v]_\beta$, so we have $B[v]_\beta = \lambda [v]_\beta$, and this is the relation between the eigenvectors: they are the same vector, only expressed in a different basis.
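The relation derived above, that $U$ maps each eigenvector of $B'$ to an eigenvector of $B$ with the same eigenvalue, can also be checked numerically. A minimal NumPy sketch, using a concrete $2\times 2$ example of my own choosing (not from the problem) with distinct eigenvalues and an invertible $U$:

```python
import numpy as np

# Example matrices (my own choice): B has distinct eigenvalues 2 and 3,
# and U is invertible (det = 1).
B = np.array([[2.0, 1.0],
              [0.0, 3.0]])
U = np.array([[1.0, 1.0],
              [0.0, 1.0]])

# Similar matrix B' = U^{-1} B U.
Bp = np.linalg.inv(U) @ B @ U

# Eigen-decomposition of B'; columns of Vp are the eigenvectors v'_i.
eigvals, Vp = np.linalg.eig(Bp)

# For each eigenvector v' of B', the vector U v' should satisfy
# B (U v') = lambda (U v'), i.e. it is an eigenvector of B.
for lam, vp in zip(eigvals, Vp.T):
    v = U @ vp
    assert np.allclose(B @ v, lam * v)
```

The assertion succeeding for every eigenpair illustrates exactly the chain $B'v' = \lambda v' \implies U^{-1}BUv' = \lambda v' \implies B(Uv') = \lambda(Uv')$.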