Do $[T]_E^E$ and $[T]_B^E$ have the same eigenvalues?


I know there are plenty of questions regarding the two matrices $[T]_B^B$ and $[T]_E^E$, so we all know that eigenvalues and eigenvectors are preserved across different bases.

My question is: what happens when you have a transformation from one basis to another? Are the eigenvalues kept the same?

For example, suppose I have a matrix that takes vectors given relative to basis $B$ and returns the image relative to basis $E$. Would its eigenvalues be exactly the same as those of the matrix of the same transformation from $E$ to $E$?


Here's the counterexample from the comments, spelled out explicitly.

Let $E = ((1, 0), (0, 1))$ be the standard basis and let $B = ((2, 0), (0, 2))$. Clearly $B$ is a basis too.

Let $I_{\Bbb{R}^2}$ be the identity operator on $\Bbb{R}^2$. We now compute $\left[I_{\Bbb{R}^2}\right]_B^E$, by which I mean, the matrix for $I_{\Bbb{R}^2}$ from basis $B$ to $E$, i.e. the unique matrix $M$ such that $M[v]_B = \left[I_{\Bbb{R}^2} v\right]_E = [v]_E$ (this is also known as the change of basis matrix from $B$ to $E$).

To compute this, we transform the basis vectors in $B$ by $I_{\Bbb{R}^2}$, and express them as linear combinations of $E$. Computing, \begin{align*} I_{\Bbb{R}^2}(2, 0) = (2, 0) = 2(1, 0) + 0(0, 1) &\implies \left[I_{\Bbb{R}^2}(2, 0)\right]_E = \begin{bmatrix} 2 \\ 0\end{bmatrix} \\ I_{\Bbb{R}^2}(0, 2) = (0, 2) = 0(1, 0) + 2(0, 1) &\implies \left[I_{\Bbb{R}^2}(0, 2)\right]_E = \begin{bmatrix} 0 \\ 2\end{bmatrix}. \end{align*} These columns combine to give us $$\left[I_{\Bbb{R}^2}\right]_B^E = \begin{bmatrix} 2 & 0 \\ 0 & 2 \end{bmatrix} = 2I_{2 \times 2}.$$

Now, what are the eigenvalues of this matrix? It shouldn't be difficult to see that $2I_{2 \times 2} - \lambda I_{2 \times 2}$ is singular if and only if $\lambda = 2$ (otherwise the inverse matrix is $(2 - \lambda)^{-1}I_{2 \times 2}$). So, the one and only eigenvalue is $2$. Note how this differs from $I_{\mathbb{R}^2}$: its only eigenvalue is $1$. This completes the counterexample.
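For readers who want to check this numerically, here is a short sketch using NumPy (the variable names are mine; since $E$ is the identity, the change-of-basis matrix from $B$ to $E$ is just the matrix whose columns are the $B$-vectors):

```python
import numpy as np

# Basis B as columns: (2, 0) and (0, 2); the standard basis E is the identity.
B = np.array([[2.0, 0.0],
              [0.0, 2.0]])
E = np.eye(2)

# Change-of-basis matrix from B to E: express each B-vector in E-coordinates.
M = np.linalg.solve(E, B)  # here simply B itself, since E is the identity

print(sorted(np.linalg.eigvals(M).real.tolist()))  # [2.0, 2.0], not the operator's eigenvalue 1
```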


To expand on my first hint, you can actually make $\left[I_{\Bbb{R}^2}\right]_B^E$ be equal to any invertible matrix you want. All you do is set $B$ to be the columns of your matrix. As the matrix is invertible, the columns form a basis. It's easy enough to see, using the above process, that the result just gives you back the matrix you started with.

So, if your conjecture were true, then it would have to be the case that every invertible matrix has only $\lambda = 1$ as an eigenvalue. Finding even a single invertible matrix with an eigenvalue other than $1$ (which I did with $2I_{2 \times 2}$) therefore gives a counterexample.
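To see the hint in action, here is a small NumPy sketch (the matrix $A$ is an arbitrary choice of mine): taking $B$ to be the columns of an invertible matrix makes $\left[I_{\Bbb{R}^2}\right]_B^E$ equal to that matrix, so its eigenvalues are whatever that matrix's eigenvalues happen to be.

```python
import numpy as np

# Any invertible matrix will do; this one has eigenvalues 3 and 5, neither of them 1.
A = np.array([[3.0, 1.0],
              [0.0, 5.0]])

# Let B be the columns of A. Expressing each column in the standard basis E
# just reproduces the column, so [I]_B^E is A itself.
M = np.column_stack([A[:, 0], A[:, 1]])

assert np.allclose(M, A)
print(sorted(np.linalg.eigvals(M).real.tolist()))  # [3.0, 5.0]
```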


EDIT: In response to the comments below.

You're so very close to being right here that it's tricky for me to explain.

I am dealing with two types of objects here: operators and square matrices. In the above answer, $I_{\Bbb{R}^2}$ is an example of the former, while $I_{2 \times 2}$ is an example of the latter. These objects both have eigenvalues, but have very slightly different notions of eigenvectors.

If $T$ is a linear operator on a finite-dimensional vector space $V$, then an eigenvector corresponding to an eigenvalue $\lambda$ is an abstract vector $v \in V$ such that $T(v) = \lambda v$. Maybe $v$ is a polynomial, a tuple of numbers, or possibly a matrix. It's an abstract vector, to which we apply $T$, and we get back the parallel abstract vector $\lambda v$.

If $M$ is a square matrix, then an eigenvector $v$ of $M$ is now a (coordinate) column vector, with entries from the scalar field. We have a similar equation $Mv = \lambda v$, but now $Mv$ is matrix multiplication between two compatibly sized matrices, not the application of an operator to an abstract vector to get another abstract vector.

Now, we have a way of transforming an operator $T$ on an $n$-dimensional space over a field $\Bbb{F}$ into an $n \times n$ matrix $M$, using a basis $B$. This $M$ has the special property that $[T(v)]_B = M[v]_B$ for all $v \in V$, where $[ \cdot ]_B$ is the coordinate vector with respect to $B$. We call this $M$, $[T]_B$, or perhaps $[T]^B_B$, or indeed any number of other notations. Note that it depends on the basis $B$.
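As a concrete sketch of this construction (the operator and basis below are my own example), the columns of $[T]_B^B$ are the coordinate vectors $[T(b_j)]_B$:

```python
import numpy as np

# An example operator T on R^2, given by its action in standard coordinates.
def T(v):
    return np.array([2.0 * v[0] + v[1], 3.0 * v[1]])

# A basis B, collected as the columns of P, so that [v]_E = P [v]_B.
B = [np.array([1.0, 0.0]), np.array([1.0, 1.0])]
P = np.column_stack(B)

# Column j of [T]_B^B is [T(b_j)]_B, i.e. the B-coordinates of T(b_j).
T_B = np.column_stack([np.linalg.solve(P, T(b)) for b in B])

# Check the defining property [T(v)]_B = [T]_B^B [v]_B on a sample vector.
v = np.array([0.5, -2.0])      # v in standard coordinates
v_B = np.linalg.solve(P, v)    # [v]_B
assert np.allclose(np.linalg.solve(P, T(v)), T_B @ v_B)
```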

It is not difficult to see that $$T(v) = \lambda v \iff [T]^B_B[v]_B = \lambda [v]_B.$$ This implies that the eigenvalues of $T$ and $[T]_B^B$ are indeed the same. Note that this doesn't depend on the basis $B$ at all, so no matter which basis you choose, the eigenvalues are perfectly preserved.

The eigenvectors, on the other hand, are not the same, but they are still closely related. An abstract vector $v \in V$ is an eigenvector of the operator $T$ if and only if the column vector $[v]_B$ is an eigenvector of the matrix $[T]_B^B$. In this way, we see that the eigenspaces of the matrices $[T]_B^B$ and $[T]_E^E$ are potentially completely different. They represent the same set of abstract vectors in $V$, but they will produce totally different column vectors in $\Bbb{F}^{n \times 1}$.
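A numerical sketch of this relationship, with an example operator of my own choosing: the eigenvalues of $[T]_E^E$ and $[T]_B^B$ agree, while the eigenvector columns differ and are related by the change of basis.

```python
import numpy as np

# T in the standard basis E (an arbitrary example with eigenvalues 2 and 3).
T_E = np.array([[2.0, 1.0],
                [0.0, 3.0]])

# A second basis B, given as the columns of P; then [T]_B^B = P^{-1} [T]_E^E P.
P = np.array([[1.0, 1.0],
              [0.0, 1.0]])
T_B = np.linalg.solve(P, T_E @ P)

# Same eigenvalues in either basis...
print(sorted(np.linalg.eigvals(T_E).real.tolist()))  # [2.0, 3.0]
print(sorted(np.linalg.eigvals(T_B).real.tolist()))  # [2.0, 3.0]

# ...but if v_E is an eigenvector column of T_E, the corresponding
# eigenvector column of T_B is P^{-1} v_E, in general a different column.
vals, vecs = np.linalg.eig(T_E)
v_E = vecs[:, 0]
v_B = np.linalg.solve(P, v_E)
assert np.allclose(T_B @ v_B, vals[0] * v_B)
```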

Now, in our setting, it's a bit more complicated still. We are starting in one basis, and finishing in another. Our eigenvalue/vector equation, in terms of abstract vectors and operators, becomes $$[T]_B^E[v]_B = \lambda [v]_E$$ when we try to use the matrix $[T]_B^E$. But, note that the two column vectors $[v]_B$ and $[v]_E$ could be totally different. There is no reason to think that the eigenvalues are the same, and no reason to think that there will be any correspondence between the eigenvectors.
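To illustrate with numbers (again an example of my own): using the relation $[v]_E = P[v]_B$, the mixed-basis matrix works out to $[T]_B^E = [T]_E^E \, P$, and its eigenvalues generally differ from those of the operator.

```python
import numpy as np

# The same kind of example operator, in standard coordinates.
T_E = np.array([[2.0, 1.0],
                [0.0, 3.0]])

# A basis B as the columns of P, so [v]_E = P [v]_B.
P = np.array([[2.0, 0.0],
              [0.0, 1.0]])

# Mixed-basis matrix: M [v]_B = [T v]_E = T_E P [v]_B, so M = T_E P.
M = T_E @ P

print(sorted(np.linalg.eigvals(T_E).real.tolist()))  # [2.0, 3.0]
print(sorted(np.linalg.eigvals(M).real.tolist()))    # [3.0, 4.0] -- not the same
```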

So, you are right that the operator $I_{\Bbb{R}^2}$ on the (not so) abstract vector space $\Bbb{R}^2$ has eigenvectors independent of basis. As per your question, we changed it into a matrix, though not one of the form $\left[I_{\Bbb{R}^2}\right]_B^B$. You asked about the matrix's eigenvalues and eigenvectors, which means now we are no longer considering abstract vectors in $\Bbb{R}^2$ and the abstract operator $I_{\Bbb{R}^2}$, but instead a $2 \times 2$ matrix of real numbers. The eigenvectors are going to be column vectors in the matrix space $\Bbb{R}^{2 \times 1}$. Because we mixed the bases, there's no reason to expect the eigenvalues to be the same, or the eigenvectors to correspond.

I hope you don't disagree with my calculation of $$\left[I_{\Bbb{R}^2}\right]_B^E = 2 I_{2 \times 2} = \begin{bmatrix} 2 & 0 \\ 0 & 2\end{bmatrix}.$$ The eigenvectors of this matrix will satisfy: $$\begin{bmatrix} 2 & 0 \\ 0 & 2\end{bmatrix} \begin{bmatrix} x \\ y \end{bmatrix} = \lambda \begin{bmatrix} x \\ y \end{bmatrix}.$$ It's very easy to see that the only possible value of $\lambda$ is $2$, proving that this matrix (not operator) has eigenvalue $2$, which disagrees with the eigenvalues of the original operator on which it was based.