Let $T$ be the linear operator 'left multiplication by $A$' on $\textbf{M}_{n\times n}$. Is it true that $A$ and $T$ have the same eigenvalues?

Let $V$ be the space of $n\times n$ matrices over $\textbf{F}$. Let $A$ be a fixed $n\times n$ matrix over $\textbf{F}$. Let $T$ be the linear operator 'left multiplication by $A$' on $V$. Is it true that $A$ and $T$ have the same eigenvalues?

MY ATTEMPT (EDIT)

According to the problem setting, $T(X) = AX$, where $X\in V$. Thus the eigenvalues of $T$ are those scalars $\alpha$ that satisfy the equation $T(X) = \alpha X$ such that $X\neq 0$, i.e., $AX = \alpha X = \alpha IX$ and the operator $A - \alpha I$ is singular.

On the other hand, the eigenvalues of $A$ are those scalars $\alpha\in\textbf{F}$ such that $Ax = \alpha x = \alpha Ix$ and $x\neq 0$, that is to say, $A - \alpha I$ is singular.

Since both conditions are identical, we conclude that $A$ and $T$ have the same eigenvalues.
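A quick numerical sanity check of the claimed equivalence (a sketch using NumPy): if $X$ is column-stacked into $\operatorname{vec}(X)$, the standard identity $\operatorname{vec}(AX) = (I \otimes A)\operatorname{vec}(X)$ gives the matrix of $T$ explicitly, and its eigenvalues can be compared against those of $A$.

```python
import numpy as np

# The matrix of T(X) = AX, acting on vec(X) (columns of X stacked),
# is the Kronecker product I ⊗ A, via vec(AX) = (I ⊗ A) vec(X).
rng = np.random.default_rng(0)
n = 3
A = rng.standard_normal((n, n))
T = np.kron(np.eye(n), A)  # n^2 x n^2 matrix of the operator T

eig_A = np.sort_complex(np.linalg.eigvals(A))
eig_T = np.sort_complex(np.linalg.eigvals(T))

# A and T have the same eigenvalues as sets; each eigenvalue of A
# occurs n times (with multiplicity) among the eigenvalues of T.
same = np.allclose(np.repeat(eig_A, n), eig_T)
```

Note that while the eigenvalues agree as sets, their multiplicities differ: each eigenvalue of $A$ is repeated $n$ times for $T$.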

Any comments on my solution?

On BEST ANSWER

It's pretty good, but there is a small hole in your reasoning. The scheme of your proof is that you're trying to show that the predicate $P(\lambda)$ ("$\lambda$ is an eigenvalue of $T$") is equivalent to the predicate $R(\lambda)$ ("$\lambda$ is an eigenvalue of $A$"). You're showing this by making a new predicate $Q(\lambda)$ ("$A - \lambda I$ is singular"), then trying to show that $P(\lambda) \iff Q(\lambda)$ and $Q(\lambda) \iff R(\lambda)$.

This is certainly a valid approach to solving the question, but you haven't completely proven the two equivalences. Your paragraph

According to the problem setting, $T(X)=AX$, where $X \in V$. Thus the eigenvalues of $T$ are those scalars $\alpha$ that satisfy the equation $T(X)= \alpha X$ such that $X \neq 0$, i.e., $AX= \alpha X= \alpha IX$ and the operator $A− \alpha I$ is singular.

proves that $P(\lambda) \implies Q(\lambda)$, but does not seem to address $Q(\lambda) \implies P(\lambda)$. As such, your proof really only shows $P(\lambda) \implies R(\lambda)$, not the converse.

Think about this: how can we go from $Q(\lambda)$ to $P(\lambda)$? Or, indeed, $R(\lambda)$ to $P(\lambda)$ (which is what I would personally do)? How can you begin with an eigenvector of $A$ corresponding to $\lambda$, which is a column vector, and find an entire $n \times n$ eigenmatrix? There are a few ways to do it, and I'll let you think about how.
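For completeness, one possible construction (a spoiler for the exercise left open above; here $e_1$ denotes the first standard basis column vector): given $Av = \lambda v$ with $v \neq 0$, take

```latex
\[
  X = v\,e_1^{\top} = \bigl[\, v \;\big|\; 0 \;\big|\; \cdots \;\big|\; 0 \,\bigr] \neq 0,
  \qquad
  AX = (Av)\,e_1^{\top} = \lambda\, v\,e_1^{\top} = \lambda X,
\]
```

so $\lambda$ is an eigenvalue of $T$ with eigenmatrix $X$. This shows $R(\lambda) \implies P(\lambda)$ directly.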