Is $A$ the $2 × 2$ identity matrix?


If $A$ is a $2 × 2$ complex matrix that is invertible and diagonalizable, and such that $A$ and $A^2$ have the same characteristic polynomial, then $A$ is the $2 × 2$ identity matrix.

My claim: the eigenvalues of $A^2$ are the squares of the eigenvalues of $A$, so

$$\lambda=\lambda^2$$ Since $A$ is invertible, $\lambda\neq 0$, so $\lambda=1$; hence $A$ is similar to the identity matrix.

But the only matrix similar to the identity is the identity itself.

But the answer is given as FALSE. Please explain why I'm wrong.


There are 5 best solutions below

BEST ANSWER

Your proof would be correct if $A$ had only one eigenvalue. But if it has two distinct eigenvalues $\alpha$ and $\beta$, it can happen that $\{\alpha^2,\beta^2\}=\{\alpha,\beta\}$; all you need is to have $\alpha^2=\beta$ and $\beta^2=\alpha$. And therefore what you need is a number $\alpha$ such that $\alpha^4=\alpha$ and to take $\beta=\alpha^2$. And, of course, you don't want to have $\alpha=0$. So, $\alpha^4=\alpha\iff\alpha^3=1$. And, since you don't want to have $\alpha=1$, $\alpha^3=1\iff\alpha^2+\alpha+1=0$. So, take $A=\left[\begin{smallmatrix}\alpha&0\\0&\alpha^2\end{smallmatrix} \right]$, where $\alpha\in\mathbb C$ is such that $\alpha^2+\alpha+1=0$.
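As a quick numerical sanity check of this construction (a minimal sketch using NumPy; the variable names below are just illustrative):

```python
import numpy as np

# alpha is a root of x^2 + x + 1 = 0, so alpha^3 = 1 and alpha != 1
alpha = (-1 + 1j * np.sqrt(3)) / 2
A = np.diag([alpha, alpha**2])

# Characteristic polynomial coefficients (leading term first) of A and A^2
p_A = np.poly(A)
p_A2 = np.poly(A @ A)

print(np.allclose(p_A, p_A2))           # True: same characteristic polynomial
print(np.allclose(A, np.eye(2)))        # False: A is not the identity
print(np.isclose(np.linalg.det(A), 1))  # True: A is invertible, det(A) = alpha^3 = 1
```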

ANSWER

Counterexample: $$A=\begin{pmatrix}e^{\frac{2}{3}\pi i} & 0\\ 0 & e^{\frac{4}{3}\pi i}\end{pmatrix}$$ and $$A^{2}=\begin{pmatrix}e^{\frac{4}{3}\pi i} & 0\\ 0 & e^{\frac{2}{3}\pi i}\end{pmatrix}.$$

ANSWER

You have that $\sigma(A) = \sigma(A^2)$ where $\sigma$ denotes the spectrum, i.e. the set of eigenvalues.

For $\lambda \in \sigma(A)$ you noticed that we have $\lambda^2 \in \sigma(A^2) = \sigma(A)$ so there exists $\mu \in \sigma(A)$ such that $\lambda^2 = \mu$. It is not necessarily $\mu =\lambda$, so it doesn't follow $\lambda = \lambda^2$.

ANSWER

You make some claims, but it is not clear why those should be true. Let's see what we actually have to work with:

Since $A$ is invertible and diagonalizable, there exists an invertible $2\times2$-matrix $P$ such that $PAP^{-1}=D$, where $D$ is diagonal with nonzero entries, say $$D=\begin{pmatrix}\lambda_1&0\\0&\lambda_2\end{pmatrix}.$$ Then it is clear that the characteristic polynomial of $D$, and hence of $A$, equals $(X-\lambda_1)(X-\lambda_2)$.

Since $PA^2P^{-1}=D^2$, the characteristic polynomial of $D^2$, and hence of $A^2$, equals $(X-\lambda_1^2)(X-\lambda_2^2)$.

Given that the characteristic polynomials of $A$ and $A^2$ are the same, we have $\{\lambda_1,\lambda_2\}=\{\lambda_1^2,\lambda_2^2\}.$

Since the $\lambda_i$ are nonzero, this means that either $\lambda_1=\lambda_1^2$ and $\lambda_2=\lambda_2^2$, in which case $D=A=I$, or $$\lambda_1=\lambda_2^2\qquad\text{ and }\qquad\lambda_2=\lambda_1^2,$$ in which case $\lambda_1^4=\lambda_1$ and $\lambda_2^4=\lambda_2$, meaning that $\lambda_1$ and $\lambda_2$ are third roots of unity. Taking $\lambda_1$ and $\lambda_2$ to be the two primitive third roots of unity therefore gives a counterexample with $A\neq I$.
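A small symbolic sanity check of this second case, verifying that a diagonal matrix built from the two primitive third roots of unity has the same characteristic polynomial as its square while not being the identity (a minimal sketch using SymPy; names are illustrative):

```python
import sympy as sp

x = sp.symbols('x')
w = sp.exp(2 * sp.pi * sp.I / 3)   # a primitive third root of unity

D = sp.diag(w, w**2)               # the diagonal matrix from the second case
p_D = D.charpoly(x).as_expr()
p_D2 = (D**2).charpoly(x).as_expr()

print(sp.simplify(p_D - p_D2))     # 0: D and D^2 have the same characteristic polynomial
print(D == sp.eye(2))              # False: D is not the identity
```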

ANSWER

If $\alpha$ and $\beta$ are eigenvalues of $A$ then $\alpha^2$ and $\beta^2$ are eigenvalues of $A^2$. From the equality of their characteristic polynomials, you have

$$\operatorname{tr}(A)=\operatorname{tr}(A^2)\implies \alpha+\beta=\alpha^2+\beta^2 \tag{1}$$

and $\det(A)=\det(A^2)\implies \alpha\beta=\alpha^2\beta^2\implies \alpha\beta=1$ (as $\alpha\beta\ne0$).

This gives you $\alpha=1/\beta$. Plugging this into (1) gives

$\alpha+1/\alpha=\alpha^2+1/\alpha^2$

$\implies \alpha^4-\alpha^3-\alpha+1=0$

$\implies(\alpha-1)(\alpha^3-1)=0$

$\implies \alpha=1$ or $\alpha^3=1$

So the eigenvalues of $A$ can only be $1$, $\omega$, or $\omega^2$, where $\omega$ is a primitive cube root of unity.

For example, $$A=\begin{pmatrix}\omega & 0\\ 0 & \omega^2\end{pmatrix}$$ is a counterexample.
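A quick symbolic check of the factorization used above (a minimal sketch using SymPy; the symbol name is illustrative):

```python
import sympy as sp

a = sp.symbols('alpha')
p = a**4 - a**3 - a + 1

print(sp.factor(p))    # (alpha - 1)**2*(alpha**2 + alpha + 1), i.e. (alpha - 1)*(alpha**3 - 1)
print(sp.solve(p, a))  # roots: 1 and the two primitive cube roots of unity
```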