If $A$ is an $n\times n$ matrix with $k$ distinct eigenvalues, then for any integer $m\ge 0$ the eigenvalues of $A^m$ are those same eigenvalues raised to the power $m$


I can prove that if $A$ has eigenvalues $a_1, a_2, \dots, a_k$, then $A^m$ has eigenvalues $a_1^m, a_2^m, \dots, a_k^m$. How do I prove that these are the *only* eigenvalues of $A^m$?

Suppose $A^m x = a_{k+1}x$ for some nonzero $x$. Why must $a_{k+1}$ be the $m$-th power of one of the eigenvalues of $A$?

Thanks.

Edit: The field we are working over can be assumed to be algebraically closed. (Or just the complex numbers; I don't want to focus on this too much.)

Edit 2: No knowledge of the Jordan form, sorry.
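As a quick numerical sanity check of the claim before the answers, here is a sketch using NumPy (the specific matrix, eigenvalues, and exponent are arbitrary choices, not from the question):

```python
import numpy as np

# Build a diagonalizable matrix with known eigenvalues 1, 2, 3
# by conjugating a diagonal matrix with an invertible P.
P = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0]])
A = P @ np.diag([1.0, 2.0, 3.0]) @ np.linalg.inv(P)

m = 4
eig_Am = sorted(np.linalg.eigvals(np.linalg.matrix_power(A, m)).real)
# The eigenvalues of A^m are exactly 1^4, 2^4, 3^4 -- no others appear.
assert np.allclose(eig_Am, [1.0, 16.0, 81.0])
```

Of course this only checks one example; the answers below give the actual proof.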


There are 3 answers below.

BEST ANSWER

Hint. Over $\mathbb{R}$ or $\mathbb{C}$, you can write your matrix in its Jordan form. From the decomposition of $A$ into its Jordan form, the eigenvalues of $A^m$ become obvious.

Edit.

The Jordan form theorem states (though I suggest a more detailed reading than this brief summary) that over an algebraically closed field every matrix is similar to an upper triangular matrix:

$\left(\begin{array}{ccc} \alpha_{1} & \ldots & *\\ \vdots & \ddots & \vdots\\ 0 & \ldots & \alpha_{n} \end{array}\right)$

where the $\alpha_i$ are its eigenvalues, the entries below the diagonal are $0$, and the entries above the diagonal may be nonzero. Raising this matrix to the power $m$ keeps it upper triangular and raises each diagonal entry to the power $m$, which gives you the eigenvalues of $A^m$.

Now $A^m$ is similar to:

$\left(\begin{array}{ccc} \alpha_{1}^m & \ldots & *\\ \vdots & \ddots & \vdots\\ 0 & \ldots & \alpha_{n}^m \end{array}\right)$

and therefore:

$$\det(A^m-\lambda I)=(\alpha_1^m-\lambda)\cdots(\alpha_n^m-\lambda).$$
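The triangular argument above can be checked numerically; a minimal sketch with NumPy (the particular entries of the triangular matrix are arbitrary):

```python
import numpy as np

# Any upper triangular T: its powers stay upper triangular, and the
# diagonal of T^m is the elementwise m-th power of the diagonal of T.
T = np.array([[2.0, 1.0, 3.0],
              [0.0, 5.0, 7.0],
              [0.0, 0.0, 1.0]])
m = 3
Tm = np.linalg.matrix_power(T, m)

assert np.allclose(np.tril(Tm, -1), 0)          # still upper triangular
assert np.allclose(np.diag(Tm), np.diag(T)**m)  # diagonal is 8, 125, 1
```

Since the diagonal entries of a triangular matrix are its eigenvalues, this is exactly the statement about $\det(A^m-\lambda I)$ above.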

ANSWER

This is not true if the field is not algebraically closed. Take for example $\mathbb{R}$ and $A=\pmatrix{0&-1\cr 1&0}$: $A$ has no real eigenvalue, but $A^2=-I$ has eigenvalue $-1$. If you suppose that the field is algebraically closed, you can find an invertible matrix $P$ such that $PAP^{-1}$ is an upper triangular matrix; then $(PAP^{-1})^p=PA^pP^{-1}$ is upper triangular with diagonal entries the $p$-th powers of the eigenvalues of $A$.
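A quick numerical check of this counterexample (a NumPy sketch):

```python
import numpy as np

# The 90-degree rotation matrix has no real eigenvalue
# (its eigenvalues are +i and -i), yet its square is -I,
# whose only eigenvalue is -1.
A = np.array([[0.0, -1.0],
              [1.0,  0.0]])

eigs = np.linalg.eigvals(A)
assert np.all(np.abs(eigs.real) < 1e-12)   # purely imaginary eigenvalues
assert np.allclose(A @ A, -np.eye(2))      # A^2 = -I
```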

ANSWER

A simple proof using more or less nothing, for matrices over an algebraically closed field:

Say $\sigma(A)$ is the set of eigenvalues of $A$. Then in fact for any polynomial $p$ we have $\sigma(p(A))=p(\sigma(A))$.

Proof: Suppose for notational convenience that $p$ is monic. Given a scalar $\lambda$ we can factor the polynomial $p(x)-\lambda$ into linear factors (here $n=\deg p$, and the roots $\omega_j$ depend on $\lambda$): $$p(x)-\lambda=\prod_{j=1}^n(x-\omega_j).$$So $$p(A)-\lambda I=\prod_1^n(A-\omega_jI).\quad(*)$$

Lemma: (For square matrices over any field.) $AB$ is invertible if and only if $A$ and $B$ are both invertible.

Proof: If $A$ is not invertible then $A$ is not surjective (or rather of course the linear transformation defined by $A$ is not surjective), so $AB$ is not surjective. If $B$ is not invertible then $B$ is not injective, so $AB$ is not injective.

Corollary: $\prod_1^n A_j$ is invertible if and only if each $A_j$ is invertible.

Now applying the Corollary to (*) shows that $\lambda\in\sigma(p(A))$ if and only if some factor $A-\omega_jI$ is not invertible, that is, if and only if there exists $j$ such that $\omega_j\in\sigma(A)$. But $$\{\omega_1,\dots,\omega_n\}=\{\omega:p(\omega)=\lambda\},$$so we've shown that $\lambda\in\sigma(p(A))$ if and only if there exists $\omega\in\sigma(A)$ with $\lambda=p(\omega)$, which is exactly what it means to say $\sigma(p(A))=p(\sigma(A))$.
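The spectral-mapping statement $\sigma(p(A))=p(\sigma(A))$ is easy to test numerically; a sketch, with $p(x)=x^2+2x+2$ and the rotation matrix from the other answer as an arbitrary example:

```python
import numpy as np

def p_scalar(z):
    """p(x) = x^2 + 2x + 2 applied to a scalar (or elementwise)."""
    return z**2 + 2*z + 2

def p_matrix(X):
    """The same polynomial evaluated at a square matrix."""
    return X @ X + 2*X + 2*np.eye(len(X))

A = np.array([[0.0, -1.0],
              [1.0,  0.0]])               # eigenvalues are +i and -i

sigma_pA = np.sort_complex(np.linalg.eigvals(p_matrix(A)))
p_sigmaA = np.sort_complex(p_scalar(np.linalg.eigvals(A)))

# sigma(p(A)) == p(sigma(A)): here both sets are {1+2i, 1-2i},
# since p(i) = 1+2i and p(-i) = 1-2i.
assert np.allclose(sigma_pA, p_sigmaA)
```

Taking $p(x)=x^m$ recovers the statement in the question.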