Proof of non-diagonalizability of higher matrix powers


The goal is to prove that if a regular (invertible) matrix $A$ is not diagonalizable in $M_n(\Bbb C)$, then no power $A^k$, $k \in \Bbb N$, is diagonalizable either. I started with a proof by contradiction: suppose $A^k$ is diagonalizable. Then its minimal polynomial $p$ satisfies $p(A^k)=0$, and we can write it as $p=(x-a_1)\cdots(x-a_n)$. At that point I got stuck on how taking $k$-th roots leads anywhere. When I checked the answer, it said to take the $k$-th roots of all the $a_i$'s, obtaining a new polynomial $q=(x-a_{11})(x-a_{12})\cdots(x-a_{1k})\cdots(x-a_{nk})$, where $a_{i1},\dots,a_{ik}$ are the $k$-th roots of $a_i$, and that for this polynomial $q(A)=0$. I got this far myself but dismissed it as the correct way to a proof.
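As a quick numerical sanity check of the statement being proved (the Jordan block below is an illustrative example, not from the question), sympy can verify that a regular non-diagonalizable matrix stays non-diagonalizable under powering:

```python
from sympy import Matrix

# Jordan block: regular (det = 1, so invertible) but not diagonalizable
A = Matrix([[1, 1],
            [0, 1]])

assert A.det() != 0                  # A is regular
assert not A.is_diagonalizable()     # A is not diagonalizable

# the claim: no power of A becomes diagonalizable
for k in range(2, 6):
    assert not (A**k).is_diagonalizable()
```

Note that regularity matters: the nilpotent block $\begin{pmatrix}0&1\\0&0\end{pmatrix}$ is not diagonalizable, yet its square is the zero matrix, which is.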

  1. Why can we factor $p$ into $q$ like this? If I multiply $q$ back out, I don't see how it turns back into $p$.
  2. What guarantees that $A$ is a root of $q$? I don't see how we avoid a remainder when we multiply everything out. I know each group of factors gives $x^k - a_i$ (since the product $a_{i1}\cdots a_{ik}$ equals $a_i$ up to sign), but what happens to the rest of the terms?
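On question 1: multiplying $q$ back out does not return $p(x)$ itself; each group $(x-a_{i1})\cdots(x-a_{ik})$ collapses to $x^k - a_i$, so the full product is $p(x^k)$. A small sympy check (example roots $2, 3$ and $k=2$, chosen here for illustration; for $k=2$ the two square roots $\pm\sqrt{a_i}$ are all the $k$-th roots):

```python
from sympy import symbols, expand, prod, sqrt

x = symbols('x')
a = [2, 3]                              # example roots of p

p = prod(x - ai for ai in a)            # p(x) = (x - 2)(x - 3)

# replace each factor (x - a_i) by its square-root factors:
# (x - sqrt(a_i)) * (x + sqrt(a_i)) = x**2 - a_i
q = prod((x - sqrt(ai)) * (x + sqrt(ai)) for ai in a)

# multiplying everything back out gives p with x replaced by x**2
assert expand(q - p.subs(x, x**2)) == 0
```

So there is no remainder: the cross terms inside each group cancel exactly, leaving $q(x) = p(x^k)$, and question 2 follows by substituting $A$ into that identity.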

Please give me a detailed explanation, or at least a link where I can read about this kind of factoring in detail; it is a gap in my knowledge.

Edit: I forgot to state that the matrix $A$ is regular; fixed it.


There are 2 solutions below.

On BEST ANSWER

On the contrary, we will assume that $A^k$ is diagonalizable for some integer $k \geq 2.$ Then the minimal polynomial $p(x)$ of $A^k$ can be written as a product of distinct linear factors, i.e., $p(x) = (x - c_1) \cdots (x - c_m)$ with $c_i \neq c_j$ for all pairs of integers $i \neq j.$ Using the fact that $p(A^k) = 0$ gives $0 = p(A^k) = (A^k - c_1 I) \cdots (A^k - c_m I),$ so that $A$ satisfies the polynomial $q(x) = (x^k - c_1) \cdots (x^k - c_m).$ Since $A$ is regular, $A^k$ is invertible, hence every $c_i$ is nonzero; over $\mathbb C,$ each factor $x^k - c_i$ therefore splits into $k$ distinct linear factors. Moreover, a root of $x^k - c_i$ cannot also be a root of $x^k - c_j$ for $i \neq j,$ since its $k$-th power would have to equal both $c_i$ and $c_j.$ Hence $q(x)$ splits into distinct linear factors. But the minimal polynomial of $A$ must divide $q(x),$ so the minimal polynomial of $A$ splits into distinct linear factors, i.e., $A$ is diagonalizable --- a contradiction. QED.
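The key step $p(A^k) = 0 \implies q(A) = 0$ can be checked concretely. A small sympy sketch (the matrix and eigenvalues are illustrative assumptions, not from the answer):

```python
from sympy import Matrix, eye, zeros

A = Matrix([[2, 1],
            [0, 3]])          # invertible, eigenvalues 2 and 3
k = 2

B = A**k                      # eigenvalues 4 and 9
c = [4, 9]                    # roots of the minimal polynomial of B

# p(B) = (B - 4I)(B - 9I) = 0, since B is diagonalizable
# with distinct eigenvalues 4 and 9
p_at_B = (B - c[0] * eye(2)) * (B - c[1] * eye(2))
assert p_at_B == zeros(2, 2)

# the same product, read as a polynomial in A:
# q(A) = (A**k - 4I)(A**k - 9I) = 0
q_at_A = (A**k - c[0] * eye(2)) * (A**k - c[1] * eye(2))
assert q_at_A == zeros(2, 2)
```

Here $q(x) = (x^2-4)(x^2-9)$ has the four distinct roots $\pm 2, \pm 3$, exactly because $4$ and $9$ are nonzero and distinct, which is where the regularity of $A$ enters.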


If $A$ is an $n\times n$ matrix that is not diagonalizable, then the genuine eigenvectors of $A$ do not span the space. The eigenvectors of $A$ take the following form:

$$Av_i=\lambda_jv_i$$ for some $i,j\in\{1,2,\dots,m\}$ where $m\leq n$. Let us prove by induction that $A^k$ has the same eigenvectors as $A$, with eigenvalues $\lambda_j^k$, for $k\geq 1$. Base case ($k=1$): this is the equation above. Induction step: assume the $k$-th case, $$A^kv_i=\lambda_j^kv_i.$$ Applying $A$ from the left to both sides gives $$A^{k+1}v_i=A(\lambda_j^kv_i)=\lambda_j^kAv_i=\lambda_j^{k+1}v_i.$$ Thus we have shown that $A^k$ has the same eigenvectors as $A$. However, if $A^k$ has the same eigenvectors as $A$, that means the eigenvectors of $A^k$ do not span the space, and thus $A^k$ is not diagonalizable.
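The induction step can be sketched numerically with sympy (a Jordan block with eigenvalue $2$, chosen here as an illustrative example):

```python
from sympy import Matrix

A = Matrix([[2, 1],
            [0, 2]])               # regular, not diagonalizable
v = Matrix([1, 0])                 # its only eigenvector (up to scale)
lam = 2                            # the corresponding eigenvalue

# A**k v = lam**k v : v remains an eigenvector of every power of A
for k in range(1, 6):
    assert (A**k) * v == lam**k * v
```

Note that the eigenvector $v$ of $A$ is guaranteed to be an eigenvector of $A^k$; the converse direction (that $A^k$ gains no new eigenvectors) is the delicate part, and the accepted answer's minimal-polynomial argument handles it.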