I'm interested to know whether this is true, because I know that if $A^2$ is diagonalizable then $A$ is not necessarily diagonalizable (without the invertibility assumption).
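As a sanity check on that remark, here is a small numerical sketch (the matrix and the rough rank-based diagonalizability test are my own illustration, not part of the question): the nilpotent matrix $A=\begin{pmatrix}0&1\\0&0\end{pmatrix}$ is not diagonalizable, yet $A^2=0$ is.

```python
import numpy as np

def is_diagonalizable(M, tol=1e-9):
    # Numerical heuristic: M is diagonalizable iff it has a full set of
    # linearly independent eigenvectors, i.e. the eigenvector matrix
    # returned by np.linalg.eig has full rank.
    _, v = np.linalg.eig(M)
    return np.linalg.matrix_rank(v, tol=tol) == M.shape[0]

A = np.array([[0.0, 1.0],
              [0.0, 0.0]])   # non-invertible and defective

print(is_diagonalizable(A))      # False: A itself is not diagonalizable
print(is_diagonalizable(A @ A))  # True: A^2 = 0 trivially is
```

Note that this $A$ is not invertible, which is exactly why it does not contradict the statement being asked about.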
I have a feeling it's true, but I'm not sure how to prove it. Here is my attempt, though I'm not sure the last step is justified:
My try
Since $A^2$ is diagonalizable, there exists an invertible matrix $P$ such that:
$P^{-1}AAP=diag(\lambda_1,...,\lambda_n)$
$AA=Pdiag(\lambda_1,...,\lambda_n)P^{-1}$
Set $D=Pdiag(\sqrt{\lambda_1},...,\sqrt{\lambda_n})P^{-1}$
So : $D^2=(Pdiag(\sqrt{\lambda_1},...,\sqrt{\lambda_n})P^{-1})(Pdiag(\sqrt{\lambda_1},...,\sqrt{\lambda_n})P^{-1})=Pdiag(\lambda_1,...,\lambda_n)P^{-1}=AA$
Here is where I'm unsure: can I conclude that $A=D=Pdiag(\sqrt{\lambda_1},...,\sqrt{\lambda_n})P^{-1}$, and hence that $A$ is diagonalizable?
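This last step is exactly where the argument breaks: $D^2=A^2$ does not force $D=A$, because a matrix has many square roots in general. A small NumPy sketch (the concrete matrices are my own illustration):

```python
import numpy as np

# A is invertible and A @ A = I, so A^2 is already diagonal: take P = I.
A = np.array([[0.0, 1.0],
              [1.0, 0.0]])
P = np.eye(2)

# Construction from the question: D = P diag(sqrt(lambda_i)) P^{-1}.
# Here A^2 = I, so lambda_1 = lambda_2 = 1 and the construction gives D = I.
D = P @ np.diag(np.sqrt(np.array([1.0, 1.0]))) @ np.linalg.inv(P)

print(np.allclose(D @ D, A @ A))  # True: D is a square root of A^2
print(np.allclose(D, A))          # False: but D is not A
```

So the construction produces *a* diagonalizable square root of $A^2$, not necessarily $A$ itself; a different argument is needed.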
$A \in M_n(\mathbb{C})$ is invertible and $A^2$ is diagonalizable. Prove that $A$ is diagonalizable.
221 Views. Asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail). There are 2 best solutions below.
A simple approach is to argue by contradiction. Suppose $A$ is invertible and defective, but $A^r$ is diagonalizable for some natural number $r$.
This means there is some non-zero eigenvalue $\lambda_1$ that has algebraic multiplicity $d$ strictly greater than geometric multiplicity. Via re-scaling $A$ we can assume WLOG that $\lambda_1=1$. (i.e. re-run the argument on $A':=\lambda_1^{-1}\cdot A$ if you prefer.)
Via any triangularization (Jordan, Schur, or others) we have
$S^{-1} A S = \displaystyle \left[\begin{matrix}I_d + N & * \\ \mathbf 0 & *\end{matrix}\right]$
where $N\neq \mathbf 0$ is strictly upper triangular ($N$ is non-zero precisely because the geometric multiplicity of $\lambda_1=1$ is less than $d$). Then
$S^{-1} A^r S = \displaystyle \left[\begin{matrix}(I_d + N)^r & * \\ \mathbf 0 & *\end{matrix}\right]=\displaystyle \left[\begin{matrix} I_d & * \\ \mathbf 0 & *\end{matrix}\right]$
where the last equality holds because $A^r$ is diagonalizable, hence so is its restriction to the invariant subspace spanned by the first $d$ columns, and the only diagonalizable upper-triangular matrix with all eigenvalues equal to $1$ is $I_d$.
Thus
$I_d = (I_d + N)^r = I_d + \binom{r}{1}N+ \binom{r}{2}N^2+\cdots+ \binom{r}{r-1}N^{r-1}+ \binom{r}{r}N^{r}$
$\implies \mathbf 0 = \binom{r}{1}N+ \binom{r}{2}N^2+\cdots+ \binom{r}{r}N^{r}$
But $N\neq \mathbf 0$ is nilpotent: let $k\ge 2$ be the smallest integer with $N^k=\mathbf 0$. Multiplying the equation above by $N^{k-2}$ kills every term of order $\ge 2$ and leaves $\mathbf 0 = \binom{r}{1}N^{k-1} = r\,N^{k-1}$, so $N^{k-1}=\mathbf 0$, contradicting the minimality of $k$.
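The key computation — that $(I_d+N)^r \neq I_d$ whenever $N\neq\mathbf 0$ is strictly upper triangular — can be checked on a concrete instance (a toy example of mine, not from the answer):

```python
import numpy as np
from math import comb

N = np.array([[0.0, 1.0],
              [0.0, 0.0]])   # non-zero, strictly upper triangular, N @ N = 0
I = np.eye(2)
r = 5

# (I + N)^r expands by the binomial theorem; since N^2 = 0 here,
# only the first-order term survives: (I + N)^r = I + r * N.
power = np.linalg.matrix_power(I + N, r)
binomial = I + sum(comb(r, j) * np.linalg.matrix_power(N, j)
                   for j in range(1, r + 1))

print(np.allclose(power, binomial))  # True: the binomial expansion matches
print(np.allclose(power, I))         # False: the top-right entry is r, not 0
```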
Here is a relatively quick proof. Note that a matrix is diagonalizable if and only if its minimal polynomial is a product of distinct linear factors. Because $A^2$ is diagonalizable, its minimal polynomial can be written as $$ p(x) = (x - \lambda_1) \cdots (x - \lambda_n) $$ where the $\lambda_i \in \Bbb C$ are distinct. Notably, each $\lambda_j$ is non-zero because $A$ is invertible.

Thus, $A$ satisfies $q(A) = 0$, where $$ q(x) = p(x^2) = (x^2 - \lambda_1) \cdots (x^2 - \lambda_n) \\ = (x - \sqrt{\lambda_1})(x + \sqrt{\lambda_1}) \cdots (x - \sqrt{\lambda_n})(x + \sqrt{\lambda_n}). $$ Since the $\lambda_j$ are distinct and non-zero, the $2n$ numbers $\pm\sqrt{\lambda_j}$ are pairwise distinct (in particular $\sqrt{\lambda_j} \neq -\sqrt{\lambda_j}$ precisely because $\lambda_j \neq 0$), so $q(x)$ has no repeated factors. The minimal polynomial of $A$ must divide $q(x)$, so it cannot have any repeated factors either. Thus, $A$ must be diagonalizable.
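To see this argument on a concrete matrix (my own example, sketched with NumPy): take the rotation $A=\begin{pmatrix}0&-1\\1&0\end{pmatrix}$, which is invertible with $A^2=-I$. The minimal polynomial of $A^2$ is $p(x)=x+1$, so $q(x)=p(x^2)=x^2+1=(x-i)(x+i)$ has distinct roots; indeed $q(A)=0$ and $A$ is diagonalizable over $\Bbb C$.

```python
import numpy as np

A = np.array([[0.0, -1.0],
              [1.0,  0.0]])   # invertible; A @ A = -I is already diagonal

# Minimal polynomial of A^2 = -I is p(x) = x + 1 (lambda_1 = -1),
# so q(x) = p(x^2) = x^2 + 1, and q(A) = A^2 + I should vanish:
qA = A @ A + np.eye(2)
print(np.allclose(qA, 0))   # True: A satisfies q

# A's eigenvalues are the two distinct roots +-i of q,
# so A is diagonalizable over C.
eigvals = np.linalg.eigvals(A)
print(sorted(eigvals, key=lambda z: z.imag))   # approximately [-1j, 1j]
```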