Diagonalizability of $f$ depends on $f \circ f$


Let $V$ be a finite-dimensional vector space over $\mathbb{C}$ and let $f:V\rightarrow V$ be an invertible linear transformation. Prove that $f$ is diagonalizable if and only if $f^2$ (meaning $f\circ f$) is diagonalizable.

I tried considering the characteristic polynomial of the transformation, since $f$ is triangularizable (as is every endomorphism over $\mathbb{C}$), meaning its eigenvalues are just the entries on the diagonal, but I'm not sure if that helps...


On BEST ANSWER

Hint: if $AfA^{-1}$ is diagonal, then $Af^2A^{-1}=(AfA^{-1})(AfA^{-1})$ is a product of two diagonal matrices, and therefore also diagonal.

The other direction is similar, but you have to use the fact that $f$ is invertible.
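As a quick sanity check of both directions of the hint, here is a small illustration with sympy; the two matrices are my own examples, not part of the hint:

```python
import sympy as sp

# A diagonal (hence diagonalizable) matrix, and an invertible
# but non-diagonalizable Jordan block -- illustrative examples only.
D = sp.Matrix([[2, 0], [0, 3]])
J = sp.Matrix([[1, 1], [0, 1]])

print(D.is_diagonalizable(), (D**2).is_diagonalizable())  # both True
print(J.is_diagonalizable(), (J**2).is_diagonalizable())  # both False
```

Note that $J^2 = \begin{pmatrix}1 & 2\\ 0 & 1\end{pmatrix}$ is again a non-diagonalizable Jordan-type matrix, in line with the claim that invertibility prevents the square from becoming diagonalizable on its own.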


Here's one possibility:

A transformation $g$ is diagonalizable if and only if its minimal polynomial is squarefree (since we are working over $\mathbb{C}$, it certainly splits). If $g$ is invertible, we also know that $g$ does not have $0$ as an eigenvalue.

$\Rightarrow)$ If $f$ is diagonalizable, then there is a basis $\beta$ such that ${}[f]_{\beta}^{\beta}$ is diagonal; then ${}[f^2]_{\beta}^{\beta} = ([f]_{\beta}^{\beta})^2$ is the square of a diagonal matrix, hence diagonal.

$\Leftarrow)$ If $f^2$ is diagonalizable, then the minimal polynomial of $f^2$ has no repeated roots, $$m_{f^2}(t) = (t-\alpha_1)\cdots(t-\alpha_k)$$ with $\alpha_1,\ldots,\alpha_k$ pairwise distinct nonzero complex numbers (nonzero because $f$, and hence $f^2$, is invertible). That means that $$(f^2-\alpha_1I)\cdots(f^2-\alpha_kI)=0. \qquad(1)$$ Let $\beta_{i1}$ and $\beta_{i2}$ be the two complex square roots of $\alpha_i$; they are distinct, since $\alpha_i\neq 0$. By (1), $f$ satisfies the polynomial $$(t^2-\alpha_1)\cdots(t^2-\alpha_k) = (t-\beta_{11})(t-\beta_{12})\cdots (t-\beta_{k1})(t-\beta_{k2}),\qquad(2)$$ and the $2k$ roots $\beta_{ij}$ are pairwise distinct because the $\alpha_i$ are. Thus, the minimal polynomial of $f$ divides the polynomial (2), which is squarefree. Hence the minimal polynomial of $f$ is squarefree, and so $f$ is diagonalizable, as claimed.
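The squarefree criterion can also be checked computationally. The sketch below is my own helper, not part of the answer: it tests diagonalizability by evaluating the squarefree part of the characteristic polynomial at the matrix, which gives the zero matrix exactly when the minimal polynomial is squarefree (the squarefree part has the same roots as the minimal polynomial).

```python
import sympy as sp

t = sp.symbols('t')

def diagonalizable_via_squarefree(A):
    """A is diagonalizable iff the squarefree part of its characteristic
    polynomial evaluates to the zero matrix at A."""
    p = A.charpoly(t).as_expr()
    q = sp.quo(p, sp.gcd(p, sp.diff(p, t)), t)  # squarefree part of p
    n = A.shape[0]
    M = sp.zeros(n, n)
    for c in sp.Poly(q, t).all_coeffs():        # Horner evaluation of q(A)
        M = M * A + c * sp.eye(n)
    return M == sp.zeros(n, n)

J = sp.Matrix([[2, 1], [0, 2]])                 # a single Jordan block
print(diagonalizable_via_squarefree(J))         # False
print(diagonalizable_via_squarefree(J**2))      # False: the square is no better
```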


Since $f$ is a linear transformation and $V$ is finite-dimensional, this problem is equivalent to the following matrix-theoretic problem.

Let $A \in \mathsf{M}_n(\mathbb{C})$ be invertible. Prove that $A$ is diagonalizable if and only if $A^2$ is diagonalizable.

If $A$ is diagonalizable, then there is a diagonal matrix $D$ and an invertible matrix $S$ such that $A = SDS^{-1}$. Moreover, $A^2 = SD^2S^{-1}$ is diagonalizable since $D^2$ is a diagonal matrix, which proves the forward direction. (Notice that the assumption of invertibility is unnecessary for this direction.)
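The forward direction can be illustrated numerically; the particular $S$ and $D$ below are made up for the example, assuming NumPy:

```python
import numpy as np

# Hypothetical example: build A = S D S^{-1} from a random invertible S.
rng = np.random.default_rng(0)
S = rng.standard_normal((3, 3))
D = np.diag([1.0, 2.0, 3.0])
A = S @ D @ np.linalg.inv(S)

# The same S diagonalizes A^2, with the diagonal entries squared.
assert np.allclose(A @ A, S @ D**2 @ np.linalg.inv(S))
```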

The converse is a little trickier, but it is a consequence of the following result, from Higham's Functions of Matrices:

[Image: Theorem 1.36 of Higham's Functions of Matrices, on the Jordan structure of $f(A)$: if $f'(\lambda) \ne 0$, a Jordan block of $A$ with eigenvalue $\lambda$ corresponds to a single Jordan block of $f(A)$ of the same size.]

Essentially, if $\lambda$ is an eigenvalue of $A$, then a Jordan block with eigenvalue $\lambda$ will not "split" when $f'(\lambda) \ne 0$. Here is where the assumption of invertibility is crucial: in the theorem above, $f(z) = z^2$ and $f'(z) = 2z$, so $f'(z) = 0$ if and only if $z = 0$, and $0$ is not an eigenvalue of an invertible matrix. For contradiction, suppose that $A^2$ is diagonalizable and $A$ is not. Then $A$ has a Jordan block of size greater than $1$. By Theorem 1.36 above, $A^2$ then has a Jordan block of the same size, which contradicts the diagonalizability of $A^2$.
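To see the block-preservation phenomenon concretely, here is a small sympy check on a single $3\times 3$ Jordan block with nonzero eigenvalue (my own example): squaring it leaves one Jordan block of the same size.

```python
import sympy as sp

lam = 2                                    # nonzero, so f'(lam) = 2*lam != 0
J = sp.Matrix([[lam, 1,   0],
               [0,   lam, 1],
               [0,   0,   lam]])           # one 3x3 Jordan block
B = J**2                                   # eigenvalue lam^2 = 4

# B - 4I has rank 2, so B is a single 3x3 Jordan block for the
# eigenvalue 4: the block did not split, and B is not diagonalizable.
print((B - lam**2 * sp.eye(3)).rank())     # 2
print(B.is_diagonalizable())               # False
```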