Can we say that the matrix $\begin{bmatrix} A & A \\ 0 & A \end{bmatrix}$ is diagonalizable if and only if $A = 0$?


Consider the block matrix

$$\begin{bmatrix} A & A \\ 0 & A \end{bmatrix}$$

where $A$ is an $n\times n$ complex matrix.

Can we say that this matrix is diagonalizable if and only if $A = 0$?

There are 4 answers below.

Answer 1:

HINT: A matrix $B$ is *unitarily* diagonalizable if and only if $B$ is normal ($BB^*=B^*B$). Beware that plain diagonalizability is weaker: $B$ is diagonalizable over $\mathbb{C}$ if and only if its minimal polynomial has simple roots.
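For contrast, here is a small numpy check (my own illustration, not part of the hint) showing that a diagonalizable matrix need not be normal, so normality alone cannot settle the question:

```python
import numpy as np

# B is upper triangular with distinct eigenvalues 1 and 2,
# hence diagonalizable -- but it is not normal.
B = np.array([[1.0, 1.0],
              [0.0, 2.0]])

# Distinct eigenvalues => diagonalizable.
print(sorted(np.linalg.eigvals(B).real))   # the eigenvalues 1 and 2

# Normality test: does B B* equal B* B?
is_normal = np.allclose(B @ B.conj().T, B.conj().T @ B)
print(is_normal)   # False: diagonalizable but not normal
```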

Answer 2:

The answer to your question is yes. Writing $M$ for the block matrix above: of course if $A = 0$ then $M$ is diagonalizable, but the trick is to prove that the converse holds.

One way to see this is as follows: a matrix $M$ of size $2n$ is diagonalizable if and only if, for every scalar $\lambda$, $(M - \lambda I)^{2n}v = 0$ implies $(M - \lambda I)v = 0$; in other words, every generalized eigenvector is an eigenvector.
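The criterion is visible already on the smallest example, the $2\times 2$ nilpotent Jordan block (my own illustration):

```python
import numpy as np

# Jordan block: its only eigenvalue is 0, so take lambda = 0.
N = np.array([[0.0, 1.0],
              [0.0, 0.0]])
v = np.array([0.0, 1.0])

# v is a generalized eigenvector: (N - 0*I)^2 v = 0 ...
print(np.allclose(np.linalg.matrix_power(N, 2) @ v, 0))   # True

# ... but not an eigenvector: (N - 0*I) v != 0,
# so by the criterion N is not diagonalizable.
print(np.allclose(N @ v, 0))                              # False
```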

Let $M$ be your matrix. For a vector $v = (x,y)$, we find (using block-matrix multiplication) that $$ Mv = \pmatrix{A&A\\0&A}\pmatrix{x\\y} = \pmatrix{A(x+y)\\Ay} $$ Now, let $y$ be an eigenvector of $A$ associated with eigenvalue $\lambda$. Taking $v = (0,y)$, we have $$ Mv = \pmatrix{Ay\\Ay} = \lambda \pmatrix{y\\y} \implies (M - \lambda I)v = \lambda \pmatrix{y\\y} - \lambda \pmatrix{0\\y} = \pmatrix{\lambda y\\0},\\ (M - \lambda I)^2 \pmatrix{0\\y} = (M - \lambda I) \pmatrix{\lambda y\\0} = 0. $$ If $\lambda \neq 0$, then $(M - \lambda I)v = (\lambda y, 0) \neq 0$ while $(M - \lambda I)^2 v = 0$, so by the criterion above $M$ cannot be diagonalizable if $A$ has any non-zero eigenvalue.

Now, suppose that $A$ has only the eigenvalue zero. Then $M$ has only the eigenvalue zero as well (its characteristic polynomial is the square of that of $A$). However, a diagonalizable matrix whose only eigenvalue is zero must be the zero matrix. So, if $A$ has only zero eigenvalues and $M$ is diagonalizable, then $M = 0$, which in turn means that $A = 0$.

So, the conclusion holds: if $M$ is diagonalizable, then $A$ must be zero.
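The argument above can be checked numerically. In this sketch (my own illustration; the sample $A$ is an arbitrary choice with a nonzero eigenvalue) we take $\lambda = 2$, $y = e_1$, and $v = (0, y)$ as in the proof:

```python
import numpy as np

# A has nonzero eigenvalues 2 and 3; build M = [[A, A], [0, A]].
A = np.array([[2.0, 0.0],
              [0.0, 3.0]])
n = A.shape[0]
M = np.block([[A, A], [np.zeros((n, n)), A]])

# y = e1 is an eigenvector of A for lambda = 2; take v = (0, y).
lam = 2.0
v = np.array([0.0, 0.0, 1.0, 0.0])

B = M - lam * np.eye(2 * n)
print(np.allclose(B @ B @ v, 0))   # True:  v is a generalized eigenvector
print(np.allclose(B @ v, 0))       # False: v is not an eigenvector

# Equivalently: the geometric multiplicity of lambda = 2 is
# 2n - rank(B) = 1, strictly less than its algebraic multiplicity 2,
# so M is not diagonalizable.
print(2 * n - np.linalg.matrix_rank(B))   # 1
```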


Another way to see this is to note that your matrix can be expressed as $$ M = \pmatrix{1&1\\0&1} \otimes A, $$ where $\otimes$ denotes the Kronecker product, and then to use the properties of the Kronecker product. We could also use the Kronecker product to simplify notation in the proof outlined above.
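The Kronecker factorization is easy to confirm with numpy (the sample $A$ below is an arbitrary choice of mine):

```python
import numpy as np

# The Jordan-type factor and a sample A.
J = np.array([[1.0, 1.0],
              [0.0, 1.0]])
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])
n = A.shape[0]

M = np.block([[A, A], [np.zeros((n, n)), A]])

# M coincides with the Kronecker product J (x) A.
print(np.allclose(M, np.kron(J, A)))   # True
```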

Answer 3:

One direction is trivial. Suppose conversely that $$ A'=\begin{bmatrix} A & A \\ 0 & A \end{bmatrix} $$ is diagonalizable. Since $A'$ is block upper triangular, its characteristic polynomial is the square of the characteristic polynomial of $A$. Hence $A'$ has the same eigenvalues as $A$, with doubled algebraic multiplicities.

Suppose $\lambda$ is an eigenvalue of $A$, of algebraic multiplicity $m$; its algebraic multiplicity for $A'$ is then $2m$. Since $A'$ is diagonalizable, the geometric multiplicity of $\lambda$ equals the algebraic one, so the rank of $A'-\lambda I_{2n}$ is $2n-2m$.

Perform Gaussian elimination on $A-\lambda I_n$ to get a row echelon form $U$. Then we can apply the same row operations to $A'-\lambda I_{2n}$, first to the top $n$ rows and then to the bottom $n$ rows (with the indices shifted by $n$), obtaining $$ \begin{bmatrix} U & B \\ 0 & U \end{bmatrix} $$ If $k$ is the rank of $A-\lambda I_n$, then this matrix has $k$ linearly independent columns among columns $1$ to $n$ and at least $k$ linearly independent columns among columns $n+1$ to $2n$. The combined $2k$ columns are still linearly independent. So the rank is at least $2k$.

Hence $2n-2m\ge 2k$, so $n-m\ge k$. Thus $m\le n-k$; but $n-k$ is the geometric multiplicity of $\lambda$ as an eigenvalue of $A$ and we know that $n-k\le m$. Thus $n-k=m$.

This holds for every eigenvalue, so $A$ itself is diagonalizable. Suppose $D=S^{-1}AS$ is diagonal. Then it's a simple verification that $$ \begin{bmatrix} S & 0 \\ 0 & S \end{bmatrix}^{-1}= \begin{bmatrix} S^{-1} & 0 \\ 0 & S^{-1} \end{bmatrix} $$ and $$ \begin{bmatrix} S^{-1} & 0 \\ 0 & S^{-1} \end{bmatrix} \begin{bmatrix} A & A \\ 0 & A \end{bmatrix} \begin{bmatrix} S & 0 \\ 0 & S \end{bmatrix}= \begin{bmatrix} D & D \\ 0 & D \end{bmatrix} $$ This last matrix is diagonalizable as well, because it is similar to $A'$.

Let $\lambda$ be a nonzero eigenvalue of $A$, of algebraic multiplicity $m$. It is not restrictive to assume that $\lambda$ occupies places $(1,1),(2,2),\dots,(m,m)$ of the matrix $D$. It's simple to observe that $$ \begin{bmatrix} D & D \\ 0 & D \end{bmatrix}-\lambda I_{2n} $$ has rank $2n-m$, whereas diagonalizability would force rank $2n-2m$: a contradiction. Therefore the only eigenvalue of $A$ is zero, so $D=0$ and also $A=0$.
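The rank count can be verified on a concrete case (a sketch with eigenvalues chosen by me for illustration: $\lambda = 2$ of multiplicity $m = 2$, plus two other nonzero eigenvalues, so $n = 4$):

```python
import numpy as np

# Diagonal D with nonzero eigenvalue lam = 2 of multiplicity m = 2.
lam, m = 2.0, 2
D = np.diag([lam, lam, 5.0, 7.0])
n = D.shape[0]

T = np.block([[D, D], [np.zeros((n, n)), D]]) - lam * np.eye(2 * n)

# The rank is 2n - m = 6, not the 2n - 2m = 4 that diagonalizability
# of [[D, D], [0, D]] would require.
print(np.linalg.matrix_rank(T))   # 6
```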

Answer 4:

Let $m$ be the minimal polynomial of $A$ and $M=\begin{pmatrix}A&A\\0&A\end{pmatrix}$. An easy induction gives $M^k=\begin{pmatrix}A^k&kA^k\\0&A^k\end{pmatrix}$, and by linearity $m(M)=\begin{pmatrix}m(A)&Am'(A)\\0&m(A)\end{pmatrix}$.

Thus $m^2(M)=m(M)^2=0$, so the minimal polynomial of $M$ divides $m^2$; since $M$ is diagonalizable, its minimal polynomial has simple roots, hence it divides the radical of $m$, and in particular $m(M)=0$. Suppose that $m$ had a multiple root; then its radical $p$ (the product of its distinct linear factors) would strictly divide $m$ and satisfy $p(M)=0$; looking at the diagonal blocks, that implies $p(A)=0$, contradicting the minimality of $m$.

Finally, $m$ has simple roots, and $m(M)=0$ implies that $Am'(A)=0$; since $m$ has simple roots, $\gcd(m,m')=1$, so $m'(A)$ is invertible, whence $A=0$. $\square$
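The key block identity $m(M)=\begin{pmatrix}m(A)&Am'(A)\\0&m(A)\end{pmatrix}$ can be sanity-checked numerically. In this sketch the sample $A$ and its minimal polynomial $m(x)=(x-2)(x-3)=x^2-5x+6$ (so $m'(x)=2x-5$) are my own choices:

```python
import numpy as np

# Sample A with minimal polynomial m(x) = x^2 - 5x + 6 (simple roots 2, 3).
A = np.array([[2.0, 0.0],
              [0.0, 3.0]])
n = A.shape[0]
M = np.block([[A, A], [np.zeros((n, n)), A]])

def poly(B, coeffs):
    """Evaluate a polynomial (coefficients highest degree first) at B
    by Horner's scheme."""
    R = np.zeros_like(B)
    for c in coeffs:
        R = R @ B + c * np.eye(B.shape[0])
    return R

m_coeffs  = [1.0, -5.0, 6.0]   # x^2 - 5x + 6
dm_coeffs = [2.0, -5.0]        # 2x - 5

mA  = poly(A, m_coeffs)        # = 0, since m annihilates A
top = A @ poly(A, dm_coeffs)   # the off-diagonal block A m'(A)

# m(M) has the predicted block form [[m(A), A m'(A)], [0, m(A)]].
predicted = np.block([[mA, top], [np.zeros((n, n)), mA]])
print(np.allclose(poly(M, m_coeffs), predicted))   # True
```

Note that here $Am'(A)\neq 0$, so $m(M)\neq 0$: consistent with the proof, since this $M$ is not diagonalizable.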