Diagonalization of a block matrix


I've just started learning about eigenvectors, eigenvalues and similar matrices, so I apologize if this question is simple.

I have an $n \times n$ matrix $M$, which is diagonalizable. I have to show that the following matrix is also diagonalizable:

$$\begin{bmatrix}M&-M\\-M&M\end{bmatrix}$$

Since $M$ is diagonalizable, I can write it as $M=SDS^{-1}$, where $D$ is diagonal. Therefore the matrix above can be written as:

$$\begin{bmatrix}SDS^{-1}&-SDS^{-1}\\-SDS^{-1}&SDS^{-1}\end{bmatrix}$$

I know I can write this as a product of three matrices, where the first is block diagonal with $S$ in both diagonal blocks and $0$ everywhere else, and the third is the inverse of the first. In the middle I have $$\begin{bmatrix}D&-D\\-D&D\end{bmatrix}$$ But this isn't a diagonal matrix; it is only similar to the matrix I had in the beginning. From what I understand, the diagonal elements of $D$ are the eigenvalues of $M$ and the columns of $S$ are its eigenvectors. Since $M$ is diagonalizable, it has $n$ linearly independent eigenvectors. I'm not exactly sure how to use this information.

If I continue with this approach, I believe I only have to find a diagonalization for the matrix I listed last.
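This reduction can be checked numerically. The following is a sketch (with a randomly generated diagonalizable $M$, not data from the question) verifying that conjugating the block matrix by $P=\operatorname{blkdiag}(S,S)$ produces the middle matrix above:

```python
import numpy as np

# Sketch with random illustrative data: build a diagonalizable M = S D S^{-1},
# then check that P = blkdiag(S, S) conjugates [[M,-M],[-M,M]] into [[D,-D],[-D,D]].
rng = np.random.default_rng(0)
S = rng.standard_normal((4, 4))        # generically invertible
D = np.diag(rng.standard_normal(4))    # diagonal matrix of eigenvalues
M = S @ D @ np.linalg.inv(S)           # diagonalizable M

A = np.block([[M, -M], [-M, M]])
Z = np.zeros((4, 4))
P = np.block([[S, Z], [Z, S]])         # block diagonal, S in both blocks

middle = np.linalg.inv(P) @ A @ P
expected = np.block([[D, -D], [-D, D]])
assert np.allclose(middle, expected)
```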

That last matrix is symmetric, and I could prove that every symmetric matrix is diagonalizable, but I haven't found a satisfactory proof of that.
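The symmetric route can also be finished concretely. Here is a numerical sketch (with an arbitrarily chosen diagonal $D$, not from the question): since $\begin{bmatrix}D&-D\\-D&D\end{bmatrix}$ is symmetric, `numpy.linalg.eigh` orthogonally diagonalizes it, and after pairing coordinate $i$ with coordinate $n+i$ it splits into $2\times 2$ blocks $\begin{bmatrix}d_i&-d_i\\-d_i&d_i\end{bmatrix}$, so its eigenvalues are $0$ and $2d_i$:

```python
import numpy as np

# Sketch with illustrative diagonal entries: the middle matrix [[D,-D],[-D,D]]
# is symmetric; eigh diagonalizes it orthogonally, and the spectrum is
# {0 (n times)} together with {2*d_i} for each diagonal entry d_i of D.
d = np.array([1.0, -3.0, 0.5])
D = np.diag(d)
B = np.block([[D, -D], [-D, D]])

vals, vecs = np.linalg.eigh(B)                      # ascending eigenvalues
expected = np.sort(np.concatenate([np.zeros(3), 2 * d]))
assert np.allclose(vals, expected)
assert np.allclose(vecs @ np.diag(vals) @ vecs.T, B)  # B = Q diag(vals) Q^T
```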


There are 2 answers below.

Answer 1

You can prove that a symmetric matrix is diagonalizable using the spectral theorem. But you can also answer your question directly.

If you prove that there exists a basis of $k^{2n}$ formed by eigenvectors of the matrix $$ A:=\begin{bmatrix} M & -M \\ -M & M \end{bmatrix} $$ then you are done. You know that $M$ is diagonalizable, so there exists a basis $\{v_1,\dots,v_n\}$ of $k^n$ made of eigenvectors of $M$, say $Mv_i=\alpha_i v_i$. Now consider the vectors in $k^{2n}$ $$ \{w_1,\dots,w_n,u_1,\dots,u_n\}, $$ where, for $i=1,\dots,n$, $$ w_i:=\begin{pmatrix} v_i\\ v_i \end{pmatrix},\, \, \, u_i:=\begin{pmatrix} v_i\\ -v_i \end{pmatrix}. $$ (With this notation, I mean that in coordinates you write the coordinates of $v_i$ followed by the coordinates of $v_i$ for the $w_i$'s, and the coordinates of $v_i$ followed by the coordinates of $-v_i$ for the $u_i$'s.)

Because the $v_i$'s are eigenvectors of the matrix $M$, it is straightforward to check by computation that the vectors $w_1,\dots,w_n,u_1,\dots,u_n$ are eigenvectors of the matrix $A$: indeed $Aw_i=0$ and $Au_i=2\alpha_i u_i$.

If we show that the $2n$ vectors $\{w_1,\dots,w_n,u_1,\dots,u_n\}$ are linearly independent, we will be done. Suppose we have a linear combination $$ \lambda_1 w_1+\dots+\lambda_n w_n+\mu_1 u_1+\dots+\mu_n u_n=0, $$ that is, with the previous notation, $$ \lambda_1 \begin{pmatrix} v_1\\ v_1 \end{pmatrix}+\dots+\lambda_n \begin{pmatrix} v_n\\ v_n \end{pmatrix}+\mu_1 \begin{pmatrix} v_1\\ -v_1 \end{pmatrix}+\dots+\mu_n \begin{pmatrix} v_n\\ -v_n \end{pmatrix}=0. $$ Reading off the first $n$ coordinates gives $\sum_i(\lambda_i+\mu_i)v_i=0$, and the last $n$ coordinates give $\sum_i(\lambda_i-\mu_i)v_i=0$. Because the $v_i$'s are linearly independent (as vectors of $k^n$), we find $\lambda_i+\mu_i=0$ and $\lambda_i-\mu_i=0$ for every $i$, hence $$ \lambda_1=\dots=\lambda_n=\mu_1=\dots=\mu_n=0. $$
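A quick numerical sanity check of this construction, as a sketch with randomly generated data (not part of the original answer): the vectors $(v_i, v_i)$ lie in the kernel of $A$, while the vectors $(v_i, -v_i)$ are eigenvectors of $A$ with eigenvalue $2\alpha_i$.

```python
import numpy as np

# Sketch with random illustrative data: verify the eigenvector families.
rng = np.random.default_rng(1)
S = rng.standard_normal((3, 3))            # columns are eigenvectors v_i of M
alphas = np.array([1.0, -2.0, 0.5])        # chosen eigenvalues of M
M = S @ np.diag(alphas) @ np.linalg.inv(S)
A = np.block([[M, -M], [-M, M]])

for i in range(3):
    v = S[:, i]
    w = np.concatenate([v, v])             # w_i = (v_i, v_i)
    u = np.concatenate([v, -v])            # u_i = (v_i, -v_i)
    assert np.allclose(A @ w, np.zeros(6))         # A w_i = 0
    assert np.allclose(A @ u, 2 * alphas[i] * u)   # A u_i = 2*alpha_i*u_i
```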

Answer 2

If $v$ is an eigenvector of $M$ with associated eigenvalue $\lambda$, then

$$\begin{bmatrix} M & -M \\ -M & M \end{bmatrix} \begin{bmatrix} v \\ -v \end{bmatrix} = 2 \lambda \begin{bmatrix} v \\ -v \end{bmatrix}$$

In other words, $\begin{bmatrix} v \\ -v \end{bmatrix}$ is an eigenvector of the block matrix with associated eigenvalue $2 \lambda$. Since $M$ is diagonalizable, its $n$ linearly independent eigenvectors yield $n$ linearly independent eigenvectors of this form.

Now notice that for any $v \in \mathbb{R}^n$, we have

$$\begin{bmatrix} M & -M \\ -M & M \end{bmatrix} \begin{bmatrix} v \\ v \end{bmatrix} = 0$$

This gives us an $n$-dimensional subspace of eigenvectors associated to the eigenvalue $0$. Such vectors are linearly independent of the previous $n$ eigenvectors, since a vector of the form $(v, -v)$ equals one of the form $(w, w)$ only when both are zero. In all, that gives us an eigenbasis of $\mathbb{R}^{2n}$, and we're done.
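The two families in this answer can be verified numerically. Here is a sketch with a randomly chosen diagonalizable $M$ (illustrative data, not from the answer), confirming the eigenvalue $2\lambda_i$ family, the kernel family, and that together they form an eigenbasis of $\mathbb{R}^{2n}$:

```python
import numpy as np

# Sketch with random illustrative data.
rng = np.random.default_rng(2)
S = rng.standard_normal((3, 3))
lams = np.array([2.0, -1.0, 3.0])
M = S @ np.diag(lams) @ np.linalg.inv(S)   # diagonalizable, eigenvectors S[:, i]
A = np.block([[M, -M], [-M, M]])

# Family 1: (v_i, -v_i) has eigenvalue 2*lambda_i.
for i in range(3):
    u = np.concatenate([S[:, i], -S[:, i]])
    assert np.allclose(A @ u, 2 * lams[i] * u)

# Family 2: (v, v) lies in the kernel for ANY v, e.g. the standard basis.
ker = np.column_stack([np.concatenate([e, e]) for e in np.eye(3)])
assert np.allclose(A @ ker, 0)

# Together the 2n = 6 vectors form an eigenbasis of R^6.
basis = np.column_stack(
    [np.concatenate([S[:, i], -S[:, i]]) for i in range(3)]
    + [ker[:, j] for j in range(3)]
)
assert np.linalg.matrix_rank(basis) == 6
```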