Is there a necessary and sufficient condition for a square matrix to be able to diagonalize a symmetric square matrix?


Suppose I have a matrix

$A = \begin{bmatrix} l & l_m\\ l_m & l \end{bmatrix}$

if I want to diagonalize $A$ using $T$:

$\begin{bmatrix} a & 0\\ 0 & b \end{bmatrix} = T^{-1}AT$

is there a necessary and sufficient condition on $T$ that ensures the diagonalization?

To make the question more concrete: I'd like a condition that lets me tell at a glance, given two matrices $B$ and $C$, which of the two diagonalizes $A$.

If this is not possible at a glance, I'd like a faster way to check whether $B$ or $C$ diagonalizes $A$ without computing the full matrix product $T^{-1}AT$. Is there one?

I only need to do this for $2 \times 2$ matrices, and I can assume $A$ is always symmetric and positive definite.


BEST ANSWER

Assuming $l_m \neq 0$ (so the eigenvalues of $A$ are distinct), $T$ diagonalizes $A$ if and only if it is of the form $\bigg(\begin{matrix} u & v \\ \pm u & \mp v \end{matrix}\bigg)$, with $u \neq 0$ and $v \neq 0$. (If $l_m = 0$, then $A$ is already diagonal and any invertible $T$ works.)

Proof:

The matrix $A = \bigg(\begin{matrix} l & l_m \\ l_m & l \end{matrix}\bigg)$ has two eigenvalues:

  • $(l + l_m)$, with eigenvector $\bigg(\begin{matrix} 1 \\ 1 \end{matrix}\bigg)$

  • $(l - l_m)$, with eigenvector $\bigg(\begin{matrix} 1 \\ -1 \end{matrix}\bigg)$

Thus, if $T$ diagonalizes $A$, its columns must be (nonzero) eigenvectors of $A$, so $T$ must be of the form $\bigg(\begin{matrix} u & v \\ \pm u & \mp v \end{matrix}\bigg)$, with $u \neq 0$ and $v \neq 0$.

Conversely, if $T$ is of the form $\bigg(\begin{matrix} u & v \\ \pm u & \mp v \end{matrix}\bigg)$, with $u \neq 0$ and $v \neq 0$,

then $AT = \bigg(\begin{matrix} u(l \pm l_m) & v(l \mp l_m) \\ \pm u(l \pm l_m) & \mp v(l \mp l_m) \end{matrix}\bigg)$

and since $T\bigg(\begin{matrix} (l \pm l_m) & 0 \\ 0 & (l \mp l_m) \end{matrix}\bigg) = \bigg(\begin{matrix} u(l \pm l_m) & v(l \mp l_m) \\ \pm u(l \pm l_m) & \mp v(l \mp l_m) \end{matrix}\bigg)$,

we get $AT = T\bigg(\begin{matrix} (l \pm l_m) & 0 \\ 0 & (l \mp l_m) \end{matrix}\bigg)$, i.e. $T$ diagonalizes $A$. $\blacksquare$
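As a numerical sanity check, the at-a-glance criterion above (each column of $T$ proportional to $(1,1)$ or $(1,-1)$, with coupled signs) can be compared against the brute-force test of computing $T^{-1}AT$. This is a minimal sketch; the function names and the sample values $l = 3$, $l_m = 1$ are mine, not from the answer.

```python
import numpy as np

def diagonalizes(T, A, tol=1e-9):
    """Brute-force check: is T^{-1} A T diagonal (up to tolerance)?"""
    D = np.linalg.inv(T) @ A @ T
    off_diagonal = D - np.diag(np.diag(D))
    return np.abs(off_diagonal).max() < tol

def at_a_glance(T, tol=1e-9):
    """Sign-pattern check from the answer: one column must be a nonzero
    multiple of (1, 1) and the other a nonzero multiple of (1, -1)."""
    (a, b), (c, d) = T
    nonzero = abs(a) > tol and abs(b) > tol
    case_top = abs(a - c) < tol and abs(b + d) < tol    # cols (u, u), (v, -v)
    case_bot = abs(a + c) < tol and abs(b - d) < tol    # cols (u, -u), (v, v)
    return nonzero and (case_top or case_bot)

l, lm = 3.0, 1.0
A = np.array([[l, lm], [lm, l]])
T_good = np.array([[2.0, 5.0], [2.0, -5.0]])   # columns ∝ (1,1) and (1,-1)
T_bad = np.array([[1.0, 0.0], [0.0, 1.0]])     # identity: leaves A unchanged

print(diagonalizes(T_good, A), at_a_glance(T_good))  # True True
print(diagonalizes(T_bad, A), at_a_glance(T_bad))    # False False
```

The `at_a_glance` test only inspects the four entries of $T$, so it avoids the inverse and the matrix products entirely, which is exactly the fast check the question asks for.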


In fact, an $n \times n$ matrix is diagonalizable if and only if its eigenvectors span all of $\Bbb R^n$. If $v_1, v_2$ are linearly independent eigenvectors of $A$ with eigenvalues $\lambda_1, \lambda_2$, then$$T=\begin{bmatrix}v_1 & v_2\end{bmatrix}$$and$$T^{-1}AT=\begin{bmatrix}\lambda_1 & 0\\0 & \lambda_2\end{bmatrix}.$$Since $A$ is symmetric, such a basis of eigenvectors always exists.
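This recipe can be illustrated numerically: build $T$ from the eigenvectors of $A$ and verify that $T^{-1}AT$ is diagonal with the eigenvalues on the diagonal. A minimal sketch, with sample values $l = 3$, $l_m = 1$ of my own choosing:

```python
import numpy as np

l, lm = 3.0, 1.0
A = np.array([[l, lm], [lm, l]])

# For a symmetric A, np.linalg.eigh returns the eigenvalues in ascending
# order and orthonormal eigenvectors as columns, so T = [v1 v2] directly.
eigvals, T = np.linalg.eigh(A)

D = np.linalg.inv(T) @ A @ T
print(eigvals)            # ≈ [l - lm, l + lm] = [2, 4]
print(np.round(D, 9))     # ≈ diag(2, 4), off-diagonal entries vanish
```

Because the columns of `T` returned by `eigh` are orthonormal, `np.linalg.inv(T)` could equally be replaced by `T.T` here.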