A basic question on diagonalizability of a matrix


I am following a book where "diagonalizability" has been introduced as follows:

Consider a basis formed by a linearly independent set of eigenvectors $\{v_1,v_2,\dots,v_n\}$. Then it is claimed that, with respect to this basis, the matrix $A$ is diagonal.

I am confused at the word "basis" here. In some other books it is said that a matrix $A$ is called diagonalizable if there exists matrix $P$ such that $P^{-1}AP$ is a diagonal matrix.

I don't think that diagonalizability has anything to do with a basis. It just so happens that the set of eigenvectors forms a basis of $\Bbb R^n$. Also, I don't think having a set of $n$ linearly independent eigenvectors is a necessary condition for a matrix to be diagonalizable.

3 Answers

Accepted Answer

A complex matrix $A\in\mathbb{C}^{n\times n}$ is diagonalizable iff its eigenvectors form a basis of $\mathbb{C}^n$ (note that even real matrices can have complex eigenvectors). That is, there exists a nonsingular matrix $P$ such that $P^{-1}AP=D$, where $D$ is a diagonal matrix containing the eigenvalues of $A$, and the corresponding eigenvectors are the columns of $P$. Also, $P^{-1}AP=D$ means that $AP=PD$, that is, $A$ acts on the basis formed by the columns of $P$ as the diagonal matrix $D$.
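Not part of the original answer, but the relation $P^{-1}AP=D$ (equivalently $AP=PD$) can be checked numerically. A minimal NumPy sketch, using a symmetric matrix chosen only for illustration:

```python
import numpy as np

# A symmetric matrix (chosen for illustration); real symmetric matrices
# are always diagonalizable with real eigenvalues.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# np.linalg.eig returns the eigenvalues w and a matrix P whose
# columns are the corresponding eigenvectors.
w, P = np.linalg.eig(A)
D = np.diag(w)

# P^{-1} A P = D, equivalently A P = P D.
print(np.allclose(np.linalg.inv(P) @ A @ P, D))  # True
print(np.allclose(A @ P, P @ D))                 # True
```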

Some notes:

Not all matrices are diagonalizable. For example nontrivial Jordan blocks: $$ A = \begin{bmatrix} 1 & 1 & & & & \\ & 1 & 1 & & & \\ & & \ddots & \ddots & & \\ & & & 1 & 1 \\ & & & & 1 \end{bmatrix}\in\mathbb{R}^{n\times n} $$ has 1 eigenvalue of multiplicity $n$ but with only one (linearly independent) eigenvector.
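To make the Jordan-block example concrete (this sketch is my addition, for a $3\times 3$ instance): the geometric multiplicity of the eigenvalue $1$ is $\dim\ker(A-I) = n - \operatorname{rank}(A-I)$, which comes out to $1$ even though the algebraic multiplicity is $3$.

```python
import numpy as np

# 3x3 Jordan block with eigenvalue 1: algebraic multiplicity 3,
# but the eigenspace ker(A - I) is only one-dimensional.
A = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [0.0, 0.0, 1.0]])

n = A.shape[0]
rank = np.linalg.matrix_rank(A - np.eye(n))
geometric_multiplicity = n - rank  # dimension of the eigenspace

print(geometric_multiplicity)  # 1 < 3, so A is not diagonalizable
```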

A matrix with $n$ distinct eigenvalues is diagonalizable.
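A quick numerical illustration of this last note (my addition, with an arbitrarily chosen matrix): a non-symmetric matrix with distinct eigenvalues still factors as $A = PDP^{-1}$.

```python
import numpy as np

# Upper-triangular, so the eigenvalues 1, 2, 3 sit on the diagonal;
# they are distinct, hence A is diagonalizable.
A = np.array([[1.0, 5.0, 7.0],
              [0.0, 2.0, 4.0],
              [0.0, 0.0, 3.0]])

w, P = np.linalg.eig(A)

# Distinct eigenvalues => the eigenvectors are linearly independent,
# so P is invertible and A = P D P^{-1}.
print(np.allclose(A, P @ np.diag(w) @ np.linalg.inv(P)))  # True
```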


The two definitions are equivalent: the matrix $A$ is diagonalizable if there is an invertible matrix $P$ such that $$A=PDP^{-1}$$ where $D=\mathrm{diag}(\lambda_1,\cdots,\lambda_n)$ is a diagonal matrix. But this means that $P$ is the change-of-basis matrix, so there is a basis $\mathcal B=(v_1,\ldots,v_n)$, formed by the columns of $P$, in which $A$ is diagonal: $Av_i=\lambda_i v_i$, hence $v_i$ is an eigenvector of $A$ associated to the eigenvalue $\lambda_i$, so $\mathcal B$ is a basis formed by a linearly independent set of eigenvectors.
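This equivalence can be verified directly (a sketch of my own, with an arbitrarily chosen $P$ and $D$): build $A=PDP^{-1}$ and check that each column of $P$ is an eigenvector of $A$.

```python
import numpy as np

# An invertible change-of-basis matrix and a diagonal matrix,
# chosen arbitrarily for illustration.
P = np.array([[1.0,  1.0],
              [1.0, -1.0]])
D = np.diag([3.0, 1.0])

A = P @ D @ np.linalg.inv(P)

# Each column v_i of P satisfies A v_i = lambda_i v_i.
for i in range(2):
    v = P[:, i]
    print(np.allclose(A @ v, D[i, i] * v))  # True, True
```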


The matrix of a linear operator $T$ on a basis $B=(b_1,\ldots,b_n)$ is diagonal if and only if the basis vectors $b_1,\ldots,b_n$ are all eigenvectors of $T$. This is immediate from the definitions of eigenvector and diagonal matrix (just write out what applying $T$ to $b_i$ does).

A linear operator $T$ on $V$ is diagonalisable if on some basis of $V$ its matrix is diagonal. By the above, that means precisely that there exists a basis of $V$ consisting of eigenvectors of $T$, and on such a basis the matrix of $T$ is diagonal. So, unlike what you said in the last paragraph of the question, having $\dim V$ linearly independent eigenvectors is a necessary and sufficient condition for $T$ to be diagonalisable.

If the matrix $A$ of $T$ is given on some (other) basis, then the condition that $T$ (and hence $A$) is diagonalisable is that some basis transformation (namely one from the given basis to a basis of eigenvectors) brings $A$ to diagonal form. A basis transformation is $A\mapsto P^{-1}AP$ for some invertible matrix $P$, which gives your "other" definition of diagonalisable. It has the drawback of hiding the geometric meaning somewhat, and of not being applicable to linear operators without first expressing them on a basis by a matrix.