What are the necessary conditions for a matrix to have a complete set of orthogonal eigenvectors?


From my lecture notes, I learned that an $N\times N$ real symmetric matrix $\mathbf{A}$ has a complete set of $N$ orthogonal eigenvectors $\hat{e}^{k}$, $k=1,\ldots,N$, which can be normalised such that: $$ \forall k : \quad \boldsymbol{A} \hat{e}^{k}=\mu_{k} \hat{e}^{k}, \quad \forall k, k^{\prime} : \quad \hat{e}^{k} \cdot \hat{e}^{k^{\prime}}=\delta_{k k^{\prime}} $$

Here $\left\{\mu_{1}, \ldots, \mu_{N}\right\}$ are the $N$ eigenvalues of $\mathbf{A}$, which are not necessarily distinct.

However, in my case I deal with a random real matrix $\mathbf{A}$ which is not symmetric. What conditions are necessary to ensure that it has a complete set of $N$ orthogonal eigenvectors?

My aim is to use the $N$ eigenvectors as a new basis of $\mathbb{R}^N$ and represent any vector by: $$\vec{a}=\sum_{k=1}^{N} \sigma_{k}\hat{e}^{k}$$ for some coefficients $\left\{\sigma_{1}, \ldots, \sigma_{N}\right\}$. To this end I need to be sure that those eigenvectors indeed form a basis. How can I be sure?
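In the symmetric case this expansion is easy to verify numerically, since orthonormality makes the coefficients just dot products, $\sigma_k = \vec{a}\cdot\hat{e}^k$. A minimal numpy sketch (not from the original question; the random matrix and seed are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)
N = 4
A = rng.standard_normal((N, N))
A = (A + A.T) / 2                 # symmetrize so the spectral theorem applies

mu, E = np.linalg.eigh(A)         # columns of E are orthonormal eigenvectors
a = rng.standard_normal(N)

sigma = E.T @ a                   # sigma_k = a . e^k, valid because E is orthogonal
a_rebuilt = E @ sigma             # a = sum_k sigma_k e^k

assert np.allclose(a, a_rebuilt)  # the eigenvectors form a basis
```

Note that `numpy.linalg.eigh` (for symmetric/Hermitian input) guarantees orthonormal eigenvectors; the general `numpy.linalg.eig` does not.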

Thanks


Best answer

A complete set of $N$ orthogonal eigenvectors implies that $A$ is symmetric.

If $A$ has a complete set of $N$ eigenvectors, then it is diagonalizable. In other words, we may write $$ A = QDQ^{-1} $$ where $Q$ is a matrix with the eigenvectors of $A$ as columns, and $D$ is a diagonal matrix with the corresponding eigenvalues of $A$ (repeated as suitable) along the diagonal.

If the eigenvectors of $A$ are all pairwise orthogonal, and we normalize the eigenvectors so that they all have unit length, this makes $Q$ into an orthogonal matrix ($Q^{-1} = Q^T$). Then we have $$ A = QDQ^T $$ and we see that the right-hand side is symmetric. Therefore $A$ must also be symmetric.
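This argument can be checked numerically: build $A = QDQ^T$ from any orthogonal $Q$ and real diagonal $D$, and symmetry is forced. A small sketch (the random construction is illustrative, not from the answer):

```python
import numpy as np

rng = np.random.default_rng(1)
N = 5

# A random orthogonal matrix Q (orthonormal eigenbasis) via QR factorization
Q, _ = np.linalg.qr(rng.standard_normal((N, N)))
D = np.diag(rng.standard_normal(N))   # real eigenvalues on the diagonal

A = Q @ D @ Q.T                       # matrix with orthonormal eigenvectors

assert np.allclose(A, A.T)            # A is necessarily symmetric
```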

Another answer

$\mathbf{A}$ must be normal. That is, $\mathbf{A}^* \mathbf{A} = \mathbf{A}\mathbf{A}^*$, where $\mathbf{A}^*$ is the adjoint (conjugate transpose) of $\mathbf{A}$. Moreover, normality is not only necessary but also sufficient.

A matrix $\mathbf{A}$ is normal if and only if it is unitarily similar to a diagonal matrix. That is, there is a unitary matrix $\mathbf{U}$ such that $\mathbf{U}^* \mathbf{A}\mathbf{U}$ is diagonal. (This is the so-called "spectral theorem".)

A unitary matrix is one that has orthonormal columns, in the sense $\mathbf{U}^*\mathbf{U} = \mathbf{I}$. So normality is a sufficient condition for a complete set of orthogonal eigenvectors.

Conversely, if a square matrix has a complete set of orthogonal eigenvectors, normalize them (orthonormalizing within each eigenspace if needed, which preserves the eigenvector property), and construct the unitary matrix $\mathbf{U}$ with the resulting orthonormal vectors as columns. Then $\mathbf{U}^* \mathbf{A} \mathbf{U}$ is diagonal, and so $\mathbf{A}$ is normal.

Examples

The 2D rotation through $\pi / 2$ CCW is $$ \left[ \begin{matrix} 0 & -1 \\ 1 & 0 \end{matrix} \right] $$ it has eigenvalues $\pm i$, corresponding to eigenvectors $x_1 = (1, -i)$, $x_2 = (1, i )$, which are orthogonal in the sense $x_1 \cdot \overline{x_2} = 0$.
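This example is quick to confirm with numpy (a sketch, not part of the original answer): the rotation matrix is normal, its eigenvalues are $\pm i$, and the two complex eigenvectors are orthogonal under the Hermitian inner product.

```python
import numpy as np

R = np.array([[0.0, -1.0],
              [1.0,  0.0]])        # rotation through pi/2, CCW

# R is normal: R^T R == R R^T
assert np.allclose(R.T @ R, R @ R.T)

mu, V = np.linalg.eig(R)           # complex eigenvalues +/- i
assert np.allclose(np.sort_complex(mu), [-1j, 1j])

# Eigenvectors are orthogonal under the complex inner product x1 . conj(x2)
assert abs(np.vdot(V[:, 0], V[:, 1])) < 1e-10
```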

Matrices which fail to have a complete set of eigenvectors are the defective ones, whose Jordan form contains a nontrivial Jordan block, i.e. a shear.
The simplest example is $$ \left[ \begin{matrix} 1 & 1 \\ 0 & 1 \end{matrix} \right] $$ which has 1 as its only eigenvalue, with only $(1, 0)$ as an eigenvector (up to scale). (A second, generalized eigenvector such as $(0,1)$ satisfies $(\mathbf{A}-\mathbf{I})^2 v = 0$.)
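The defectiveness is visible numerically (a sketch, with an illustrative rank tolerance): the eigenvector matrix returned for the shear is singular, because there is no second independent eigenvector.

```python
import numpy as np

S = np.array([[1.0, 1.0],
              [0.0, 1.0]])         # shear: defective, not diagonalizable

mu, V = np.linalg.eig(S)
assert np.allclose(mu, [1.0, 1.0])                  # eigenvalue 1, multiplicity 2

# The eigenvector matrix is (numerically) rank 1: no second eigenvector
assert np.linalg.matrix_rank(V, tol=1e-8) == 1

# Consistent with the criterion above: the shear is not normal
assert not np.allclose(S.T @ S, S @ S.T)
```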

Note that the rotation matrix above is normal; the shear matrix is not. There are several useful classes of matrices that are normal.

Besides the rotation matrices, symmetric matrices are one such class. Symmetric real matrices have real eigenvalues, which answers a special case of the original question.

Concerning your aim: you seem to be operating under the misapprehension that real matrices have real eigenvalues, or perhaps that useful real matrices have real eigenvalues.

Generally, eigenanalysis is a useful tool, one of the first ways one learns to analyze a matrix, but there are many other ways to analyze a matrix. You might have a look at advanced texts by, for example, Golub & van Loan, by Stewart, or by Shilov.