Gram-Schmidt process for finding orthonormal eigenvectors


I was clearing up my doubts about eigenvectors and eigenvalues, and in one of my books, the eigenvectors of a degenerate eigenvalue were linearly independent, and the book turned them into orthonormal eigenvectors by the Gram-Schmidt process.

My questions are:

  1. Any matrix with distinct eigenvalues has linearly independent eigenvectors. Can we take those eigenvectors and make them orthonormal by this process?

  2. If so, do we prefer an orthonormal basis for expanding a vector just because it makes it easy to find the coefficient associated with each eigenvector? Linearly independent eigenvectors would do the job if we normalized them using this process.

Edit: I know this would be absurd in large dimensions, but am I thinking about it in the right way?
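Regarding question 2, the practical payoff of an orthonormal basis can be sketched numerically. The example below is a hypothetical 3-dimensional illustration (the bases are random, not eigenvectors of any particular matrix): with an orthonormal basis the coefficients are plain dot products, while a merely independent basis requires solving a linear system.

```python
import numpy as np

# Hypothetical 3-D example: an orthonormal basis vs. a merely independent one.
rng = np.random.default_rng(0)
x = rng.standard_normal(3)

# Orthonormal basis (columns of Q): coefficients are just dot products c_i = <q_i, x>.
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))
coeffs_ortho = Q.T @ x

# General independent basis (columns of B): must solve the system B c = x.
B = rng.standard_normal((3, 3))
coeffs_general = np.linalg.solve(B, x)

# Both reconstruct x, but the orthonormal case needed no linear solve.
assert np.allclose(Q @ coeffs_ortho, x)
assert np.allclose(B @ coeffs_general, x)
```

This is the "finding the constants" convenience the question describes: an inner product per coefficient instead of a full solve.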


There are two answers below.

Answer 1:

In general, the process will destroy the property of being eigenvectors. However, we have this result: a real matrix has an orthonormal basis of eigenvectors if and only if it is a symmetric matrix.
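A minimal NumPy sketch of this failure, using an illustrative non-symmetric matrix of my own choosing: after orthonormalizing the eigenvectors, the second vector is no longer an eigenvector.

```python
import numpy as np

# A non-symmetric but diagonalizable matrix (an illustrative choice):
# its eigenvectors (1,0) and (1,1)/sqrt(2) are not orthogonal.
A = np.array([[1.0, 1.0],
              [0.0, 2.0]])
vals, vecs = np.linalg.eig(A)      # eigenvectors are the columns of vecs

# Orthonormalize the eigenvectors via QR (equivalent to Gram-Schmidt).
Q, _ = np.linalg.qr(vecs)

# Gram-Schmidt keeps the direction of the first vector, so it is still an
# eigenvector -- but the second column of Q no longer is one.
w = Q[:, 1]
print(np.allclose(A @ w, (w @ A @ w) * w))  # False: w is not an eigenvector
```

So applying Gram-Schmidt across eigenvectors of a non-symmetric matrix trades the eigenvector property for orthonormality.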

Answer 2:

Let me first introduce some terminology and notation. Let $A\in \mathbb{C}^{n\times n}$ be a square matrix. Consider for each $\lambda\in \mathbb{C}$ the space $E_{\lambda}=\{v\mid Av=\lambda v\}=\{v\mid (A-\lambda I)v=0\}$. If $\lambda$ is an eigenvalue, then $E_{\lambda}$ is called the corresponding eigenspace. The set of eigenvalues is denoted by $\sigma(A)$.

Assume now that $A$ is diagonalizable (there is a basis of eigenvectors). Equivalently, assume that $\mathbb{C}^n=\bigoplus_{\lambda\in \sigma(A)} E_{\lambda}$.

For each $\lambda\in \sigma(A)$, let $\{v_{\lambda,1}, \dots , v_{\lambda, d_{\lambda}}\}$ be a basis of $E_{\lambda}$ (so $d_\lambda=\dim(E_{\lambda})$). Applying Gram-Schmidt to each of these bases yields new orthonormal bases $\{v_{\lambda, 1}', \dots , v_{\lambda, d_{\lambda}}'\}$ for each $E_{\lambda}$. However, there is no reason why $E_{\lambda}$ should be orthogonal to $E_{\mu}$ for $\lambda\neq \mu$ with $\lambda,\mu\in \sigma(A)$.

If, for example, $A$ is a real symmetric matrix, then different eigenspaces are automatically orthogonal to each other. In that case, all you have to do to find an orthonormal basis of eigenvectors is apply the Gram-Schmidt procedure to any basis of eigenvectors.
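A sketch of that recipe in NumPy, on a symmetric matrix I chose for illustration (it has a degenerate eigenvalue, so there is a genuine 2-dimensional eigenspace to orthonormalize within):

```python
import numpy as np

# Symmetric matrix with eigenvalues 1 (twice) and 4 (illustrative choice).
A = np.array([[2.0, 1.0, 1.0],
              [1.0, 2.0, 1.0],
              [1.0, 1.0, 2.0]])

vals, vecs = np.linalg.eig(A)
vals = vals.real                      # eigenvalues of a symmetric matrix are real

# Group the eigenvectors by (rounded) eigenvalue and run Gram-Schmidt
# (via QR) inside each eigenspace separately.
blocks = []
for lam in np.unique(np.round(vals, 8)):
    block = vecs[:, np.isclose(vals, lam)]
    Q, _ = np.linalg.qr(block)        # orthonormalize within E_lambda
    blocks.append(Q)
P = np.hstack(blocks)

# Because distinct eigenspaces of a symmetric matrix are already orthogonal,
# P is orthogonal and still diagonalizes A.
assert np.allclose(P.T @ P, np.eye(3))
assert np.allclose(P.T @ A @ P, np.diag([1.0, 1.0, 4.0]))
```

The key point the code leans on is exactly the one in the text: orthogonality *between* eigenspaces comes for free from symmetry, so Gram-Schmidt only has to fix things *within* each eigenspace.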

On the other hand, there are definitely matrices such that different eigenspaces are simply not orthogonal to each other. For instance, $A=\begin{pmatrix}1&1\\0&2\end{pmatrix}$ has one-dimensional eigenspaces spanned by $(1,0)^T$ and $(1,1)^T$, which are not orthogonal. In that case, no orthonormal basis of eigenvectors can exist.

EDIT: Assume that $A\in \mathbb{R}^{n\times n}$ is diagonalizable. Let $\{v_1, \dots, v_n\}$ be a basis of eigenvectors such that $Av_i=\lambda_iv_i$ (I do not assume the $\lambda_i$'s to be distinct).

If your reasoning is correct, then applying the Gram-Schmidt procedure would yield an orthonormal basis of eigenvectors $\beta=\{w_1, \dots ,w_n\}$ of $A$. Let $P$ be the matrix whose $i$-th column is precisely $w_i$. Then $PP^T=Id$, as $P$ is an orthogonal matrix. Furthermore, $P$ acts as the change-of-basis matrix $[Id]_{\beta}^{\text{st}}$, where $\text{st}$ is the standard basis. Hence $$A=PDP^{-1}=PDP^{T}$$ where $D$ is the diagonal matrix having the corresponding eigenvalues on its diagonal. Now, note that \begin{eqnarray} A^T &=& (PDP^T)^T\\ &=& PD^TP^T\\ &=& PDP^T\\ &=& A. \end{eqnarray} Thus, $A=A^T$. This shows that if your reasoning is correct, then $A$ must be symmetric!
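The $A=PDP^T\Rightarrow A=A^T$ argument can be checked numerically. In this sketch $P$ and $D$ are arbitrary choices of mine (a random orthogonal matrix and a diagonal with a deliberately repeated eigenvalue), not anything from the question:

```python
import numpy as np

# Pick an arbitrary orthonormal basis (columns of P) and real eigenvalues.
rng = np.random.default_rng(1)
P, _ = np.linalg.qr(rng.standard_normal((4, 4)))
D = np.diag([3.0, -1.0, -1.0, 2.0])   # repeated eigenvalue on purpose

A = P @ D @ P.T                        # A = P D P^{-1}, since P^{-1} = P^T

# As the argument predicts, A is symmetric...
assert np.allclose(A, A.T)
# ...and each column of P is an eigenvector with the matching eigenvalue.
for i in range(4):
    assert np.allclose(A @ P[:, i], D[i, i] * P[:, i])
```

This is the converse direction made concrete: whenever an orthonormal eigenbasis exists, the matrix it diagonalizes is forced to be symmetric.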