Orthonormalizing a basis of eigenvectors


I'm trying to solve a problem involving orthogonal operators (the details are irrelevant). The finite-dimensional vector space in question has a basis $v_1,\dots,v_n$ of eigenvectors. Is it safe to say one can order the basis vectors according to their respective eigenspaces and then apply the Gram-Schmidt process to obtain an orthonormal basis that still consists of eigenvectors?
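A quick numerical sanity check of the idea (a minimal sketch with made-up data: the operator is $\operatorname{diag}(2,2,5)$, so the eigenspace for $2$ is two-dimensional, and the starting eigenbasis is deliberately non-orthogonal):

```python
import math

def apply_A(v):
    # Hypothetical operator A = diag(2, 2, 5): eigenvalue 2 twice, 5 once.
    return [2 * v[0], 2 * v[1], 5 * v[2]]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def gram_schmidt(vectors):
    """Classical Gram-Schmidt w.r.t. the standard inner product."""
    basis = []
    for v in vectors:
        w = list(v)
        for q in basis:
            c = dot(w, q)
            w = [wi - c * qi for wi, qi in zip(w, q)]
        n = math.sqrt(dot(w, w))
        basis.append([wi / n for wi in w])
    return basis

# A non-orthogonal eigenbasis, ordered so that vectors from the same
# eigenspace are adjacent: the first two have eigenvalue 2, the last 5.
eigenbasis = [[1.0, 0.0, 0.0], [1.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
U = gram_schmidt(eigenbasis)

# Orthonormal, and each vector is still an eigenvector: within a block
# of equal eigenvalue, Gram-Schmidt only mixes vectors from one eigenspace.
for i, u in enumerate(U):
    for j, w in enumerate(U):
        assert abs(dot(u, w) - (1.0 if i == j else 0.0)) < 1e-12
for u, lam in zip(U, [2.0, 2.0, 5.0]):
    assert all(abs(a - lam * b) < 1e-12 for a, b in zip(apply_A(u), u))
```

The key point the check illustrates: as long as vectors from the same eigenspace are consecutive, each Gram-Schmidt step subtracts only vectors with the same or an earlier eigenvalue, and (for a self-adjoint operator) distinct eigenspaces are already orthogonal, so the output stays inside the eigenspaces.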


To use the Gram-Schmidt process you need the scalar product to be positive-definite. As an alternative, you can always use Lagrange's algorithm.
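To see why positive-definiteness matters: Gram-Schmidt's projection coefficients divide by $\phi(v,v)$, which can vanish for an indefinite form even when $v \neq 0$. A two-line illustration with a Minkowski-type form on $\mathbb{R}^2$ (my own example, not from the answer):

```python
def phi(u, v):
    # Indefinite form x1*y1 - x2*y2 on R^2.
    return u[0] * v[0] - u[1] * v[1]

v = (1, 1)             # nonzero but isotropic
assert phi(v, v) == 0  # any projection onto v would divide by zero
```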

Lagrange's Algorithm

Let $(V,\phi)$ be a vector space over $\mathbb{K}$ with a scalar product on it. Then, for $i=1,\dots,n$, repeat these steps:

If $v_i$ is not an isotropic vector, then $\phi (v_i,v_i) \neq 0$, and for each $j>i$ you define: $$v_j' = v_j - \frac{\phi(v_j,v_i)}{\phi(v_i,v_i)}\,v_i$$

The resulting vectors still form a basis together with $v_1,\dots,v_i$, and $v_i$ is orthogonal to all the vectors after it: indeed, $\phi(v_i,v_j') = 0$ for every $j>i$. Then you put each $v_j'$ in place of $v_j$.

If $v_i$ is an isotropic vector, exchange it with a non-isotropic $v_j$, $j>i$. If all the vectors are isotropic, look for a non-isotropic vector among the sums $v_j+v_k$ with $j,k \geq i$ (replacing $v_j$ by $v_j+v_k$ still gives a basis). If even these are all isotropic, then $\phi(v_j,v_k)=0$ for all $j,k \geq i$, so the remaining vectors are already pairwise orthogonal and the algorithm stops.
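The steps above can be sketched in code. This is my own illustrative implementation (the function name and the use of exact rational arithmetic are my choices, not part of the answer), for a symmetric bilinear form given by its Gram matrix `B` in the standard basis:

```python
from fractions import Fraction

def lagrange_orthogonalize(B, basis):
    """Sketch of Lagrange's algorithm for the symmetric bilinear form
    with Gram matrix B. Uses Fractions so isotropy tests are exact."""
    n = len(basis)
    vs = [[Fraction(x) for x in v] for v in basis]

    def phi(u, w):
        return sum(u[a] * B[a][b] * w[b] for a in range(n) for b in range(n))

    for i in range(n):
        if phi(vs[i], vs[i]) == 0:
            # v_i is isotropic: look for a non-isotropic v_j, j > i.
            for j in range(i + 1, n):
                if phi(vs[j], vs[j]) != 0:
                    vs[i], vs[j] = vs[j], vs[i]
                    break
            else:
                # All remaining vectors isotropic: try the sums v_j + v_k.
                done = True
                for j in range(i, n):
                    for k in range(j + 1, n):
                        s = [a + b for a, b in zip(vs[j], vs[k])]
                        if phi(s, s) != 0:
                            vs[j] = s                    # still a basis
                            vs[i], vs[j] = vs[j], vs[i]  # move into slot i
                            done = False
                            break
                    if not done:
                        break
                if done:
                    return vs  # remaining vectors already pairwise orthogonal
        # Clear phi(v_i, v_j) for every j > i.
        for j in range(i + 1, n):
            c = phi(vs[j], vs[i]) / phi(vs[i], vs[i])
            vs[j] = [a - c * b for a, b in zip(vs[j], vs[i])]
    return vs

# Example: the hyperbolic form B = [[0,1],[1,0]], where both standard
# basis vectors are isotropic but their sum e1 + e2 is not.
B = [[Fraction(0), Fraction(1)], [Fraction(1), Fraction(0)]]
vs = lagrange_orthogonalize(B, [[1, 0], [0, 1]])
```

In the example the algorithm hits the "all isotropic" branch immediately, picks $e_1+e_2$, and then orthogonalizes $e_2$ against it, producing an orthogonal basis for a form on which Gram-Schmidt would fail at the first step.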