Why does Gram-Schmidt preserve eigenvectorness?


Say you have two vectors in the same eigenspace and apply Gram-Schmidt to get two orthogonal vectors. Why are they still eigenvectors? (Assume the matrix in question is normal.)

There are 3 best solutions below


The Gram-Schmidt procedure uses only two operations: multiplication of a vector by a scalar and addition of vectors. So, starting from vectors in a vector space, we obtain linear combinations of those vectors, and a linear combination of vectors in a subspace is again a vector in that subspace.
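As a minimal sketch of this point (the function name and test vectors are my own, assuming NumPy), classical Gram-Schmidt really is built from just those two operations, so every output is a linear combination of the inputs:

```python
import numpy as np

def gram_schmidt(vectors):
    """Orthonormalize `vectors` using only scalar multiples and sums,
    so every output vector lies in the span of the inputs."""
    basis = []
    for v in vectors:
        # Subtract the projections onto the vectors found so far.
        w = v - sum(np.dot(v, e) * e for e in basis)
        basis.append(w / np.linalg.norm(w))
    return basis

v1 = np.array([1.0, 1.0, 0.0])
v2 = np.array([1.0, 0.0, 0.0])
e1, e2 = gram_schmidt([v1, v2])
print(np.dot(e1, e2))  # essentially 0: the outputs are orthogonal
```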


The eigenspace is invariant under linear combinations of its vectors, and that is all Gram-Schmidt does to the eigenvectors: it replaces the given basis with an orthogonal basis built from linear combinations of the original vectors.


The Gram-Schmidt theorem tells us that for a linearly independent set $\{v_1,\ldots, v_n\}$ in an inner product space $V$ there exists an orthonormal set $\{e_1,\ldots,e_n\}$ such that

$$\operatorname{span}\{v_1,\ldots, v_i\} = \operatorname{span}\{e_1,\ldots, e_i\},\ i = 1,\ldots, n.$$

Let $A$ be a linear operator on $V$ with eigenvalue $\lambda$. Let $V_\lambda$ denote the set $V_\lambda = \{ v\in V\,\mid\, Av = \lambda v\}$, i.e. the eigenvectors for $\lambda$ together with the zero vector. I claim that $V_\lambda$ is a subspace of $V$. To prove this, it is enough to check that $v,w\in V_\lambda \implies \alpha v + \beta w \in V_\lambda$ for all scalars $\alpha, \beta$:

$$A(\alpha v + \beta w) = \alpha Av + \beta Aw = \alpha \lambda v + \beta \lambda w = \lambda (\alpha v + \beta w),$$

so a linear combination of eigenvectors for $\lambda$ is again an eigenvector for $\lambda$ (or zero), i.e. $V_\lambda$ is a subspace of $V$.

Now, if you choose $\{v,w\}$ linearly independent in some eigenspace $V_\lambda$ and apply Gram-Schmidt, you get orthonormal vectors $\{e,f\}$ and we have $$e,f\in\operatorname{span}\{e,f\} = \operatorname{span}\{v,w\}\subseteq V_\lambda$$ and thus, $e,f$ are again eigenvectors.
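This can be checked numerically. In the sketch below (a hypothetical example of my own, assuming NumPy), $A$ is a symmetric, hence normal, matrix with a two-dimensional eigenspace for $\lambda = 2$; after Gram-Schmidt, the resulting orthonormal vectors are still eigenvectors for $\lambda = 2$:

```python
import numpy as np

# Symmetric (hence normal) matrix with eigenvalue 2 of multiplicity 2.
A = np.diag([2.0, 2.0, 5.0])
v = np.array([1.0, 0.0, 0.0])   # eigenvector for lambda = 2
w = np.array([1.0, 1.0, 0.0])   # another one, not orthogonal to v

# Gram-Schmidt on {v, w}: only scalar multiples and sums are used.
e = v / np.linalg.norm(v)
f = w - np.dot(w, e) * e
f = f / np.linalg.norm(f)

print(np.dot(e, f))                                   # orthogonal
print(np.allclose(A @ e, 2 * e), np.allclose(A @ f, 2 * f))
```

Both printed checks confirm the claim: $e$ and $f$ are orthonormal and remain in $V_2$.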