Gram-Schmidt Theorem for linearly dependent basis


From Shankar's QM book, pg. 15, on the Gram-Schmidt theorem:

Let $|I\rangle$, $|II\rangle$, ... be a linearly independent basis. The first vector of the orthonormal basis will be $$|1\rangle=\frac{|I\rangle}{\sqrt{\langle I|I\rangle}}.$$ For the second vector in the orthonormal basis, consider $$|2'\rangle=|II\rangle-|1\rangle\langle1 | II\rangle.$$ Divide $|2'\rangle$ by its norm to get $|2\rangle.$

For the third vector in the orthonormal basis, consider $$|3'\rangle = |III\rangle-|1\rangle\langle1|III\rangle-|2\rangle\langle2|III\rangle.$$ Divide $|3'\rangle$ by its norm to get $|3\rangle.$ Repeat the procedure to generate all the orthonormal basis vectors $|1\rangle$, $|2\rangle$, $|3\rangle$, ...
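The procedure quoted above translates directly into code. Here is a minimal NumPy sketch (the function name and the zero-norm tolerance are my own choices, not Shankar's); it halts with an error exactly when a primed vector comes out as the zero vector:

```python
import numpy as np

def gram_schmidt(vectors, tol=1e-12):
    """Orthonormalize a list of real vectors by the procedure above.

    Each input vector has its projections onto the previously found
    orthonormal vectors subtracted (|k'> = |K> - sum_j |j><j|K>), then
    is normalized.  A primed vector of (numerically) zero norm signals
    a linearly dependent input, and the procedure stops.
    """
    basis = []
    for v in vectors:
        w = np.array(v, dtype=float)
        for b in basis:
            w = w - b * np.dot(b, w)   # subtract projection onto |b>
        norm = np.linalg.norm(w)
        if norm < tol:
            raise ValueError("zero vector encountered: input is linearly dependent")
        basis.append(w / norm)
    return basis
```

Running it on a linearly independent triple returns three orthonormal vectors; running it on a dependent triple raises the error at the offending step.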

It was then said that if we use a linearly dependent set $|I\rangle$, $|II\rangle$, ..., at some point a vector like $|2'\rangle$ or $|3'\rangle$ will become the zero vector $|0\rangle$, putting a stop to the whole procedure, i.e. Gram-Schmidt won't work for a linearly dependent set of vectors.

While I verified that the above statement is true for some examples of linearly dependent vectors, e.g. $(1,1,0)$, $(1,0,1)$ and $(3,2,1) = 2\,(1,1,0) + (1,0,1)$, how can it be shown to be true for any set of linearly dependent vectors?
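For the record, the example above can be checked numerically with NumPy. Following the quoted procedure step by step, $|3'\rangle$ indeed comes out as the zero vector (up to floating-point rounding):

```python
import numpy as np

# The three vectors from the question; note (3, 2, 1) = 2*(1, 1, 0) + (1, 0, 1).
I, II, III = np.array([1., 1., 0.]), np.array([1., 0., 1.]), np.array([3., 2., 1.])

one = I / np.linalg.norm(I)                                      # |1>
two_p = II - one * np.dot(one, II)                               # |2'> = |II> - |1><1|II>
two = two_p / np.linalg.norm(two_p)                              # |2>
three_p = III - one * np.dot(one, III) - two * np.dot(two, III)  # |3'>

print(np.linalg.norm(three_p))  # essentially zero
```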

Accepted answer:

Minor nitpick: A linearly dependent basis is not a basis.

A hint to solve the problem: In your linearly dependent list of vectors $\left\{\vec e_i\right\}$, with $i$ running from $1$ to some $M$, there is a first vector, call it $\vec e_d$, that can be expressed as a linear combination of the previous ones, $$\vec e_d=\sum_{i=1}^{d-1} a_i \vec e_i\,.$$ (In other words, the set $\left\{\vec e_i\right\}_{1\leq i<d}$ is linearly independent, but $\left\{\vec e_i\right\}_{1\leq i\leq d}$ is not.) Can you work out what the Gram-Schmidt procedure makes of $\vec e_d$?
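In case it helps to see the hint carried through, here is one way to complete the argument (writing $\hat q_1,\dots,\hat q_{d-1}$ for the orthonormal vectors the procedure has built from $\vec e_1,\dots,\vec e_{d-1}$):

```latex
% By construction, span(q_1, ..., q_{d-1}) = span(e_1, ..., e_{d-1}),
% so the sum below is the orthogonal projection P of e_d onto that span:
\vec e_d{}' \;=\; \vec e_d \;-\; \sum_{j=1}^{d-1} \hat q_j\,
                 \bigl(\hat q_j \cdot \vec e_d\bigr)
            \;=\; \vec e_d \;-\; P\,\vec e_d .
% Since e_d = a_1 e_1 + ... + a_{d-1} e_{d-1} already lies in the span,
% P acts on it as the identity:
P\,\vec e_d \;=\; \vec e_d
\quad\Longrightarrow\quad
\vec e_d{}' \;=\; \vec 0 ,
% so the primed vector at step d is the zero vector, it cannot be
% normalized, and the procedure halts -- for any dependent input.
```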