Suppose $a$, $b$ and $c$ are three linearly independent vectors. Using the Gram-Schmidt method, we can generate orthogonal basis vectors from them; let's say the three orthogonal basis vectors generated from $a$, $b$ and $c$ are $A$, $B$ and $C$. Dividing each of these by its length then gives orthonormal basis vectors.
I am trying to figure out whether $C$ will be orthogonal to $b$ (one of the original vectors). It seems like it should be, because $C$ is orthogonal to the plane containing $a$ and $b$. Is there any way to prove this? I am trying the approach below, but the calculations don't look pretty. $$C = c - \frac{(A^Tc)A}{A^TA} - \frac{(B^Tc)B}{B^TB}$$ Multiplying both sides by $b^T$; the idea is to show $b^TC = 0$ to demonstrate orthogonality:
$$b^TC = b^Tc - \frac{(A^Tc)b^TA}{A^TA} - \frac{(B^Tc)b^TB}{B^TB}$$
Using $$B = b - \frac{(A^Tb)A}{A^TA}$$
$$b^TC = b^Tc - \frac{(A^Tc)\,b^TA}{A^TA} - \frac{\left(b - \frac{(A^Tb)A}{A^TA}\right)^{\!T} c \;\, b^T\!\left(b - \frac{(A^Tb)A}{A^TA}\right)}{B^TB}$$
I am not sure whether I need to follow through on this, or whether I am headed in the wrong direction. Thank you.
Yes, this is indeed the case. If we have a sequence of vectors $v_1, \ldots, v_n$, and obtain $e_1, \ldots, e_n$ from the Gram-Schmidt procedure, then the procedure has the additional (sometimes overlooked) property that $$\operatorname{span}\{e_1, \ldots, e_i\} = \operatorname{span}\{v_1, \ldots, v_i\}. \tag{$\star$}$$ In particular, since $e_{i+1}$ is perpendicular to each vector in $\{e_1, \ldots, e_i\}$, and hence to each vector in $\operatorname{span}\{e_1, \ldots, e_i\}$, it will also be perpendicular to each vector in $\operatorname{span}\{v_1, \ldots, v_i\}$, including $v_1, \ldots, v_i$. That is, each vector we obtain from Gram-Schmidt is orthogonal to every previous vector in the list.
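Before the proof, here is a quick numerical sanity check of this claim, as a pure-Python sketch: run (unnormalized) Gram-Schmidt on three hypothetical independent vectors $a$, $b$, $c$ (not the ones from the question, which were never specified) and confirm that the third output $C$ is orthogonal to both of the earlier original vectors.

```python
def dot(u, v):
    return sum(x * y for x, y in zip(u, v))

def gram_schmidt(vectors):
    """Return orthogonal (not yet normalized) vectors via Gram-Schmidt."""
    basis = []
    for v in vectors:
        w = list(v)
        for e in basis:
            # subtract the projection of v onto each earlier basis vector
            coef = dot(v, e) / dot(e, e)
            w = [wi - coef * ei for wi, ei in zip(w, e)]
        basis.append(w)
    return basis

# hypothetical independent vectors for illustration
a = [1.0, 1.0, 0.0]
b = [1.0, 0.0, 1.0]
c = [0.0, 1.0, 2.0]

A, B, C = gram_schmidt([a, b, c])
print(abs(dot(C, a)) < 1e-12, abs(dot(C, b)) < 1e-12)  # prints: True True
```

Of course, a check on one example is not a proof; for that we turn to $(\star)$.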
To prove $(\star)$, it is best to use induction. Note that $e_1 = v_1$, so $\operatorname{span}\{v_1\} = \operatorname{span}\{e_1\}$. That is, the base case holds.
Suppose $(\star)$ holds when $i = k$. We have $$e_{k+1} = v_{k+1} - \sum_{i=1}^k \frac{v_{k+1}^\top e_i}{e_i^\top e_i} e_i.$$ Note that the subtracted sum lies in $\operatorname{span}\{e_1, \ldots, e_k\}$, which is equal to $\operatorname{span}\{v_1, \ldots, v_k\}$, by assumption. Thus the sum is a linear combination of $v_1, \ldots, v_k$, which makes $e_{k+1}$ a linear combination of $v_1, \ldots, v_{k+1}$. That is, $$e_{k+1} \in \operatorname{span}\{v_1, \ldots, v_{k+1}\}.$$ By assumption, $e_1, \ldots, e_k$ all lie in the smaller space $\operatorname{span}\{v_1, \ldots, v_k\}$, and so they also lie in the larger space. Hence $$\operatorname{span}\{e_1, \ldots, e_{k+1}\} \subseteq \operatorname{span}\{v_1, \ldots, v_{k+1}\}.$$ On the other hand, $$v_{k+1} = e_{k+1} + \sum_{i=1}^k \frac{v_{k+1}^\top e_i}{e_i^\top e_i} e_i \in \operatorname{span}\{e_1, \ldots, e_{k+1}\},$$ and $v_1, \ldots, v_k \in \operatorname{span}\{e_1, \ldots, e_k\}$, by assumption, thus $$\operatorname{span}\{e_1, \ldots, e_{k+1}\} = \operatorname{span}\{v_1, \ldots, v_{k+1}\},$$ completing the induction proof.