Gram-Schmidt Procedure for Orthogonal Vectors?


Let $Q$ be a positive definite matrix and $\xi_1, \ldots, \xi_n \in \mathbb{R}^n$ be a set of linearly independent vectors. Using the Gram-Schmidt procedure, define $d_1 = \xi_1$ and assume $d_2 = \xi_2 + c_{21}d_1$ where $c_{21}$ is a scalar.

Question: Compute the value $c_{21}$ such that $d_1^TQd_2 = 0$.

Now apparently my linear algebra is rusty because if I simply plug in the given definitions:

$d_1^TQd_2 = \xi_1^TQ[\xi_2 + c_{21}\xi_1] = 0$

then $c_{21}$ should be a scalar. Vectors are not invertible in general, so I don't see how to solve for $c_{21}$. Does the linear independence of $\xi_1$ and $\xi_2$ somehow guarantee that $c_{21}$ comes out as a scalar?

Thanks!

Accepted answer:

Linear independence isn't needed for this step. Use linearity to distribute your expression via $$0 = \xi_1^\top Q (\xi_2 + c_{21} \xi_1) = \xi_1^\top Q \xi_2 + c_{21} (\xi_1^\top Q \xi_1),$$ and note that $u^\top Q v$ is a scalar for any vectors $u, v$, so this is a linear equation in the scalar unknown $c_{21}$. Solving it gives $$c_{21} = -\frac{\xi_1^\top Q \xi_2}{\xi_1^\top Q \xi_1}.$$ (Positive-definiteness of $Q$, together with $\xi_1 \neq 0$, guarantees $\xi_1^\top Q \xi_1 > 0$, so there is no division by zero.)
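As a quick numerical sanity check, here is a NumPy sketch with made-up data (a small positive definite $Q$ and the standard basis vectors as $\xi_1, \xi_2$ are assumptions, not from the problem) confirming that this choice of $c_{21}$ makes $d_1$ and $d_2$ $Q$-orthogonal:

```python
import numpy as np

# Assumed example data: a 2x2 symmetric positive definite Q
# and two linearly independent vectors xi1, xi2.
Q = np.array([[2.0, 0.5],
              [0.5, 1.0]])
xi1 = np.array([1.0, 0.0])
xi2 = np.array([0.0, 1.0])

d1 = xi1
# c21 = -(xi1^T Q xi2) / (xi1^T Q xi1); the denominator is a
# positive scalar because Q is positive definite and xi1 != 0.
c21 = -(xi1 @ Q @ xi2) / (xi1 @ Q @ xi1)
d2 = xi2 + c21 * d1

print(c21)          # a plain scalar
print(d1 @ Q @ d2)  # zero up to floating-point error
```

Each of `xi1 @ Q @ xi2` and `xi1 @ Q @ xi1` evaluates to a scalar, which is exactly why $c_{21}$ comes out as a number rather than a vector.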