How to find the orthogonal of a vector


Suppose that we have a vector $w = \begin{bmatrix} 3\\ 4.654\\ 3.34\\ 4.234\\ -1.23\\ \end{bmatrix}$

and we want to find a vector orthogonal to $w$. I have read about the Gram-Schmidt process, but to me the simplest approach is the following. Since the dot product should be $0$, I fix 4 random elements and compute their inner product $p$ with the first 4 entries of $w$. Then, to find the last coefficient of the orthogonal vector, I divide $-p$ by the last coefficient $-1.23$. Is this method wrong? And why is Gram-Schmidt preferred?
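The method described in the question can be sketched as follows (a minimal illustration with arbitrary random values for the fixed entries, not any particular choice):

```python
import numpy as np

# The vector from the question.
w = np.array([3, 4.654, 3.34, 4.234, -1.23])

# Fix the first four coordinates at random values.
rng = np.random.default_rng(0)
u = np.empty(5)
u[:4] = rng.standard_normal(4)

# Inner product of the fixed part with the first four entries of w.
p = u[:4] @ w[:4]

# Choose the last coordinate so the full dot product vanishes:
#   p + u[4] * w[4] = 0   =>   u[4] = -p / w[4]
u[4] = -p / w[4]

print(u @ w)  # zero up to floating-point rounding
```

Note this only works when the last coordinate of $w$ is nonzero, since it appears in the denominator.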


There are 3 answers below.

BEST ANSWER

The Gram-Schmidt process is a systematic way of finding a whole set of orthogonal vectors that form a basis for the space spanned by given vectors. In your case you are given only one vector and asked to find another, and the procedure you describe would find two orthogonal vectors in a 5-dimensional space. How would you find a third orthogonal vector? Extending your logic, you would take a new vector with three fixed coordinates and two unknowns, then require it to be orthogonal to the first two vectors, giving two equations in two unknowns. You would solve that 2x2 system to find the third orthogonal vector. For a fourth, you would fix two coordinates, leaving three unknowns, and solve a 3x3 system; similar logic shows that a fifth orthogonal vector would require solving a 4x4 system of equations.

What's worse, it is not trivial to pick which coordinates to fix and what values to give them. If the choice is random, you might accidentally pick something already spanned by the basis you have built so far, in which case the procedure breaks down.

The Gram-Schmidt process is simply a consistent way of doing the same thing that is guaranteed to produce an orthonormal basis. It is simpler to think through and apply consistently when solving for an orthogonal basis: the only operations involved are dot products, vector sums, and scalar multiplications, and you never have to solve a system of equations. These are the reasons Gram-Schmidt is preferred.
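A sketch of the classical Gram-Schmidt procedure described above, using only dot products, sums, and scalar multiples (seeding it with $w$ followed by the standard basis, so one dependent vector gets dropped):

```python
import numpy as np

def gram_schmidt(vectors):
    """Orthonormalize a sequence of vectors, skipping (near-)dependent ones."""
    basis = []
    for v in vectors:
        # Subtract the projections of v onto the basis built so far.
        for b in basis:
            v = v - (v @ b) * b
        norm = np.linalg.norm(v)
        if norm > 1e-10:          # keep only independent directions
            basis.append(v / norm)
    return basis

w = np.array([3, 4.654, 3.34, 4.234, -1.23])

# Start from w, then the standard basis vectors of R^5; exactly one of
# them is dependent on the rest, so we end up with an orthonormal basis
# of R^5 whose first element is w / ||w||. The other four are all
# orthogonal to w.
basis = gram_schmidt([w] + list(np.eye(5)))
```

Any of `basis[1]` through `basis[4]` answers the original question, and together they span the whole orthogonal complement of $w$.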


Here is how you can proceed. Assume the wanted vector is

$$ u = \begin{bmatrix} a\\ b\\ c\\ d\\ e\\ \end{bmatrix}, $$

then impose $u \cdot w = 0$ and solve the resulting equation for $a, b, c, d, e$.


Your method should work as long as the last coordinate of $w$ is nonzero; otherwise the division is undefined and the method breaks down.

If you want more control over the solution, then you can find a parametrization of the solutions of the homogeneous system

$$ x\cdot w = 0$$

through row-reduction. This gives you generators of the space of solutions, and you can pick parameters to your liking.
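Numerically, the same parametrization can be obtained with an SVD instead of hand row-reduction (a sketch, using the vector from the question):

```python
import numpy as np

w = np.array([3, 4.654, 3.34, 4.234, -1.23])

# The solutions of x . w = 0 form the null space of the 1x5 matrix [w].
# The SVD of that matrix gives an orthonormal basis of R^5 whose rows
# beyond the rank (here, rank 1) span the null space.
_, _, vt = np.linalg.svd(w.reshape(1, -1))
null_basis = vt[1:]   # 4 generators of the orthogonal complement of w

# Any linear combination of the generators is orthogonal to w;
# the coefficients are the free parameters you can pick to your liking.
x = null_basis.T @ np.array([1.0, 2.0, -1.0, 0.5])
print(abs(x @ w) < 1e-9)  # True
```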