Solving vectors such that the dot product = 0


I'm doing some machine learning problems (namely logistic regression), and something I'm trying to do is calculate the decision boundary given a weight vector $\mathbf{w}$.

The decision boundary lies on $\mathbf{x}$ such that $\mathbf{w}^T\mathbf{x} = 0$, and $\mathbf{w}^T$ is a $1 \times d$ vector, $\mathbf{x}$ is a $d \times 1$ vector.

What would be the simplest way to find an $\mathbf{x}$, given $\mathbf{w}$, such that the dot product is equal to $0$?

An example is $\mathbf{w} = [-1, 1, 0]$; by brute force, I can find the vector $[1, 1, 0]$, which satisfies $\mathbf{w}^T\mathbf{x} = 0$. (Note that in the $\mathbf{x}$ vector, the first term is ALWAYS $1$ — it is the bias term — so it can be omitted from the plot.) The graph is then essentially 2D, and the decision boundary is the line $x_1 = 1$.

Any guidance on the best way to solve for large $d$ would be appreciated.
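The worked example above can be checked numerically. A minimal NumPy sketch (assuming the bias convention $\mathbf{x} = [1, x_1, x_2]$ from the question):

```python
import numpy as np

# Weight vector from the question's example
w = np.array([-1.0, 1.0, 0.0])

# Candidate boundary point: first entry fixed at 1 (bias),
# x1 = 1, and x2 = 0 (its weight is 0, so any value works)
x = np.array([1.0, 1.0, 0.0])

print(w @ x)  # 0.0 — x lies on the decision boundary
```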

3 Answers

BEST ANSWER

Because you have an underdetermined system (fewer equations than unknowns), you need to introduce free variables.

Let's say our vector is $w=(w_1,w_2,w_3,w_4)^T$, and $x=(1,x_2,x_3,x_4)^T$ with the first entry fixed at $1$. Then $w^Tx=w_1+w_2x_2+w_3x_3+w_4x_4=0$.

We have one equation and three unknowns, so we can choose $3-1=2$ free variables. Call $x_2=a$ and $x_3=b$.

Now solve the equation for $x_4$ (this requires $w_4 \neq 0$):

$$x_4=\frac{-w_1-w_2a-w_3b}{w_4}.$$

We can conclude from this that $x=(1,a,b,-\frac{w_1}{w_4}-\frac{w_2}{w_4}a-\frac{w_3}{w_4}b)^T$. This is a 2D plane in 4D space.

EDIT: In order to see the plane structure better, we can split this expression:

$x=(1,0,0,-\frac{w_1}{w_4})^T+a(0,1,0,-\frac{w_2}{w_4})^T+b(0,0,1,-\frac{w_3}{w_4})^T$. Now it is obvious that the reference point to the plane is $(1,0,0,-\frac{w_1}{w_4})^T$ and the directional vectors are $(0,1,0,-\frac{w_2}{w_4})^T$ and $(0,0,1,-\frac{w_3}{w_4})^T$. And $a$ and $b$ are free parameters.
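The parametrization above can be sketched in code. A minimal NumPy version for $d=4$ (the weight values and the choices of $a$, $b$ below are arbitrary examples, not from the original answer):

```python
import numpy as np

def boundary_point(w, a, b):
    """Point on {x : w^T x = 0} with x[0] fixed at 1 (bias term),
    x[1] = a and x[2] = b free, and x[3] solved for.
    Requires w[3] != 0."""
    w1, w2, w3, w4 = w
    x4 = (-w1 - w2 * a - w3 * b) / w4
    return np.array([1.0, a, b, x4])

w = np.array([2.0, -1.0, 0.5, 4.0])  # example weights (assumed)
x = boundary_point(w, a=3.0, b=-2.0)
print(w @ x)  # ~0 up to floating-point error
```

Sweeping $a$ and $b$ over a grid traces out the 2D plane described in the answer.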

ANSWER

Assuming $w_1$ is non-zero:

For any other nonzero $w_j$ you have a solution with $x_1=1$, $x_j=-\dfrac{w_1}{w_j}$, and all other $x_k=0$.

You can then take affine combinations of such solutions (coefficients summing to $1$, so that $x_1$ stays $1$) to produce even more solutions.

If any of the $w_j$ are zero, you can also add to these solutions vectors where the corresponding $x_j$s take any values.

So

  • in your example with $w=[-1,1,0]$ you will have solutions of the form $x=[1,1,d]$
  • in your example with $w=[3,-1,2]$ you will have solutions of the form $x=[1,3c,-\frac32(1-c)]$

for all real numbers $c$ and $d$; in each case $w \cdot x = 0$.
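The construction above can be written as a short routine. A NumPy sketch (the function name `basis_solutions` is mine, not from the answer), assuming $w_1 \neq 0$:

```python
import numpy as np

def basis_solutions(w):
    """For w with w[0] != 0, build one solution per remaining index:
    if w[j] != 0, the vector with x[0] = 1, x[j] = -w[0]/w[j], rest 0;
    if w[j] == 0, the unit vector e_j (x[j] is unconstrained)."""
    sols = []
    for j in range(1, len(w)):
        x = np.zeros(len(w))
        if w[j] != 0:
            x[0] = 1.0
            x[j] = -w[0] / w[j]
        else:
            x[j] = 1.0  # weight is zero, so this coordinate is free
        sols.append(x)
    return sols

w = np.array([3.0, -1.0, 2.0])  # second example from the answer
for x in basis_solutions(w):
    print(x, w @ x)  # each dot product is 0
```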

ANSWER

You can also use one step of the Gram-Schmidt process. Knowing $w$ and starting with a random vector $u$, you can find $x$ by writing

$$x = u - \frac{w^Tu}{w^Tw}\,w.$$

Matlab example:

>> w = rand(4,1)
w =
    0.9575
    0.9649
    0.1576
    0.9706
>> u = rand(4,1)
u =
    0.4218
    0.9157
    0.7922
    0.9595
>> x = u - (w'*u)/(w'*w)*w
x =
   -0.3755
    0.1124
    0.6610
    0.1514
>> w'*x
ans =
     0
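For those working in Python, the same projection in NumPy (a sketch mirroring the Matlab session above; the random seed is an arbitrary choice):

```python
import numpy as np

rng = np.random.default_rng(0)
w = rng.random(4)
u = rng.random(4)

# Subtract the component of u along w (one Gram-Schmidt step),
# leaving x orthogonal to w
x = u - (w @ u) / (w @ w) * w

print(w @ x)  # ~0 up to floating-point round-off
```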