I'm doing some machine learning problems (namely logistic regression), and something I'm trying to do is calculate the decision boundary given a weight vector $\mathbf{w}$.
The decision boundary consists of the points $\mathbf{x}$ such that $\mathbf{w}^T\mathbf{x} = 0$, where $\mathbf{w}^T$ is a $1 \times d$ vector and $\mathbf{x}$ is a $d \times 1$ vector.
What would be the simplest way to find an $\mathbf{x}$, given $\mathbf{w}$, such that the dot product is equal to $0$?
An example is $\mathbf{w} = [-1, 1, 0]$, and by brute force I can find the vector $\mathbf{x} = [1, 1, 0]$, which satisfies $\mathbf{w}^T\mathbf{x} = 0$ (note that in the $\mathbf{x}$ vector, the first term is ALWAYS 1, the bias entry, so it can be omitted from the graph). Then essentially the graph is 2D, and the decision boundary is at $x_1 = 1$.
Any guidance on the best way to solve for large $d$ would be appreciated.
Because you have an underdetermined system (fewer equations than unknowns), you need to assign free ("dummy") variables.
Let's say our vector is $w=(w_1,w_2,w_3,w_4)^T$, with the first entry of $x$ fixed at $1$. Then $w^Tx=w_1+w_2x_2+w_3x_3+w_4x_4=0$.
We have one equation and $3$ unknowns, hence we can choose $3-1=2$ free variables. Let us call $x_2=a$ and $x_3=b$.
Now solve the equation for $x_4$ (assuming $w_4 \neq 0$):
$$x_4=\frac{-w_1-w_2a-w_3b}{w_4}.$$
We can conclude from this that $x=(1,a,b,-\frac{w_1}{w_4}-\frac{w_2}{w_4}a-\frac{w_3}{w_4}b)^T$. This is a 2D plane in 4D space.
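A minimal numpy sketch of this step (the weight values are made up for illustration, and $w_4 \neq 0$ is assumed):

```python
import numpy as np

w = np.array([2.0, -1.0, 3.0, 4.0])  # example weights w_1..w_4 (assumed)

def boundary_point(w, a, b):
    """Return x = (1, a, b, x_4) on the boundary, assuming w[3] != 0."""
    # Solve w_1 + w_2*a + w_3*b + w_4*x_4 = 0 for x_4.
    x4 = (-w[0] - w[1] * a - w[2] * b) / w[3]
    return np.array([1.0, a, b, x4])

x = boundary_point(w, a=0.5, b=-2.0)
print(w @ x)  # numerically 0 for any choice of a and b
```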
EDIT: To see the plane structure better, we can split this expression:
$x=(1,0,0,-\frac{w_1}{w_4})^T+a(0,1,0,-\frac{w_2}{w_4})^T+b(0,0,1,-\frac{w_3}{w_4})^T$. Now it is obvious that the reference point to the plane is $(1,0,0,-\frac{w_1}{w_4})^T$ and the directional vectors are $(0,1,0,-\frac{w_2}{w_4})^T$ and $(0,0,1,-\frac{w_3}{w_4})^T$. And $a$ and $b$ are free parameters.
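This decomposition can be checked numerically; a short sketch with made-up weights (again assuming $w_4 \neq 0$):

```python
import numpy as np

w = np.array([2.0, -1.0, 3.0, 4.0])  # example weights (assumed)

# Reference point and direction vectors from the decomposition above.
p = np.array([1.0, 0.0, 0.0, -w[0] / w[3]])
u = np.array([0.0, 1.0, 0.0, -w[1] / w[3]])
v = np.array([0.0, 0.0, 1.0, -w[2] / w[3]])

# Every x = p + a*u + b*v lies on the decision boundary:
for a, b in [(0.0, 0.0), (1.5, -3.0), (-2.0, 7.0)]:
    x = p + a * u + b * v
    assert abs(w @ x) < 1e-9
```

Note that $w^Tp = 0$ and $w^Tu = w^Tv = 0$, which is why the whole plane lies on the boundary.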