Signed Distance of a Point to a Hyperplane


Let's define an affine function $f(x) = \alpha^Tx+\alpha_0$, where $\alpha$ is a vector (the weights) and $\alpha_0$ is a constant (the bias). Also, define the unit normal vector of the hyperplane $\{x : f(x)=0\}$ as $n=\frac{\alpha}{||\alpha||_2}$.

In linear algebra notation, the signed distance from a point $x$ to the hyperplane is defined as:

$d=\frac{||\alpha||_2}{\alpha^T\alpha}f(x)=\frac{f(x)}{||\alpha||_2}=\frac{f(x)}{||\nabla{f(x)}||_2}$
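As a quick numerical sanity check of this formula (with made-up example values for $\alpha$, $\alpha_0$, and $x$, not taken from any particular source), one can verify that $d = f(x)/||\alpha||_2$ and that the gradient of $f$ is the constant vector $\alpha$:

```python
import numpy as np

# Hypothetical example values, just to sanity-check the formula numerically.
alpha = np.array([3.0, 4.0])   # weight vector
alpha0 = -5.0                  # bias
x = np.array([2.0, 1.0])       # an arbitrary point

def f(x):
    return alpha @ x + alpha0

# Signed distance: d = f(x) / ||alpha||_2
d = f(x) / np.linalg.norm(alpha)

# The gradient of f is constant and equal to alpha; check it numerically
# with central finite differences along each coordinate direction.
eps = 1e-6
grad = np.array([(f(x + eps * e) - f(x - eps * e)) / (2 * eps)
                 for e in np.eye(len(x))])

print(d)                         # f(x) = 3*2 + 4*1 - 5 = 5, ||alpha|| = 5, so d = 1.0
print(np.allclose(grad, alpha))  # True: the numerical gradient matches alpha
```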

I am not sure that I fully grasp how we get the last step, or how we arrive at $d$ in the first place. I do see why the first two expressions are equal, since $\alpha^T\alpha = \langle\alpha,\alpha\rangle = ||\alpha||_2^2$.

But why does $||\nabla{f(x)}||_2=||\alpha||_2$ hold? And how is $d$ derived?