Projection of one vector onto another


I have the following equation for a decision boundary line: $-w_0 = w_1x_1 + w_2x_2$ and I want to prove that the distance from the decision boundary to the origin is $l = \frac{w^Tx}{||w||}$. I am having trouble wrapping my mind around how I can just get the distance from a line to a point. Am I supposed to be averaging the distances of all the points on the line to the point?

Best answer:

The distance from a line to a point (or from a point to a line) is the *minimum* distance (in your case, the Euclidean distance) over all points on the line to that specific point (here, the origin). No averaging is involved.

In other words, the distance is the length of the perpendicular line from origin to your decision boundary.
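A quick numerical sketch of this definition, using hypothetical example weights (not from the question): sample many points on the line, take the minimum distance to the origin, and compare it with the closed-form perpendicular distance $\frac{|w_0|}{\|w\|}$.

```python
import numpy as np

# Hypothetical example weights for the boundary -w0 = w1*x1 + w2*x2
w0, w1, w2 = -5.0, 3.0, 4.0

# Parametrize points on the line: x2 = (-w0 - w1*x1) / w2
x1 = np.linspace(-10, 10, 200_001)
x2 = (-w0 - w1 * x1) / w2

# Minimum distance over the sampled points
min_dist = np.min(np.hypot(x1, x2))

# Closed-form perpendicular distance from the origin
perp_dist = abs(w0) / np.hypot(w1, w2)

print(min_dist, perp_dist)  # the sampled minimum matches the perpendicular distance
```

The minimizing point is exactly the foot of the perpendicular from the origin, which is why the two numbers agree.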

Also, $\frac{w^Tx}{\|w\|}$ by itself is not the distance from the origin to your line. You might want to recheck your calculations.

As @amd mentioned, $\frac{w^Tx}{\|w\|}$ is the signed distance from a point $x$ to the line through the origin with normal $w$. Since every point $x$ on your decision boundary satisfies $w^Tx = -w_0$, the signed distance from the origin to the boundary is $\frac{-w_0}{\|w\|}$.