An expression for $g(x)$ where $g$ is a two-class discriminant function


I'm reading Chapter 5.2.1, The Two-Category Case of Duda and Hart's Pattern Classification, where a discriminant function for two-class classification is given by
$$ g(x) = w^T x + w_0 $$

where $w \in \mathbb{R}^d$ is a weight vector, $w_0$ is a scalar bias, and $x \in \mathbb{R}^d$. Then for two points $x_1, x_2$ on the decision surface, a hyperplane $H$, we have:

$$ 0=g(x_1)=g(x_2)=w^T x_1 + w_0 = w^T x_2 + w_0 \implies w^T(x_1-x_2)=0$$

Then we can write $$x = x_p + r \frac{w}{||w||}$$ where $x_p$ is the projection of $x$ onto the hyperplane $H$, and $r$ is the signed distance from $x$ to the hyperplane.

The authors then write:

Then since $g(x_p) = 0$, $$g(x) = w^Tx + w_0 = r||w||$$

From what I understand, since $x_p$ is on the hyperplane $H$, then $w^T x_p = 0$, and so $$g(x) = g(x_p+ r \frac{w}{||w||}) = w^T(x_p + r \frac{w}{||w||}) + w_0 = r ||w|| + w_0$$

So I'm wondering where the $w_0$ went? I'm not sure how the authors obtained the desired expression. Any insights appreciated.


Best answer:

The hyperplane $H$ is defined not by the equation $w^T x = 0$, but by the equation $w^T x + w_0 = 0$. So a point $x_p$ on $H$ satisfies $w^T x_p = -w_0$, not $w^T x_p = 0$. Substituting into your expansion,
$$g(x) = w^T\left(x_p + r \frac{w}{||w||}\right) + w_0 = (w^T x_p + w_0) + r||w|| = 0 + r||w||,$$
which is why the $w_0$ cancels.
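The cancellation is easy to check numerically. Below is a minimal sketch (the particular values of $w$, $w_0$, and $x$ are arbitrary, not from the book) verifying that $w^T x_p + w_0 = 0$ while $w^T x_p \neq 0$, and that $g(x) = r\,||w||$ with $r = g(x)/||w||$:

```python
import numpy as np

# Arbitrary example values (not from Duda & Hart)
w = np.array([3.0, 4.0])   # weight vector, ||w|| = 5
w0 = -2.0                  # scalar bias
x = np.array([1.0, 2.0])   # an arbitrary point

g_x = w @ x + w0                      # g(x) = w^T x + w_0
norm_w = np.linalg.norm(w)
r = g_x / norm_w                      # signed distance from x to H
x_p = x - r * w / norm_w              # projection of x onto H

# x_p lies on H, i.e. w^T x_p + w_0 = 0 ...
print(w @ x_p + w0)        # ≈ 0.0
# ... but w^T x_p itself equals -w_0, generally nonzero:
print(w @ x_p)             # ≈ 2.0
# and g(x) = r * ||w||, with the w_0 absorbed into g(x_p) = 0:
print(g_x, r * norm_w)     # both 9.0
```

Note that $r$ comes out signed: it is positive when $x$ lies on the side of $H$ that $w$ points toward, and negative on the other side.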