How does a hinge loss function work?


I have been taking an online course on Artificial Neural Networks and can't understand what the expression

$$\max\{0,\ 1 - y_i(w \cdot x_i + b)\}$$

means. What does $\max\{0,\ 1 - \text{(this part I understand)}\}$ mean?

I have searched online, but everything I found explains other details and skips over how this expression itself works. Thank you.

2 Answers

Best answer:

When training a support vector machine, we hope that we will have $w \cdot x_i + b \geq 1$ for positive examples (with $y_i = 1$) and also that $w \cdot x_i + b \leq -1$ for negative examples (with $y_i = -1$). In other words, we hope that $y_i (w \cdot x_i + b) \geq 1$ for all $i$. If the $i$th training example satisfies this condition, then there is no penalty for the $i$th training example. If the $i$th training example violates this condition, then we pay a penalty that is equal to the amount by which the condition is violated.
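The penalty described above can be sketched as the hinge loss $\max\{0,\ 1 - y_i(w \cdot x_i + b)\}$. Here is a minimal illustration in Python with NumPy; the data values are made up purely for demonstration:

```python
import numpy as np

def hinge_loss(w, b, X, y):
    # margins y_i * (w . x_i + b); the penalty is how far each margin
    # falls short of 1 (and zero when the margin is at least 1)
    margins = y * (X @ w + b)
    return np.maximum(0.0, 1.0 - margins)

# hypothetical toy data: two 2-D positive examples
w = np.array([1.0, -1.0])
b = 0.0
X = np.array([[2.0, 0.0],    # margin 2   >= 1 -> no penalty
              [0.5, 0.0]])   # margin 0.5 <  1 -> penalty 0.5
y = np.array([1.0, 1.0])

print(hinge_loss(w, b, X, y))  # -> [0.  0.5]
```

The first example satisfies $y_i(w \cdot x_i + b) \geq 1$, so its loss is zero; the second violates the condition by $0.5$, and that shortfall is exactly its penalty.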

Another answer:

The expression means that we take whichever of the two quantities inside the curly brackets is larger, so it evaluates to $0$ if $1 - y_i(w \cdot x_i + b) < 0$ and to $1 - y_i(w \cdot x_i + b)$ otherwise.
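A quick numeric check of that case split, using Python's built-in `max` and hypothetical scores (here `score` stands for $w \cdot x_i + b$):

```python
def hinge(y_i, score):
    # score stands for w . x_i + b
    return max(0.0, 1.0 - y_i * score)

print(hinge(1, 2.0))   # 1 - (1)(2.0)  = -1 < 0, so the max picks 0.0
print(hinge(-1, 0.3))  # 1 - (-1)(0.3) = 1.3 > 0, so the loss is 1.3
```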