I have been taking an online course on Artificial Neural Networks and can't understand what the expression
$$\max\{0,\; 1 - y_i (w \cdot x_i + b)\}$$
means. What does the $\max\{0, 1 - \dots\}$ part do? (The inner part, $y_i (w \cdot x_i + b)$, I understand.) I have searched online, but everything else is explained except how this expression works. Thank you.

When training a support vector machine, we hope that $w \cdot x_i + b \geq 1$ for positive examples (with $y_i = 1$) and that $w \cdot x_i + b \leq -1$ for negative examples (with $y_i = -1$). In other words, we hope that $y_i (w \cdot x_i + b) \geq 1$ for all $i$. If the $i$th training example satisfies this condition, then $1 - y_i (w \cdot x_i + b) \leq 0$, so the max picks $0$ and there is no penalty for that example. If the $i$th training example violates this condition, then we pay a penalty equal to the amount by which it is violated, namely $1 - y_i (w \cdot x_i + b) > 0$. The expression $\max\{0,\; 1 - y_i (w \cdot x_i + b)\}$ (the hinge loss) captures both cases at once.
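Here is a minimal sketch of that logic in plain Python (the weights `w`, bias `b`, and the example points are made-up values for illustration):

```python
def hinge_loss(w, b, x, y):
    """Hinge loss for one example: max(0, 1 - y * (w . x + b)).

    If the example satisfies y * (w . x + b) >= 1, the loss is 0;
    otherwise the loss is the amount by which the condition is violated.
    """
    margin = y * (sum(wi * xi for wi, xi in zip(w, x)) + b)
    return max(0.0, 1.0 - margin)

w = [2.0, -1.0]  # hypothetical weight vector
b = 0.5          # hypothetical bias

# Positive example classified with margin >= 1: no penalty.
x_good = [1.0, 0.5]   # w . x + b = 2.0, so y * (w . x + b) = 2.0 >= 1
print(hinge_loss(w, b, x_good, 1))   # 0.0

# Positive example that violates the margin condition:
x_bad = [0.2, 0.0]    # w . x + b = 0.9, so the shortfall is 1 - 0.9
print(hinge_loss(w, b, x_bad, 1))    # approximately 0.1
```

Note that the loss grows linearly as the margin shrinks, so misclassified points far on the wrong side of the boundary are penalized more heavily.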