Finding the gradient of a soft-margin multiclass SVM under different conditions


This is a homework question that I need some help with. PM me or something if you think I should take this question down.

So I have the objective function and the loss function of a multi-class SVM.
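(The formulas themselves don't seem to have survived here. Judging from the conditions below, the per-example loss is presumably the usual multiclass (Crammer–Singer) hinge loss, something like:

$$L\big((\omega_1,\dots,\omega_k),(x,y)\big) = \max\left(0,\; 1 - \omega_y^T x + \max_{y'\neq y}\,\omega_{y'}^T x\right)$$

but treat this as my guess at the missing formula, not the assignment's exact statement.)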

I am asked to compute the partial derivative $\frac{\partial L((\omega_1,\dots,\omega_k), (x,y))}{\partial \omega_{j,l}}$ (for use in gradient descent) under the conditions where

$\hat{y} = \operatorname{argmax}_{y'\neq y}\,\omega_{y'}^T x$,

$\omega_{j,l}$ is the $l$th entry of $\omega_{j}$, and

$x_l$ is the $l$th entry of $x$,

when

1) $\omega_y^T x \geq \omega_{\hat{y}}^T x + 1$

2) $\omega_y^T x < \omega_{\hat{y}}^T x + 1$ and $j = y$

3) $\omega_y^T x < \omega_{\hat{y}}^T x + 1$ and $j = \hat{y}$

4) $\omega_y^T x < \omega_{\hat{y}}^T x + 1$ and $j \neq y$ and $j \neq \hat{y}$

I just don't understand why these four conditions would matter, or how I should approach this question.
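For intuition: the multiclass hinge loss is piecewise linear in the weights, so its (sub)gradient depends on which piece you are on, and the four conditions enumerate exactly those pieces. A minimal sketch in Python/NumPy (the function name, matrix shapes, and the assumption that the loss is the standard Crammer–Singer hinge are my own choices, not taken from the assignment):

```python
import numpy as np

def hinge_subgradient(W, x, y):
    """Subgradient of the (assumed) multiclass hinge loss
    L = max(0, 1 - w_y^T x + w_yhat^T x),
    where yhat = argmax_{y' != y} w_{y'}^T x.
    W has one row per class (k x d); returns dW of the same shape.
    """
    scores = W @ x
    # yhat: highest-scoring class other than the true label y
    scores_masked = scores.copy()
    scores_masked[y] = -np.inf
    y_hat = int(np.argmax(scores_masked))

    dW = np.zeros_like(W)
    # Condition 1: margin satisfied, so the loss sits on its flat
    # (zero) piece and every partial derivative is 0.
    if scores[y] >= scores[y_hat] + 1:
        return dW
    # Margin violated: L = 1 - w_y^T x + w_yhat^T x, which is linear, so
    #   Condition 2 (j = y):     dL/dw_{y,l}    = -x_l
    #   Condition 3 (j = yhat):  dL/dw_{yhat,l} = +x_l
    #   Condition 4 (other j):   dL/dw_{j,l}    =  0
    dW[y] = -x
    dW[y_hat] = x
    return dW
```

So each condition simply picks out one region where the loss is differentiable with a constant gradient; a gradient-descent step then uses whichever case currently applies.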