I have the following weighted Jaccard metric:
$$D_{WJ}(X_{i},X_{j})=1- \frac{\sum_{k} \min(X_{ik},X_{jk})}{\sum_{k} \max(X_{ik},X_{jk})}, \qquad X_{i},X_{j}\in \mathbb{R}^{n}$$
I want to find the derivative of $D_{WJ}$ with respect to the components in order to perform some sort of gradient descent algorithm, but I can't seem to find a closed-form expression for the derivative because of the $\min$/$\max$.

Reasoning: I'm trying to run an iterative procedure that depends on $D_{WJ}$, so I need to work out a step in the desired descent direction before implementing it.

I would appreciate some help.
$$\begin{align} & \min(x,y) = \begin{cases}x & \text{if $x<y$,} \\ y & \text{otherwise,}\end{cases} \\ \implies \quad & \frac{\mathrm d}{\mathrm dx}\min(x,y) =\begin{cases}1 & \text{if $x<y$,} \\ 0 & \text{otherwise,}\end{cases} \end{align}$$ and similarly for the second argument and for the case of $\max$. Then you can apply the quotient rule / chain rule / etc. to calculate the full derivative.
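As a sanity check, here is a minimal NumPy sketch of this recipe: the indicator derivatives of $\min$/$\max$ combined with the quotient rule give a (sub)gradient of $D_{WJ}$ with respect to $X_i$, which we verify against central finite differences. The function names are mine, not from the question; on ties ($x_k = y_k$) the derivative is undefined and the code arbitrarily uses $0$.

```python
import numpy as np

def d_wj(x, y):
    # Weighted Jaccard distance: 1 - sum_k min(x_k, y_k) / sum_k max(x_k, y_k)
    return 1.0 - np.minimum(x, y).sum() / np.maximum(x, y).sum()

def grad_d_wj_x(x, y):
    # (Sub)gradient of d_wj with respect to x, using
    #   d/dx min(x,y) = [x < y],  d/dx max(x,y) = [x > y]
    # and the quotient rule on N/M with N = sum min, M = sum max.
    # At ties (x_k == y_k) the derivative is undefined; both
    # indicators are 0 there, an arbitrary choice.
    s_min = np.minimum(x, y).sum()
    s_max = np.maximum(x, y).sum()
    d_min = (x < y).astype(float)  # per-component derivative of the numerator
    d_max = (x > y).astype(float)  # per-component derivative of the denominator
    return -(d_min * s_max - s_min * d_max) / s_max**2

# Finite-difference check at a random tie-free point.
rng = np.random.default_rng(0)
x = rng.random(5) + 0.1
y = rng.random(5) + 0.1
eps = 1e-6
numeric = np.array([
    (d_wj(x + eps * np.eye(5)[k], y) - d_wj(x - eps * np.eye(5)[k], y)) / (2 * eps)
    for k in range(5)
])
assert np.allclose(grad_d_wj_x(x, y), numeric, atol=1e-5)
```

Away from ties the check passes, so the piecewise formula does give a usable descent direction; near ties the distance is still continuous but not differentiable, which is typically harmless for a gradient-descent-style iteration.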