Partial derivative of squared Euclidean distance


I am having a hard time computing the partial derivatives of the function \begin{equation} d(\theta, \textbf{x}_i) = \sum_{j=1, j \neq i}^{n_v} \vert\vert f(\textbf{x}_i^+) - f(\textbf{x}_j^+)\vert\vert^2_2 - \vert\vert f(\textbf{x}_i^+) - f(\textbf{x}^-)\vert\vert^2_2, \end{equation} where I define $\phi_i^+ = \vert\vert f(\textbf{x}_i^+) - f(\textbf{x}_j^+)\vert\vert^2_2$ and $\phi_i^- = \vert\vert f(\textbf{x}_i^+) - f(\textbf{x}^-)\vert\vert^2_2$ to simplify the notation.
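If I substitute this shorthand (keeping the sum over $j$ where it appears above, and suppressing that $\phi_i^+$ also depends on $j$), the objective reads \begin{equation} d(\theta, \textbf{x}_i) = \sum_{j=1, j \neq i}^{n_v} \phi_i^+ - \phi_i^-. \end{equation}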

So, I would like to find $\frac{\partial}{\partial \phi_i^+} d(\theta, \textbf{x}_i)$ and $\frac{\partial}{\partial \phi_i^-} d(\theta, \textbf{x}_i)$.

Any help will be appreciated.

I know that the partial derivative of the (unsquared) Euclidean distance is \begin{equation} \frac{\partial}{\partial \textbf{x}} \sum_{j=1}^{n} \vert\vert \textbf{x} - \textbf{x}_j\vert\vert_2 = \sum_{j=1}^{n} \frac{\textbf{x} - \textbf{x}_j}{\vert\vert \textbf{x}-\textbf{x}_j\vert\vert_2}, \end{equation}

but how can I get it for a squared Euclidean distance? Thank you.
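For reference, here is a small finite-difference sanity check of the gradient formula I am starting from. This is only a toy numpy sketch with random vectors; the names `x`, `x_j`, and `dist` are placeholders and nothing here refers to my actual $f$ or data.

```python
import numpy as np

# Sanity check: grad_x ||x - x_j||_2 = (x - x_j) / ||x - x_j||_2,
# compared against central finite differences on toy vectors.
rng = np.random.default_rng(0)
x = rng.normal(size=5)
x_j = rng.normal(size=5)

def dist(v):
    # Plain (unsquared) Euclidean distance to the fixed point x_j
    return np.linalg.norm(v - x_j)

# Closed-form gradient of the unsquared distance
analytic = (x - x_j) / np.linalg.norm(x - x_j)

# Central finite differences along each coordinate direction
eps = 1e-6
numeric = np.array([
    (dist(x + eps * e) - dist(x - eps * e)) / (2 * eps)
    for e in np.eye(5)
])

print(np.allclose(analytic, numeric, atol=1e-5))  # expect True
```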