I have a function $ f = \sqrt{ (x_i -x_j)^2 +(y_i-y_j)^2 }$ and I want to find the extremal points. Therefore, I calculated the gradient:
$ g= \nabla f = \frac{1}{\sqrt{(x_i -x_j)^2 +(y_i-y_j)^2}} \begin{bmatrix} x_i - x_j \\ x_j - x_i \\ y_i -y_j \\ y_j - y_i \end{bmatrix}$.
Then I defined $ \Delta x := x_i -x_j$ and $\Delta y := y_i - y_j$. So,
$ g= \frac{1}{\sqrt{ \Delta x ^2 + \Delta y^2}} \begin{bmatrix} \Delta x \\ -\Delta x \\ \Delta y \\ -\Delta y \end{bmatrix} = 0$.
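As a quick numerical sanity check of this gradient (a Python sketch; the helper name `grad_f` is my own, not from the problem), note that its norm is constantly $\sqrt{2}$ wherever it is defined, so the equation $g = 0$ has no solution:

```python
import math

def grad_f(xi, xj, yi, yj):
    """Gradient of f = sqrt((xi-xj)^2 + (yi-yj)^2), defined away from xi=xj, yi=yj."""
    dx, dy = xi - xj, yi - yj
    r = math.sqrt(dx**2 + dy**2)
    return [dx / r, -dx / r, dy / r, -dy / r]

# Sample point away from the singularity: the gradient's norm is sqrt(2).
g = grad_f(3.0, 1.0, 5.0, 2.0)
norm = math.sqrt(sum(c**2 for c in g))
print(norm)  # sqrt(2) ~ 1.41421...
```

The norm squared is $(\Delta x^2 + \Delta x^2 + \Delta y^2 + \Delta y^2)/(\Delta x^2 + \Delta y^2) = 2$ for any point where the gradient exists.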
How do I calculate the extrema and handle the singularity?
thanks for your help!
EDIT:
I forgot to add information about the domain: $f: \mathbb{R}^4 \rightarrow \mathbb{R}$. I also know that the minimum occurs at $\Delta x =\Delta y=0$. But how can I prove this mathematically, i.e., how do I handle the singularity?
The function $f$ is clearly monotonically increasing in $\Delta x^2$ and $\Delta y^2$. Therefore, it has exactly one minimum, at $\Delta x = \Delta y = 0$.
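The minimum can also be verified directly, without calculus: since

$$ f = \sqrt{\Delta x^2 + \Delta y^2} \;\ge\; 0 = f\big|_{\Delta x = \Delta y = 0}, $$

with equality exactly when $\Delta x = \Delta y = 0$, the origin is the global minimum.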
Unfortunately, you can't get this from the gradient, since it doesn't exist at the origin. To see this, use a limit approach:
$$\lim_{\substack{\Delta x \to 0^+ \\ \Delta y = m\,\Delta x}}\frac{\Delta x}{\sqrt{\Delta x^2 +\Delta y^2}}=\frac{1}{\sqrt{1 + m^2}}$$ Note that this value depends on the slope $m$ of the line we use to get to the origin, so the limit doesn't exist.
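A quick numerical illustration of this direction dependence (a Python sketch; the slope values are arbitrary examples):

```python
import math

def ratio(dx, dy):
    """The x_i-component of the gradient: dx / sqrt(dx^2 + dy^2)."""
    return dx / math.sqrt(dx**2 + dy**2)

# Approach the origin along lines dy = m*dx (dx > 0) for several slopes m.
for m in (0.0, 1.0, 2.0):
    dx = 1e-9                         # arbitrarily close to the origin
    print(m, ratio(dx, m * dx), 1 / math.sqrt(1 + m**2))
```

Along each line the value is $1/\sqrt{1+m^2}$ no matter how small $\Delta x$ gets, and it differs from line to line, so no single limit exists at the origin.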