For the purpose of plotting a function, I want to calculate the (minimum) distance from a point to this function (that is, the distance of each pixel to the desired function).
As an example, let's take the function
$f(x) = x^2$
and the point
$P = (1,0)$
First, I calculate the distance to every point on the function:
$d = \sqrt{(x-1)^2 + (y-0)^2}$
$d = \sqrt{(x-1)^2 + (x^2-0)^2}$
$d^2 = x^4+x^2-2x+1$
Now I turn this into a minimization problem to get the minimum distance: I take the derivative, solve for where it equals zero to find the minima of the distance, and choose the smallest result. (I minimize the square of the distance to make the calculation easier.)
$(d^2)' = 4x^3+2x-2 = 0$
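For this particular example the cubic can be solved numerically; here is a minimal sketch using Newton's method (the derivative $12x^2+2$ is always positive, so this cubic is monotone and has exactly one real root):

```python
import math

# Derivative of the squared distance d^2 = x^4 + x^2 - 2x + 1
# (for f(x) = x^2 and P = (1, 0)); setting it to zero gives the cubic.
def dd2(x):
    return 4 * x**3 + 2 * x - 2

def dd2_prime(x):
    return 12 * x**2 + 2

# Newton's method; converges quickly because the cubic is monotone.
x = 1.0
for _ in range(50):
    x -= dd2(x) / dd2_prime(x)

# Distance from P = (1, 0) to the closest point (x, x^2) on the parabola.
d = math.sqrt((x - 1) ** 2 + (x**2) ** 2)
print(f"x = {x:.6f}, minimum distance = {d:.6f}")
```

The root comes out near $x \approx 0.59$, which matches checking the sign of $4x^3+2x-2$ by hand.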
However, this is not very generalizable. Let's say our function is not $x^2$ but something like $10 \sin(x)$. In that case the resulting distance function would have a lot of local minima. Also, when I program this, I would need the computer to calculate derivatives of any function that I choose to plot.
How do graphing/plotting programs calculate this? Is there not a simpler, more generalizable way to solve this problem? Maybe by leveraging the fact that only at the shortest distance does the equation have exactly one solution (think of a circle around the point touching the function)?
I will be very thankful for your help!
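One common numerical approach (a sketch, not necessarily what any particular plotting program does) avoids derivatives entirely: sample the curve densely and, for each pixel, take the minimum distance to the sample points. The parameters below (sampling range and count) are illustrative choices:

```python
import math

def min_distance_to_curve(f, px, py, x_min=-5.0, x_max=5.0, samples=2000):
    """Approximate the minimum distance from (px, py) to the graph of f
    by sampling the curve densely and taking the closest sample point."""
    best = float("inf")
    for i in range(samples + 1):
        x = x_min + (x_max - x_min) * i / samples
        y = f(x)
        best = min(best, math.hypot(x - px, y - py))
    return best

# For f(x) = x^2 and P = (1, 0) this comes out close to the exact value
# obtained by minimizing the quartic d^2 analytically.
print(min_distance_to_curve(lambda x: x * x, 1.0, 0.0))
```

Brute force is $O(\text{pixels} \times \text{samples})$; in practice the samples can be put in a spatial index (e.g. a k-d tree) so each pixel only needs a nearest-neighbor query, and functions with many local minima like $10\sin(x)$ pose no special difficulty.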
Hint: This can be solved easily using Lagrange multipliers.
The answer:
You can consider the function you want to minimise as the distance (or its square) from your point $P = (p_x, p_y)$: $$ d(x, y) = (x-p_x)^2 + (y-p_y)^2$$ and your constraint function as $g(x, y) = f(x)-y$.
Now all you need to do is find the critical points of $d(x, y) + \lambda g(x, y)$. You can generalise it to higher dimensions or other distances.
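Applied to the original example ($f(x) = x^2$, $P = (1,0)$), this recovers the same cubic as before. The critical-point conditions for $d(x,y) + \lambda g(x,y)$ with $g(x,y) = x^2 - y$ are
$$\frac{\partial}{\partial x}: \; 2(x-1) + 2\lambda x = 0, \qquad \frac{\partial}{\partial y}: \; 2y - \lambda = 0, \qquad g = 0: \; y = x^2.$$
From the second and third equations, $\lambda = 2y = 2x^2$; substituting into the first gives
$$2(x-1) + 4x^3 = 0 \quad\Longleftrightarrow\quad 4x^3 + 2x - 2 = 0,$$
exactly the equation obtained earlier by differentiating $d^2$ directly.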
Also, I am obviously assuming the function $f(x)$ to be well-behaved here.