Let $f\colon\Bbb{R}^n\to\Bbb{R}$ and $\mathbf{x}_0\in\Bbb{R}^n$. How could I (numerically) find the minimum Euclidean distance between the level set $f(\mathbf{x})=0$ and $\mathbf{x}_0$, given that $f$ is known analytically? I am looking for a numerical solution in the C programming language.
Apologies if this question does not fit the site's requirements, but I think I will have more luck on Math.SE than on Stack Overflow.
Thanks in advance!
This is an optimization problem with one equality constraint. By the method of Lagrange multipliers, you have to consider the function $$ L(x,\lambda)=\frac12\|x-x_0\|^2 + \lambda\, f(x) $$ and find its critical points. These satisfy $$ 0=f(x),\qquad 0=\frac{\partial L}{\partial x}=x-x_0+\lambda\,\nabla f(x), $$ which you can try to solve via Newton's method.
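Concretely, each Newton step for the unknowns $(x,\lambda)$ solves the linear system $$ \begin{pmatrix} I+\lambda\,\nabla^2 f(x) & \nabla f(x)\\ \nabla f(x)^{\top} & 0\end{pmatrix}\begin{pmatrix}\Delta x\\ \Delta\lambda\end{pmatrix} = -\begin{pmatrix} x-x_0+\lambda\,\nabla f(x)\\ f(x)\end{pmatrix}. $$ Below is a minimal C sketch of this iteration for $n=2$, assuming a hypothetical constraint $f(x,y)=x^2+y^2-1$ (the unit circle) and the point $x_0=(2,1)$; the functions `f`, `grad_f`, `hess_f`, the starting guess, and the tolerances are placeholders you would replace with your own problem data.

```c
#include <stdio.h>
#include <math.h>

/* Hypothetical example constraint: the unit circle f(x,y) = x^2 + y^2 - 1. */
static double f(double x, double y) { return x*x + y*y - 1.0; }
static void grad_f(double x, double y, double g[2]) { g[0] = 2.0*x; g[1] = 2.0*y; }
static void hess_f(double x, double y, double h[2][2]) {
    (void)x; (void)y;                 /* Hessian is constant for this example */
    h[0][0] = 2.0; h[0][1] = 0.0;
    h[1][0] = 0.0; h[1][1] = 2.0;
}

/* Solve the 3x3 linear system A s = b by Gaussian elimination with partial pivoting. */
static int solve3(double A[3][3], double b[3], double s[3]) {
    int p[3] = {0, 1, 2};
    for (int k = 0; k < 3; ++k) {
        int m = k;
        for (int i = k + 1; i < 3; ++i)
            if (fabs(A[p[i]][k]) > fabs(A[p[m]][k])) m = i;
        int t = p[k]; p[k] = p[m]; p[m] = t;
        if (fabs(A[p[k]][k]) < 1e-14) return -1;          /* singular Jacobian */
        for (int i = k + 1; i < 3; ++i) {
            double c = A[p[i]][k] / A[p[k]][k];
            for (int j = k; j < 3; ++j) A[p[i]][j] -= c * A[p[k]][j];
            b[p[i]] -= c * b[p[k]];
        }
    }
    for (int k = 2; k >= 0; --k) {
        double sum = b[p[k]];
        for (int j = k + 1; j < 3; ++j) sum -= A[p[k]][j] * s[j];
        s[k] = sum / A[p[k]][k];
    }
    return 0;
}

int main(void) {
    const double x0[2] = {2.0, 1.0};     /* the fixed point x_0 */
    double x = 1.0, y = 0.5, lam = 0.0;  /* initial guess for (x, lambda) */

    for (int it = 0; it < 50; ++it) {
        double g[2], h[2][2];
        grad_f(x, y, g);
        hess_f(x, y, h);

        /* Residual F = ( x - x0 + lambda * grad f(x) ; f(x) ) */
        double F[3] = { x - x0[0] + lam*g[0],
                        y - x0[1] + lam*g[1],
                        f(x, y) };

        /* Jacobian of F with respect to (x, y, lambda) */
        double J[3][3] = {
            {1.0 + lam*h[0][0], lam*h[0][1],       g[0]},
            {lam*h[1][0],       1.0 + lam*h[1][1], g[1]},
            {g[0],              g[1],              0.0 }
        };

        double rhs[3] = {-F[0], -F[1], -F[2]}, s[3];
        if (solve3(J, rhs, s)) break;     /* stop if the Newton system is singular */

        x += s[0]; y += s[1]; lam += s[2];
        if (fabs(s[0]) + fabs(s[1]) + fabs(s[2]) < 1e-12) break;
    }

    double dist = hypot(x - x0[0], y - x0[1]);
    printf("closest point: (%.10f, %.10f), distance = %.10f\n", x, y, dist);
    return 0;
}
```

Compile with e.g. `gcc -O2 mindist.c -lm`; for $x_0=(2,1)$ the iteration should converge to $x_0/\sqrt5$ at distance $\sqrt5-1\approx1.236$. Keep in mind that Newton's method only finds a critical point of the distance (and needs a reasonable starting guess), so for non-convex level sets you may have to try several initial points and keep the smallest distance found.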