background: I'm working on a problem that uses the implicit function theorem to show the existence of a solution. I have a continuously differentiable function $f(x,y)=0$ whose Jacobian with respect to $y$ is invertible at a point $(a,b)$. A direct corollary of the implicit function theorem asserts that for a sufficiently small change in $a$, say to $a+\delta$, there is a $b+\epsilon$ such that $f(a+\delta,b+\epsilon)=0$.
Question: How does one find such a new solution in a concrete problem?
Example: Let's look at a simple function like $f(x,y)=x^2+y^2-1=0$. $(\frac{\sqrt{2}}{2},\frac{\sqrt{2}}{2})$ is a solution, and the Jacobian of $f$ is nonzero at this point. How do you find a solution of the form $(\frac{\sqrt{2}}{2}+0.1,y)$?
Note: I know that the above example can be solved directly, but I want a method (something like Newton's method, maybe) that also works for higher-dimensional problems, where $x=(x_1,\ldots,x_m)$ and $y=(y_1,\ldots,y_n)$.
In your example, the IFT guarantees the existence of a $C^1$ function $y$ defined in a neighborhood of $\frac{1}{\sqrt{2}}$ such that $f(x,y(x)) = 0$. Furthermore, $y'\left(\tfrac{1}{\sqrt{2}}\right) = -\left(\frac{\partial f}{\partial y}\right)^{-1}\frac{\partial f}{\partial x}\bigg|_{\left(\frac{1}{\sqrt{2}},\frac{1}{\sqrt{2}}\right)} = -1$.
To solve $f(\frac{1}{\sqrt{2}}+0.1, y) = 0$, run Newton's method in $y$ starting from the initial estimate $y_0 = y(\frac{1}{\sqrt{2}}) + y'(\frac{1}{\sqrt{2}})\cdot 0.1 = \frac{1}{\sqrt{2}}-0.1$.
This converges quickly; the convergence is quadratic, as expected of Newton's method.
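A short Python sketch of this procedure (a reconstruction, since the original script is not shown): it forms the IFT-based initial estimate $y_0 = \frac{1}{\sqrt{2}} - 0.1$ and then applies scalar Newton iteration $y \leftarrow y - f(x_0, y)/\frac{\partial f}{\partial y}(x_0, y)$ at the fixed $x_0 = \frac{1}{\sqrt{2}} + 0.1$. The function names `newton_solve` and `df_dy` are hypothetical, chosen for this sketch.

```python
import math

def f(x, y):
    """f(x, y) = x^2 + y^2 - 1; its zero set is the unit circle."""
    return x * x + y * y - 1.0

def df_dy(x, y):
    """Partial derivative of f with respect to y (the "Jacobian" here)."""
    return 2.0 * y

def newton_solve(x, y0, tol=1e-12, max_iter=50):
    """Solve f(x, y) = 0 for y with x held fixed, starting from y0."""
    y = y0
    for _ in range(max_iter):
        step = f(x, y) / df_dy(x, y)
        y -= step
        if abs(step) < tol:
            break
    return y

# IFT-based predictor: y(a + delta) ~ y(a) + y'(a) * delta, with y'(a) = -1.
a = 1.0 / math.sqrt(2)
delta = 0.1
y0 = a - delta                 # initial estimate 1/sqrt(2) - 0.1
y = newton_solve(a + delta, y0)
print(y)                       # close to sqrt(1 - (a + delta)^2)
```

In higher dimensions the same predictor-corrector idea applies: the tangent step uses $-\left(\frac{\partial f}{\partial y}\right)^{-1}\frac{\partial f}{\partial x}\,\delta$, and each Newton correction solves a linear system with the Jacobian $\frac{\partial f}{\partial y}$ instead of dividing by a scalar derivative.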