Do multivariable bracketing techniques exist for finding roots?


I have been researching the classic bracketing root-finding techniques such as bisection, the secant method, Dekker's method, and Brent's method; however, none of them seems to extend to several variables. Only the secant method has a multivariable generalization, Broyden's method; however, it uses a secant-like update to approximate the Jacobian and then applies Newton's method with that approximation, so it does not really qualify as a bracketing method.

I have been approximating a root of a multivariable function $f(x)=(f_{1}(x_{1},\dots,x_{n}),\dots,f_{n}(x_{1},\dots,x_{n}))=0$ using the multivariable Newton's method; however, I need to bracket the root with certainty in a multivariable setting.
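For concreteness, a multivariable Newton iteration of this kind might look as follows (a minimal sketch only; the test system, its Jacobian, and the starting point are hypothetical, chosen just to illustrate the update step):

```python
# Sketch of a multivariable Newton iteration for a 2D system.
# The test function below is hypothetical, for illustration only.
def newton(f, jac, x, tol=1e-10, max_iter=50):
    for _ in range(max_iter):
        fx = f(x)
        if max(abs(v) for v in fx) < tol:
            break
        # Solve J * dx = -f(x) for a 2x2 Jacobian via Cramer's rule.
        (a, b), (c, d) = jac(x)
        det = a * d - b * c
        dx = ((-fx[0] * d + fx[1] * b) / det,
              (fx[0] * c - fx[1] * a) / det)
        x = (x[0] + dx[0], x[1] + dx[1])
    return x

# Hypothetical example: f(x, y) = (x^2 - y, y - 1), one root at (1, 1).
f = lambda p: (p[0]**2 - p[1], p[1] - 1.0)
jac = lambda p: ((2 * p[0], -1.0), (0.0, 1.0))
root = newton(f, jac, (2.0, 2.0))  # converges toward (1, 1)
```

Note that nothing in this iteration guarantees that the root stays inside a known region, which is exactly the gap a bracketing method would fill.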

Does anyone know of a bracketing technique that applies in a multivariable setting, or know how to safely bracket a root of a multivariable function after applying Newton's method?

Answer

By bracketing each component separately, one can iteratively solve for more and more components at a time.


Example

Consider the following function:

$$f(x,y)=(x^2+y-1,y^2-2x-1).$$


Brackets

We can then see that for all $y\in[-3,-1.5]$,

$$0\in[f_1(1.5,y),f_1(2,y)]=[y+1.25,y+3],$$

and for all $x\in[1.5,2]$,

$$0\in[f_2(x,-1.5),f_2(x,-3)]=[1.25-2x,8-2x].$$

This means that for every $y\in[-3,-1.5]$ we can find $x\in[1.5,2]$ where $f_1=0$, and for every $x\in[1.5,2]$ we can find $y\in[-3,-1.5]$ where $f_2=0$.
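These sign conditions can be checked numerically. The following is a quick sanity check of the brackets above, sampling the stated intervals:

```python
# Sanity check of the brackets for
# f(x, y) = (x^2 + y - 1, y^2 - 2x - 1) on [1.5, 2] x [-3, -1.5].
def f1(x, y):
    return x**2 + y - 1

def f2(x, y):
    return y**2 - 2 * x - 1

# For every sampled y in [-3, -1.5], f1 changes sign across x in [1.5, 2]:
for k in range(11):
    y = -3.0 + 1.5 * k / 10
    assert f1(1.5, y) <= 0 <= f1(2.0, y)

# For every sampled x in [1.5, 2], f2 changes sign across y in [-3, -1.5]:
for k in range(11):
    x = 1.5 + 0.5 * k / 10
    assert f2(x, -1.5) <= 0 <= f2(x, -3.0)

print("brackets verified")
```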


Solving

I would suggest drawing a diagram while going through the following steps, and taking note of the signs of $f_1$ and $f_2$ along the boundaries of $[1.5,2]\times[-3,-1.5]$.

Now fix $y_0=-3$ and solve for $x_0$ via bracketing from $f_1$:

$$f_1(1.5\le x_0\le2,-3)=0\implies x_0=\dots$$

Similarly fix $y_1=-1.5$ and solve for $x_1$ via bracketing from $f_1$:

$$f_1(1.5\le x_1\le2,-1.5)=0\implies x_1=\dots$$

Now we have $f(x_0,y_0)=(0,+)$ and $f(x_1,y_1)=(0,-)$. We then choose a new $y_2$ with $y_0<y_2<y_1$ and repeat the bracketing to solve for $x_2$:

$$f_1(1.5\le x_2\le2,y_2)=0\implies x_2=\dots$$

Now we have $f(x_2,y_2)=(0,?)$, where the sign of $f_2(x_2,y_2)$ determines which of $y_0$ or $y_1$ you should keep for the next iteration.
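The whole iteration can be sketched as a nested bisection: the inner solve brackets $x$ from $f_1$ at a fixed $y$, and the outer loop bisects $y$ on the sign of $f_2$. This is a sketch for this specific example, not a general-purpose solver:

```python
# Nested bisection for f(x, y) = (x^2 + y - 1, y^2 - 2x - 1)
# on the box [1.5, 2] x [-3, -1.5], following the steps above.
def f1(x, y):
    return x**2 + y - 1

def f2(x, y):
    return y**2 - 2 * x - 1

def bisect(g, a, b, tol=1e-12):
    """Standard bisection; assumes g(a) and g(b) have opposite signs."""
    ga = g(a)
    while b - a > tol:
        m = 0.5 * (a + b)
        gm = g(m)
        if ga * gm <= 0:
            b = m
        else:
            a, ga = m, gm
    return 0.5 * (a + b)

def solve(tol=1e-10):
    y_lo, y_hi = -3.0, -1.5          # outer bracket in y
    while y_hi - y_lo > tol:
        y = 0.5 * (y_lo + y_hi)
        # Inner solve: x(y) with f1(x(y), y) = 0, bracketed in [1.5, 2].
        x = bisect(lambda t: f1(t, y), 1.5, 2.0)
        # The sign of f2 at (x(y), y) tells which y-endpoint to keep:
        # f2(x(y_lo), y_lo) > 0 and f2(x(y_hi), y_hi) < 0 here.
        if f2(x, y) > 0:
            y_lo = y
        else:
            y_hi = y
    y = 0.5 * (y_lo + y_hi)
    return bisect(lambda t: f1(t, y), 1.5, 2.0), y

x, y = solve()  # the root is approximately (1.7693, -2.1304)
```

Throughout the iteration, the current point stays inside the original box, which is the certainty the question asks for.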


Such a process is tedious, especially as the number of dimensions increases, but it does show that multivariate bracketing methods are possible. I'm certain there are better approaches, but I have not seen much on the topic in the literature.