Inverse Jacobian Matrix


I have a question about using the Jacobian matrix with the Newton-Raphson method.

The regular Newton-Raphson method is initialized with a starting point $x_0$ and then iterated

$x_{n+1}=x_n-\dfrac{f(x_n)}{f'(x_n)}$.

If the derivative of a multivariate system is the Jacobian matrix $J(x,y,z)$, given by

$J(x,y,z) = \begin{bmatrix} \dfrac{\partial f_1}{\partial x} & \dfrac{\partial f_1}{\partial y} & \dfrac{\partial f_1}{\partial z}\\ \dfrac{\partial f_2}{\partial x} & \dfrac{\partial f_2}{\partial y} & \dfrac{\partial f_2}{\partial z} \\ \dfrac{\partial f_3}{\partial x} & \dfrac{\partial f_3}{\partial y} & \dfrac{\partial f_3}{\partial z}\end{bmatrix}$

then the Newton method becomes

$x_{n+1}=x_n - J(x_n)^{-1}F(x_n)$,

where $x_n$ now denotes the vector $(x, y, z)$ at step $n$. What I'm confused about is how to get from the first version of the Newton method to the second. Is $J(x)^{-1}$ the same as $1/J(x)$, or am I not understanding something? My knowledge of matrices is extremely limited, but I need to be able to explain and understand this for a report I'm doing.
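For concreteness, the scalar iteration above can be sketched as follows (the choice $f(x) = x^2 - 2$ is illustrative, not from the question):

```python
# Scalar Newton-Raphson: x_{n+1} = x_n - f(x_n)/f'(x_n).
# Example function f(x) = x^2 - 2 is a made-up illustration.
def newton_1d(f, fprime, x0, tol=1e-12, max_iter=50):
    x = x0
    for _ in range(max_iter):
        step = f(x) / fprime(x)  # division works: f'(x) is a single number
        x -= step
        if abs(step) < tol:
            break
    return x

root = newton_1d(lambda x: x * x - 2.0, lambda x: 2.0 * x, 1.0)  # ≈ sqrt(2)
```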


There is no division of vectors by matrices. You use the linearization $$ F(x+s)=F(x)+J(x)·s+O(\|s\|^2), $$ drop the $O(\|s\|^2)$ term, and solve the resulting linear system $$ 0=F(x)+J(x)·s $$ for the step $s$ to find an approximate solution of the non-linear problem. After that you try again with $x_+=x+s$ to get a (hopefully) better approximation. Since $s=-J(x)^{-1}F(x)$, this gives $x_+=x-J(x)^{-1}F(x)$, which is exactly the second formula; in practice one solves the linear system rather than computing the inverse.
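A minimal sketch of one such iteration, using a made-up example system (the functions below are illustrative assumptions, not from the question) and `numpy.linalg.solve` to solve $J(x)·s = -F(x)$ without ever forming the inverse:

```python
import numpy as np

# Hypothetical example system F(x, y, z) = 0: a sphere and two planes.
def F(v):
    x, y, z = v
    return np.array([
        x**2 + y**2 + z**2 - 3.0,  # sphere of radius sqrt(3)
        x + y - z - 1.0,           # plane
        x - y,                     # plane
    ])

# Jacobian of F: row i holds the partial derivatives of F_i.
def J(v):
    x, y, z = v
    return np.array([
        [2.0 * x, 2.0 * y, 2.0 * z],
        [1.0, 1.0, -1.0],
        [1.0, -1.0, 0.0],
    ])

def newton(F, J, v0, tol=1e-12, max_iter=50):
    v = np.asarray(v0, dtype=float)
    for _ in range(max_iter):
        # Solve the linear system J(v) s = -F(v) for the step s,
        # instead of computing J(v)^{-1} explicitly.
        s = np.linalg.solve(J(v), -F(v))
        v = v + s
        if np.linalg.norm(s) < tol:
            break
    return v

root = newton(F, J, [2.0, 2.0, 2.0])  # converges to (1, 1, 1)
```

Solving the system directly is both cheaper and numerically more stable than inverting $J$, which is why the inverse in $x_+ = x - J(x)^{-1}F(x)$ is notation rather than a computational recipe.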

In that sense, the one-dimensional case is the exception, in that $1\times 1$ matrices can be treated as numbers.