I am looking for a proof of Newton's method on the internet for approximating the root of a system, that is, Newton's method for a vector-valued function of the form $F(X):=(f_{1}(x_{1},\ldots,x_{n}),\ldots,f_{n}(x_{1},\ldots,x_{n}))$.
In the link below (p. 9 of the PDF)
https://www.lakeheadu.ca/sites/default/files/uploads/77/docs/RemaniFinal.pdf
there is a proof of the quadratic convergence of the method, but it says that "H is the Hessian tensor, which is similar to the Hessian matrix". I do not know what this means...
Thank you for your time!
The second derivative has three indices $$ \frac{∂^2f_i}{∂x_j∂x_k} $$ which makes it, as an object acting linearly on vectors, more precisely bilinearly on pairs of vectors, a third-order tensor. Coordinate-free, one can write it as a vector-valued function $$ F''(x)[u,v] =\left(\sum_j\sum_k\frac{∂^2f_i}{∂x_j∂x_k}·u_j·v_k\right)_i $$ which, for fixed $x$, is linear in $u$ and in $v$ separately.
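To make the "third-order tensor acting bilinearly" concrete, here is a small NumPy sketch (the map $F$ below is my own toy example, not from the linked PDF): the Hessian tensor is stored as an array `H[i, j, k]` and contracted against two vectors with `einsum`.

```python
import numpy as np

# Toy example (assumed for illustration): F: R^2 -> R^2 with
#   f_0(x) = x0^2 + x0*x1,   f_1(x) = x1^3
def hessian_tensor(x):
    """Third-order tensor H[i, j, k] = d^2 f_i / (dx_j dx_k)."""
    H = np.zeros((2, 2, 2))
    # f_0 = x0^2 + x0*x1: constant second derivatives
    H[0] = [[2.0, 1.0],
            [1.0, 0.0]]
    # f_1 = x1^3: only d^2/dx1^2 = 6*x1 is nonzero
    H[1, 1, 1] = 6.0 * x[1]
    return H

x = np.array([1.0, 2.0])
u = np.array([1.0, 0.5])
v = np.array([-1.0, 2.0])

# F''(x)[u, v]_i = sum_{j,k} H[i, j, k] * u_j * v_k
Fpp_uv = np.einsum('ijk,j,k->i', hessian_tensor(x), u, v)
print(Fpp_uv)  # a vector in R^2
```

Bilinearity is then just the statement that scaling `u` or `v` scales the result: `einsum('ijk,j,k->i', H, 2*u, v)` equals `2 * Fpp_uv`.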
For Newton's method one uses the linear Taylor polynomial. Of the remainder-term formulas, only the integral form carries over to the multivariate, vector-valued case without problems: $$ F(x+v)=F(x)+\int_0^1F'(x+sv)[v]\,ds\\=F(x)+F'(x)[v]+\int_0^1(1-s)F''(x+sv)[v,v]\,ds $$
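The quadratic convergence this expansion yields is easy to observe numerically. A minimal sketch of the Newton iteration for a system, using a hypothetical test problem of my own choosing with root $(1,1)$: each step solves the linearized system $F'(x)\,v=-F(x)$ and updates $x \leftarrow x+v$.

```python
import numpy as np

# Assumed test system F(x) = 0 with root (1, 1):
#   f_0 = x0^2 + x1^2 - 2,   f_1 = x0 - x1
def F(x):
    return np.array([x[0]**2 + x[1]**2 - 2.0, x[0] - x[1]])

def J(x):
    """Jacobian F'(x)."""
    return np.array([[2.0 * x[0], 2.0 * x[1]],
                     [1.0, -1.0]])

x = np.array([2.0, 0.5])
root = np.array([1.0, 1.0])
errors = []
for _ in range(6):
    # Newton step: solve the linear Taylor model F(x) + F'(x) v = 0
    v = np.linalg.solve(J(x), -F(x))
    x = x + v
    errors.append(np.linalg.norm(x - root))
print(errors)
```

Printing `errors` shows the error roughly squaring from one iteration to the next, which is exactly what the $O(\|v\|^2)$ integral remainder above predicts near a root with nonsingular Jacobian.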