I understand how Newton's method comes naturally from the linear Taylor approximation of a function.
For $f: \mathbb{R}\to \mathbb{R}$ we have $$ x_{new}=x+\Delta x \\ f(x+\Delta x)=f(x)+f'(x)\Delta x=0 \\ f'(x)\Delta x=-f(x) \\ \Delta x=-\frac{f(x)}{f'(x)} \\ x_{new}=x-\frac{f(x)}{f'(x)} $$
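The one-dimensional iteration above can be sketched as follows; the choice of $f(x)=x^2-2$ (whose positive root is $\sqrt{2}$), the tolerance, and the iteration cap are my own illustrative assumptions.

```python
# A minimal sketch of Newton's method in 1D: x_new = x - f(x)/f'(x).
# f(x) = x^2 - 2 is an example choice; its positive root is sqrt(2).
def newton_1d(f, df, x, tol=1e-12, max_iter=50):
    for _ in range(max_iter):
        step = f(x) / df(x)   # Δx = -f(x)/f'(x), applied with the sign below
        x = x - step
        if abs(step) < tol:   # stop once the update is negligible
            break
    return x

root = newton_1d(lambda x: x**2 - 2, lambda x: 2 * x, x=1.0)
```

Starting from $x=1$, the iterates converge quadratically to $\sqrt{2}\approx 1.41421356$.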
For vector field $\textbf{F}: \mathbb{R^{n}}\to \mathbb{R^{n}} $ we have $$ \textbf{x}_{new}=\textbf{x}+\Delta \textbf{x} \\ \textbf{F}(\textbf{x}+\Delta \textbf{x})=\textbf{F}(\textbf{x})+\textbf{J}(\textbf{x})\Delta\textbf{x}=0 \\ \textbf{J}(\textbf{x})\Delta\textbf{x}=-\textbf{F}(\textbf{x}) \\ \Delta\textbf{x}=-\textbf{J}(\textbf{x})^{-1}\textbf{F}(\textbf{x}) \\ \textbf{x}_{new}=\textbf{x}-\textbf{J}(\textbf{x})^{-1}\textbf{F}(\textbf{x}) $$
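The vector-field update is usually implemented by solving the linear system $\textbf{J}(\textbf{x})\Delta\textbf{x}=-\textbf{F}(\textbf{x})$ rather than forming $\textbf{J}(\textbf{x})^{-1}$ explicitly. A sketch, using an example system of my own choosing ($x^2+y^2=1$, $x=y$, with positive root $(1/\sqrt{2},\,1/\sqrt{2})$):

```python
import numpy as np

# A minimal sketch of the multivariate Newton step: solve J(x) dx = -F(x).
def newton_nd(F, J, x, tol=1e-12, max_iter=50):
    for _ in range(max_iter):
        dx = np.linalg.solve(J(x), -F(x))  # avoids explicit matrix inversion
        x = x + dx
        if np.linalg.norm(dx) < tol:
            break
    return x

# Example system (an assumption): F(x, y) = (x^2 + y^2 - 1, x - y).
F = lambda v: np.array([v[0]**2 + v[1]**2 - 1, v[0] - v[1]])
J = lambda v: np.array([[2 * v[0], 2 * v[1]],
                        [1.0,      -1.0]])

sol = newton_nd(F, J, np.array([1.0, 0.5]))
```

From the starting point $(1, 0.5)$ the iterates converge to $(1/\sqrt{2},\,1/\sqrt{2})$.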
I'm working with a scalar field, but I can't figure out Newton's method for it.
For scalar field $f: \mathbb{R^{n}}\to \mathbb{R}$ $$ \textbf{x}_{new}=\textbf{x}+\Delta \textbf{x} \\ f(\textbf{x}+\Delta \textbf{x})=f(\textbf{x})+\nabla f(\textbf{x})\cdot \Delta\textbf{x}=0 \\ \nabla f(\textbf{x})\cdot \Delta\textbf{x}=-f(\textbf{x}) \\ ? $$
A dot product (and, more generally, a non-square matrix) doesn't have an inverse. Am I right that we don't have a Newton's method for scalar fields because deriving it through the Taylor approximation leads to the underdetermined equation $\nabla f(\textbf{x})\cdot \Delta\textbf{x}=-f(\textbf{x})$, which has infinitely many solutions? Can we still somehow use Newton's method, or something like it, for this kind of problem?
As you correctly identified, when you have a scalar field the system $$ f(x) + \nabla f(x) \cdot \Delta x = 0 $$ has an infinite number of solutions. One approach is to choose the "nearest" solution, in the sense that $$ x_{\text{new}} = \arg\min_{y} \left\{\|y - x\|^2 \mid f(x) + \nabla f(x) \cdot (y - x) = 0 \right\}. $$ The program above yields a closed-form update, given by $$ x_{\text{new}} = x - \frac{f(x)}{\|\nabla f(x)\|^2} \cdot \nabla f(x). \qquad (\dagger) $$ The update $(\dagger)$ is sometimes known in the literature as the Polyak step size.
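The update $(\dagger)$ can be sketched numerically as follows; the example field $f(x,y)=x^2+y^2-1$ (whose zero set is the unit circle) and the starting point are my own assumptions.

```python
import numpy as np

# A minimal sketch of the minimum-norm update (†):
#   x_new = x - f(x) / ||∇f(x)||^2 * ∇f(x).
def scalar_newton_step(f, grad, x):
    g = grad(x)
    return x - (f(x) / np.dot(g, g)) * g

# Example scalar field (an assumption): f(x, y) = x^2 + y^2 - 1,
# whose zero set is the unit circle.
f = lambda v: v[0]**2 + v[1]**2 - 1
grad = lambda v: np.array([2 * v[0], 2 * v[1]])

x = np.array([2.0, 1.0])
for _ in range(30):
    x = scalar_newton_step(f, grad, x)
```

Each step moves along $\nabla f$, i.e. perpendicular to the level sets, so the iterate approaches the zero level set $f(\textbf{x})=0$ by the shortest linearized path; here it lands on the unit circle along the ray through the starting point.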