Suppose we have an optimization problem of the general form, for $f: \mathbb{R}^N \rightarrow \mathbb{R}$,
$$\min_{x \in \mathbb{R}^N} f(x)$$
and this problem is solvable. How could I construct a suitable zero equation to solve the optimization problem?
I think if $x \in \mathbb{R}$, then we can construct the zero equation as $f'(x) = 0$. With the zero equation in hand, the solution can be found by iterating Newton's method. However, I am not sure how to proceed when $x \in \mathbb{R}^N$.
Any ideas would be appreciated!
If $f$ is differentiable, any minimiser will satisfy the equation $\nabla f = 0$, where $\nabla f = (f'_{x_1}, \ldots, f'_{x_N})$. Newton's method looks quite similar in this setting: to numerically solve an equation $g(x) = 0$, where $g:\mathbb{R}^N\to \mathbb{R}^N$, you take an initial approximation $x^{(0)}$ and proceed using the iteration $$ x^{(k+1)}= x^{(k)} - [J_g(x^{(k)})]^{-1} g(x^{(k)}), $$
where $J_g$ is the Jacobian matrix of $g$. In your case $g = \nabla f$, so $J_g$ is the Hessian matrix $\nabla^2 f$. Please note that convergence of Newton's method is by no means guaranteed, and even if it converges, a zero of $\nabla f$ need not be the global minimiser: it can be a local minimum, a local or global maximum, or a saddle point.
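As a concrete sketch, here is the iteration applied to $g = \nabla f$ for the Rosenbrock function $f(x, y) = (1 - x)^2 + 100\,(y - x^2)^2$ (a standard test problem chosen purely for illustration; its unique minimiser is $(1, 1)$), with the gradient and Hessian written out by hand:

```python
import numpy as np

def grad(v):
    # g = grad f for the Rosenbrock function
    x, y = v
    return np.array([-2 * (1 - x) - 400 * x * (y - x**2),
                     200 * (y - x**2)])

def hessian(v):
    # The Jacobian of g = grad f is the Hessian of f.
    x, y = v
    return np.array([[2 + 1200 * x**2 - 400 * y, -400 * x],
                     [-400 * x, 200.0]])

def newton(v0, tol=1e-10, max_iter=50):
    v = np.asarray(v0, dtype=float)
    for _ in range(max_iter):
        g = grad(v)
        if np.linalg.norm(g) < tol:
            break
        # Solve J_g(v) s = g(v) rather than forming the inverse explicitly.
        v = v - np.linalg.solve(hessian(v), g)
    return v

print(newton([1.2, 1.2]))  # close to the minimiser (1, 1)
```

Note that the linear system is solved at each step instead of inverting $J_g$, which is both cheaper and numerically more stable; with a starting point this far from a critical point the iteration may still fail or land on a non-minimising critical point, as cautioned above.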