Consider the function ${\bf f}:\mathbb{R}^n\to\mathbb{R}^m$ defined as ${\bf f} = (f_1,f_2,\ldots,f_m)$, where each $f_i:\mathbb{R}^n\to\mathbb{R}$ is twice continuously differentiable and convex in ${\bf x}\in\mathbb{R}^n$.
I am looking for a numerical algorithm to solve $${\bf f}({\bf x}) = {\bf 0}$$
If there is a solution to the above equations, does any numerical method guarantee convergence to it?
EDIT: There is a result for the one-dimensional case: Newton's method is guaranteed to converge to the zero of an increasing, convex function. Does this result generalize to the above case?
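The one-dimensional result can be sketched as follows. This is a hypothetical illustration, not part of the question: `newton_1d` and the test function $f(x)=e^x-2$ (increasing, convex, with unique root $\ln 2$) are chosen here purely for demonstration.

```python
import math

def newton_1d(f, fprime, x0, tol=1e-12, max_iter=100):
    """Newton's method for a scalar equation f(x) = 0.

    For an increasing, convex f with a root, the classical result is
    that the iterates starting to the right of the root decrease
    monotonically to it.
    """
    x = x0
    for _ in range(max_iter):
        step = f(x) / fprime(x)
        x -= step
        if abs(step) < tol:
            break
    return x

# f(x) = e^x - 2 is increasing and convex; its unique root is ln 2.
root = newton_1d(lambda x: math.exp(x) - 2.0, math.exp, x0=2.0)
```

Starting from $x_0 = 2$ (to the right of the root), the iterates decrease monotonically to $\ln 2$, as the one-dimensional theory predicts.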
EDIT: We can additionally say that the $f_i$'s are quadratic, that is, they can be written in the form $f_i({\bf x}) = {\bf x}^{\top}{\bf A}_i{\bf x}+{\bf b}_i^{\top}{\bf x} + c_i$ where each ${\bf A}_i$ is positive semi-definite and ${\bf b}_i\in\mathbb{R}^n$, $c_i\in\mathbb{R}$.
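For the quadratic case, a natural local method is Gauss-Newton, since the Jacobian of each $f_i$ is available in closed form: row $i$ is $(({\bf A}_i + {\bf A}_i^{\top}){\bf x} + {\bf b}_i)^{\top}$. The sketch below is a hypothetical illustration (the example data and root are made up); as discussed in the question, it carries no global convergence guarantee.

```python
import numpy as np

def gauss_newton_quadratic(As, bs, cs, x0, tol=1e-12, max_iter=50):
    """Gauss-Newton for the quadratic system
    f_i(x) = x^T A_i x + b_i^T x + c_i = 0, i = 1..m.

    Each step solves the linearization J dx = -f in the least-squares
    sense, so it also covers m != n. Convergence is only local in
    general; no global guarantee is claimed.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        f = np.array([x @ A @ x + b @ x + c
                      for A, b, c in zip(As, bs, cs)])
        # Row i of the Jacobian: (A_i + A_i^T) x + b_i.
        J = np.array([(A + A.T) @ x + b for A, b in zip(As, bs)])
        dx, *_ = np.linalg.lstsq(J, -f, rcond=None)
        x = x + dx
        if np.linalg.norm(dx) < tol:
            break
    return x

# Hypothetical 2x2 example with a known root at (1, -1):
#   f1 = x1^2 + x2^2 - 2,   f2 = x1^2 + x2.
As = [np.eye(2), np.diag([1.0, 0.0])]
bs = [np.zeros(2), np.array([0.0, 1.0])]
cs = [-2.0, 0.0]
x = gauss_newton_quadratic(As, bs, cs, x0=[2.0, 0.0])
```

From the starting point $(2,0)$ the iteration settles onto the root $(1,-1)$; from other starting points it may converge to the other root $(-1,-1)$ or stall, which is exactly the gap the question is asking about.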
Given your quadratic edit, here's an approach that is used with other non-convex quadratic problems. Let $X$ be a new variable, a symmetric $n\times n$ matrix. Then your problem is equivalent to this system of equations: $$ \langle A_i,X\rangle+b_i^Tx+c_i=0, ~ i=1,2,\dots,m; \qquad X = xx^T $$ These equations describe a non-convex set; but if we relax that last equation to a semidefinite inequality, we get a convex set: $$ \langle A_i,X\rangle+b_i^Tx+c_i=0, ~ i=1,2,\dots,m; \qquad X \succeq xx^T $$ where $X\succeq xx^T$ means that $X-xx^T$ is positive semidefinite. To handle this constraint, we first note the Schur-complement equivalence $$X \succeq xx^T \quad\Longleftrightarrow\quad \begin{bmatrix} X & x \\ x^T & 1 \end{bmatrix} \succeq 0$$ and consider this convex optimization problem (in fact a semidefinite program): $$\begin{array}{ll} \text{minimize} & \mathop{\textrm{Tr}}(X) \\ \text{subject to} & \langle A_i,X\rangle + b_i^Tx + c_i = 0, \quad i=1,2,\dots,m \\ & \begin{bmatrix} X & x \\ x^T & 1 \end{bmatrix} \succeq 0 \end{array}$$ If, when you solve this, you find that $X=xx^T$, then the $x$ obtained solves the original system exactly. Otherwise, the value of $x$ obtained could be considered a candidate starting point for a local search.
Because this is a relaxation, we know that solutions to the original problem are feasible points for this model. So, for instance, if $f(\bar{x})=0$ for some $\bar{x}$, then $(X,x)=(\bar{x}\bar{x}^T,\bar{x})$ satisfies the constraints of this convex model. This doesn't guarantee that solving the convex model will produce $\bar{x}$, of course. But I think there is reason to be optimistic that this approach will work well.
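The feasibility claim above can be checked numerically without an SDP solver. This is a hypothetical illustration: random PSD $A_i$ and $b_i$ are generated, and $c_i$ is chosen so that a fixed $\bar{x}$ satisfies $f(\bar{x})=0$; we then verify that the lifted point $(X,x)=(\bar{x}\bar{x}^T,\bar{x})$ satisfies both the linear constraints and the semidefinite block constraint.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 3

# Hypothetical data: random PSD A_i and random b_i, with c_i chosen
# so that a fixed point xbar solves f(xbar) = 0 exactly.
xbar = rng.standard_normal(n)
As, bs, cs = [], [], []
for _ in range(2):
    M = rng.standard_normal((n, n))
    A = M @ M.T                      # positive semidefinite
    b = rng.standard_normal(n)
    c = -(xbar @ A @ xbar + b @ xbar)
    As.append(A); bs.append(b); cs.append(c)

# The lifted point (X, x) = (xbar xbar^T, xbar) from the answer:
X = np.outer(xbar, xbar)

# Linear constraints <A_i, X> + b_i^T x + c_i = 0 hold exactly,
# since x^T A x = Tr(A xx^T) ...
residuals = [np.trace(A @ X) + b @ xbar + c
             for A, b, c in zip(As, bs, cs)]

# ... and the Schur-complement block [[X, x], [x^T, 1]] equals
# [x; 1][x; 1]^T, a rank-one PSD matrix.
block = np.block([[X, xbar[:, None]],
                  [xbar[None, :], np.ones((1, 1))]])
min_eig = np.linalg.eigvalsh(block).min()
```

The constraint residuals vanish to machine precision and the smallest eigenvalue of the block is zero (up to rounding), confirming that every exact root of the system survives the relaxation.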