I am interested in knowing the best known worst-case complexity of finding a solution to a multilinear system of equations (with some additional properties). More specifically, suppose I have $n$ equations of the form
$$f_i = b_i$$
for $i$ ranging from $1$ to $n$, where each $f_i$ is a multilinear homogeneous degree-$d$ function of the input variables $x_1,\dots,x_n \in \mathbb{R}$ (or $\mathbb{C}$), with $2 < d \ll n$, and each $b_i$ is a given scalar. Suppose also that each $f_i$ has $n$ monomial terms. Two natural questions one could ask:

1. What is the complexity of finding some $x_1,\dots,x_n$ such that $f_1 = b_1, \dots, f_n = b_n$, assuming the $b_i$'s and all the coefficients of the $f_i$'s are rational (and bounded)?
2. What is the complexity of finding some $x_1,\dots,x_n$ such that $|f_i - b_i| < \epsilon$ for all $1 \le i \le n$, given an $\epsilon > 0$, when the $b_i$'s and the coefficients may be arbitrary (bounded) real (or complex) numbers?
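To make the setup concrete, here is a small, purely illustrative sketch (Python/NumPy; all names are my own) that builds a random instance of such a system: each $f_i$ sums $n$ monomials, and each monomial is a bounded rational coefficient times a product of $d$ distinct variables, which makes $f_i$ multilinear and homogeneous of degree $d$.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 6, 3  # toy sizes; the question assumes 2 < d << n

# Each f_i is a sum of n monomials; each monomial is a (bounded, rational)
# coefficient times a product of d *distinct* variables, so f_i is
# multilinear and homogeneous of degree d.
supports = [[rng.choice(n, size=d, replace=False) for _ in range(n)] for _ in range(n)]
coeffs = rng.integers(-3, 4, size=(n, n)).astype(float)

def f(i, x):
    """Evaluate f_i at the point x."""
    return sum(coeffs[i, j] * np.prod(x[supports[i][j]]) for j in range(n))

x = rng.standard_normal(n)
vals = np.array([f(i, x) for i in range(n)])
scaled = np.array([f(i, 2.0 * x) for i in range(n)])
# Homogeneity of degree d: f_i(t x) = t^d f_i(x).
assert np.allclose(scaled, (2.0 ** d) * vals)
```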
I have found it surprisingly difficult to answer this question. There is a significant body of work addressing the complexity of finding/characterizing *all* the solutions to a system of polynomial equations with rational coefficients, which seems to require at least exponential time (proportional to $d^n$) in general: even when the number of roots is finite, it may be exponential. However, I was not able to determine the complexity of finding a *single* solution, i.e. a single assignment of $x_1,\dots,x_n$ satisfying (or approximately satisfying) the equations. I would expect finding a single approximate root to be "easier" than finding all roots. My question, specifically, is: is either (1) or (2) solvable in time polynomial in $n$ and $2^d$ (and $1/\epsilon$, in the case of (2))? If so, how?
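For question (2), one natural baseline, which I should stress carries no worst-case guarantee, is local search: minimize the squared residual $\sum_i (f_i(x) - b_i)^2$ by descent. The sketch below (my own construction, not a known algorithm for this problem) plants a solution so the instance is consistent, warm-starts nearby, and checks that descent reduces the residual; the open question is precisely whether anything like this can be made to work in the worst case.

```python
import numpy as np

rng = np.random.default_rng(1)
n, d = 6, 3  # toy instance; the question has 2 < d << n

# Random multilinear homogeneous system: each f_i sums n monomials,
# each monomial a coefficient times a product of d distinct variables.
supports = [[rng.choice(n, size=d, replace=False) for _ in range(n)] for _ in range(n)]
coeffs = rng.integers(-3, 4, size=(n, n)).astype(float)

def f_vals(x):
    """Vector (f_1(x), ..., f_n(x))."""
    return np.array([sum(coeffs[i, j] * np.prod(x[supports[i][j]]) for j in range(n))
                     for i in range(n)])

# Plant a solution so the system is consistent: b_i := f_i(x_star).
x_star = rng.standard_normal(n)
b = f_vals(x_star)

def residual(x):
    """Least-squares objective 0.5 * sum_i (f_i(x) - b_i)^2."""
    r = f_vals(x) - b
    return 0.5 * (r @ r)

def grad(x, h=1e-6):
    """Central finite-difference gradient (exact gradients are also easy here)."""
    g = np.empty(n)
    for m in range(n):
        e = np.zeros(n)
        e[m] = h
        g[m] = (residual(x + e) - residual(x - e)) / (2 * h)
    return g

x = x_star + 0.1 * rng.standard_normal(n)  # warm start near the planted solution
r0 = residual(x)
for _ in range(200):
    g = grad(x)
    step = 1.0
    # Backtracking line search: halve the step until the objective decreases.
    while residual(x - step * g) >= residual(x) and step > 1e-12:
        step /= 2
    x -= step * g

print(residual(x) < r0)  # local descent reduced the residual on this instance
```

With a warm start this behaves well, but from an arbitrary starting point the objective is nonconvex and descent can stall at a local minimum, which is why it says nothing about the worst-case complexity asked about above.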