It is well known that a system of linear equations in $n$ variables needs $n$ equations to pin down the solution. I wonder whether the same can be said for a system of polynomial equations in $n$ variables. More specifically, I assume that this system has one and only one real solution. It seems like the answer is yes, but what would be a rigorous way to prove it, or at least to reason about it?
In addition, the requirement above for linear systems also assumes that the $n$ equations are linearly independent. What would be the equivalent condition for a system of polynomial equations?
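To make the second question concrete (this is my own illustration, assuming SymPy is available): even two polynomial equations in two unknowns that are not polynomial consequences of one another can have more than one real solution, so "independence" alone cannot be the right condition for uniqueness.

```python
# Two "independent" polynomial equations in two unknowns, yet the real
# solution is not unique: a circle intersected with a line.
import sympy as sp

x, y = sp.symbols('x y', real=True)

# x^2 + y^2 = 1 and x = y; neither equation is implied by the other.
solutions = sp.solve([x**2 + y**2 - 1, x - y], [x, y], dict=True)
print(solutions)  # two real solutions: (±1/√2, ±1/√2)
```

So whatever replaces "linear independence" in the polynomial setting has to rule out multiple intersection points, not just redundant equations.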
For context, I am trying to analyze the complexity of a learning algorithm that boils down to solving a system of polynomial equations. I would like to know how many (and what kind of) equations the algorithm needs to construct in order to find the (unique) solution, as this helps quantify the complexity. I do not need to actually solve the system.
As suggested by @Ethan, here is a concrete example: suppose we do parameter estimation using the method of moments, $\mathbb E[X^i]=f_i(\theta)$, $\theta\in \Theta \subseteq \mathbb R^n$, $i\in\mathbb N^+$, where every moment $f_i$ is a polynomial function of $\theta$. Given the ground-truth values of all the moments, the goal is to solve for $\theta$ (though how to solve is not within the scope of this discussion). Assuming the model is identifiable (i.e. the system has a unique solution), how many such "moment equations" do we need in order to solve the system? Are the first $n$ moments enough?
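A toy instance of these moment equations (my own made-up example, assuming SymPy): $X \sim \mathcal N(\mu, v)$ with $\theta=(\mu, v)$ and $n=2$, where $f_1(\theta)=\mu$ and $f_2(\theta)=\mu^2+v$. Here the first two moments suffice, but a one-parameter contrast shows the first $n$ moments need not suffice in general.

```python
# Method-of-moments toy example: X ~ N(mu, v), ground-truth moments
# E[X] = 1 and E[X^2] = 5, so the true parameters are mu = 1, v = 4.
import sympy as sp

mu = sp.Symbol('mu', real=True)
v = sp.Symbol('v', positive=True)  # variance constraint helps uniqueness

# f_1(theta) = mu,  f_2(theta) = mu^2 + v
sols = sp.solve([mu - 1, mu**2 + v - 5], [mu, v], dict=True)
print(sols)  # unique solution: [{mu: 1, v: 4}]

# In contrast (hypothetical model), if f_1(theta) = theta^2 for a scalar
# theta, the first n = 1 moment equations are NOT enough: theta^2 = 4 has
# two real roots, so identifiability fails with only n equations.
t = sp.Symbol('t', real=True)
amb = sp.solve([t**2 - 4], [t], dict=True)
print(len(amb))  # 2
```

This suggests the answer depends on the specific polynomials $f_i$ (and the constraint set $\Theta$), not merely on having $n$ of them, which is exactly what I would like a rigorous statement of.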