I'm trying to numerically find a global minimizer of a multivariate polynomial (4 variables) of high degree. The numerical values of the coefficients come from noisy measurements and contain random errors. Is there a way to understand how this affects the solution?
If I use the true coefficients, I get the exact solution. However, with coefficients containing (correlated) errors, the computed minimizer is not close to the true one.
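One way to probe this empirically is a Monte Carlo perturbation study: re-solve the minimization many times with coefficients resampled at the measured noise level, and look at the spread of the computed minimizers. A minimal univariate sketch of the idea (the toy polynomial, the noise level, and the independence of the noise are all assumptions for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for the real problem: p(x) = (x - 1)^4 in power-basis form,
# so the true global minimizer is x = 1 (a flat minimum).
true_coeffs = np.array([1.0, -4.0, 6.0, -4.0, 1.0])  # highest degree first

def minimizer(coeffs):
    """Global minimizer of a univariate polynomial via its critical points."""
    crit = np.roots(np.polyder(coeffs)).real  # real parts suffice for a sketch
    return crit[np.argmin(np.polyval(coeffs, crit))]

# Re-solve with coefficients perturbed at the (assumed) measurement noise
# level and look at the spread of the computed minimizers.
noise_level = 1e-6
samples = [minimizer(true_coeffs + noise_level * rng.standard_normal(5))
           for _ in range(200)]
spread = np.std(samples)
# Near a flat minimum the spread is far larger than the coefficient noise.
print(spread)
```

Note how the spread of the minimizer is orders of magnitude larger than the coefficient noise here: near a degenerate (flat) minimum the sensitivity scales like a fractional power of the perturbation.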
Generally, representing polynomials in "power basis" form is very bad for numerical stability. It's better to use the Bernstein, Chebyshev, or Lagrange form. The celebrated example of Wilkinson's polynomial shows how badly things can go wrong, even in the univariate case.
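You can reproduce Wilkinson's example in a few lines of NumPy; his classic perturbation changes one power-basis coefficient by a relative amount of roughly 5e-10, yet several roots fly off into the complex plane:

```python
import numpy as np

# Wilkinson's polynomial W(x) = (x - 1)(x - 2)...(x - 20), in power-basis form.
coeffs = np.poly(np.arange(1, 21))   # coefficients, highest degree first

# Wilkinson's classic perturbation: change the x^19 coefficient (-210)
# by 2**-23, a relative change of roughly 5e-10.
perturbed = coeffs.copy()
perturbed[1] -= 2.0 ** -23

roots = np.roots(perturbed)
# Several of the perturbed "roots" acquire large imaginary parts, even
# though the true roots 1, 2, ..., 20 are real and well separated.
print(np.abs(roots.imag).max())
```

This is exactly the situation in the question: tiny errors in power-basis coefficients, enormous errors in the computed solution.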
For more information about this, you could start with Farouki's work on the stability of the Bernstein basis. For example, this paper, or this one.
I don't know whether this has been done for multivariate polynomials, but I'd guess it can be, since the same reasoning applies. The basic idea is that stability is achieved by using coefficients that have some geometric meaning (as opposed to power-basis coefficients, which are really just glorified derivatives), and that reasoning holds in any dimension.
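As a concrete univariate illustration of that point, you can compare how sensitive evaluation is to relative coefficient errors in the two bases. The sensitivity bound used below (a relative error eps in each coefficient moves p(x) by at most eps times the sum of |coefficient| times |basis function|) and the Wilkinson-style test polynomial are my choices for illustration:

```python
import numpy as np
from math import comb

n = 20
roots = np.arange(1, n + 1) / (n + 1)    # Wilkinson-style roots, all in (0, 1)
a = np.poly(roots)[::-1]                 # power coefficients, lowest degree first

# Bernstein coefficients on [0, 1]:  b_k = sum_{i<=k} C(k,i)/C(n,i) * a_i.
b = np.array([sum(comb(k, i) / comb(n, i) * a[i] for i in range(k + 1))
              for k in range(n + 1)])

# A relative error eps in every coefficient moves p(x) by at most
# eps * sum_i |c_i| * |basis_i(x)|.  Compare that sensitivity at x = 1/2;
# on [0, 1] the Bernstein bound is provably never larger than the power one,
# and for this polynomial it is smaller by orders of magnitude.
x = 0.5
power_sens = sum(abs(a[i]) * x ** i for i in range(n + 1))
bern_sens = sum(abs(b[k]) * comb(n, k) * x ** k * (1 - x) ** (n - k)
                for k in range(n + 1))
print(power_sens, bern_sens)
```

The "provably never larger" part is Farouki and Rajan's result: every monomial has nonnegative Bernstein coefficients, so the Bernstein condition number for evaluation on [0, 1] is bounded by the power-basis one.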
Some other work that might be relevant is the discussion of the stability of high-degree polynomial computations in the Chebfun system. Maybe take a look at this paper. I'm not 100% sure it's relevant to your problem, but it might be. The Chebfun people point to some rigorous error/stability analysis done by Higham.
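Related in spirit: since your coefficients come from noisy measurements, the choice of basis also matters for the fitting step itself. A quick way to see this (my own illustration, not taken from the Chebfun literature) is to compare the conditioning of the least-squares design matrix in the power basis versus the Chebyshev basis:

```python
import numpy as np

deg = 20
xs = np.linspace(-1, 1, 50)   # sample points on the standard interval

# Design matrices for degree-20 least-squares fitting at the same points:
V_power = np.vander(xs, deg + 1)                       # power (monomial) basis
V_cheb = np.polynomial.chebyshev.chebvander(xs, deg)   # Chebyshev basis

# The monomial matrix is severely ill-conditioned, so measurement noise is
# hugely amplified in the fitted coefficients; the Chebyshev one is not.
print(np.linalg.cond(V_power), np.linalg.cond(V_cheb))
```

So even before you get to the minimization, fitting the measured data in a well-conditioned basis keeps the coefficient errors from being amplified in the first place.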