why does polynomial fit work much better for only positive or only negative values of the dependent variable?


I am trying to fit a force field by using the approximation $a(z) = a_1 z + a_2 z^2 + a_3 z^3 + \dots$ and finding the coefficients $a_1$, $a_2$, $a_3$, etc.

When I use numpy's polyfit for $z$ values ranging from $-z_\text{lim}$ to $+z_\text{lim}$, I get a fit which is poor at $z=0$ and also deviates quickly from the exact $a(z)$: see plot here

However, the fit drastically improves if I split the range of $z$ values into positive and negative parts, i.e., I ask polyfit to fit the exact $a(z)$ for $z<0$ only, and then separately do a fit for $z>0$. Then I hstack the two fits and evaluate them using polyval: see improved plot here

Also, comparing the two fits for positive and negative $z$, the coefficients are either identical or equal and opposite, alternating down the list. My main question is why this happens, and I have a feeling it should be possible to interpret this in terms of a Taylor expansion: is that possible?

Here is the output for the two different sets of coefficients:

coefficients for $z>0$:
[-5.99102074e-01, 3.70894968e+00, -8.87243768e+00, 1.03878023e+01, -6.58982157e+00, 3.33239998e-03]
coefficients for $z<0$:
[-5.99102074e-01, -3.70894968e+00, -8.87243768e+00, -1.03878023e+01, -6.58982157e+00, -3.33239998e-03]
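For illustration, this sign pattern can be reproduced with any odd function. Below is a minimal sketch using a made-up odd function (the function and degree are assumptions, not the actual data): for an odd $a(z)$, the best fit on $z<0$ is $-p(-z)$ where $p$ is the best fit on $z>0$, so coefficients of even powers flip sign while coefficients of odd powers are unchanged.

```python
import numpy as np

# Hypothetical odd "force field": a(-z) = -a(z), growing like |z|**0.5.
# This is a stand-in for the asker's numerical function, not their data.
def a_exact(z):
    return np.sign(z) * np.abs(z) ** 0.5

z_pos = np.linspace(0.01, 1.0, 200)
z_neg = -z_pos[::-1]  # exact mirror of the positive grid

deg = 5
c_pos = np.polyfit(z_pos, a_exact(z_pos), deg)  # highest power first
c_neg = np.polyfit(z_neg, a_exact(z_neg), deg)

# If p(z) is the least-squares fit on z>0, then q(z) = -p(-z) has the same
# residuals on the mirrored grid, so it is the unique fit on z<0.
# Its coefficients: odd powers of z keep their sign, even powers flip.
powers = np.arange(deg, -1, -1)  # power of z for each polyfit coefficient
expected_c_neg = np.where(powers % 2 == 1, c_pos, -c_pos)
print(np.allclose(c_neg, expected_c_neg))
```

This matches the listed output: the degree-5, degree-3, and degree-1 coefficients agree, while the degree-4, degree-2, and constant terms are equal and opposite.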

I understand that I would be better off just using a Taylor expansion, but I don't have an analytical exact $a(z)$; I only have a numerical function, which I'd like to fit using an expansion.

Answer:
The exact (blue) curve seems to go to $\pm\infty$ "slowly", i.e. with a power of $z$ less than $1$, for example like $z^{1/2}$.
A polynomial is therefore certainly not the best approximation, and you might also run into Runge's phenomenon.
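The slow convergence can be seen numerically. This sketch fits a made-up function that behaves like $|z|^{1/2}$ near the origin (an assumption standing in for the exact $a(z)$) with polynomials of increasing degree; the maximum error shrinks only slowly, because the square-root kink at $z=0$ limits what any polynomial can do:

```python
import numpy as np

# Stand-in for the exact a(z): odd, with a square-root kink at z = 0.
def a_exact(z):
    return np.sign(z) * np.abs(z) ** 0.5

z = np.linspace(-1.0, 1.0, 1001)
errors = {}
for deg in (5, 9, 15):
    c = np.polyfit(z, a_exact(z), deg)
    errors[deg] = np.max(np.abs(np.polyval(c, z) - a_exact(z)))
    print(deg, errors[deg])
# Tripling the degree does not come close to tripling the accuracy:
# the error stays concentrated near the non-smooth point z = 0.
```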

Moreover, the curve is anti-symmetric, which means that you should use only odd powers of $z$.
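Restricting the fit to odd powers can be done with a plain least-squares solve instead of polyfit. A minimal sketch, again using a made-up odd function as a stand-in for the exact $a(z)$:

```python
import numpy as np

# Stand-in for the exact a(z); replace with the real numerical function.
def a_exact(z):
    return np.sign(z) * np.abs(z) ** 0.5

z = np.linspace(-1.0, 1.0, 401)
odd_powers = [1, 3, 5]

# Design matrix with columns z, z**3, z**5: only odd powers allowed.
A = np.column_stack([z ** p for p in odd_powers])
coeffs, *_ = np.linalg.lstsq(A, a_exact(z), rcond=None)

def a_fit(z):
    return sum(c * z ** p for c, p in zip(coeffs, odd_powers))

# The fit is anti-symmetric by construction: a_fit(-z) == -a_fit(z),
# so it behaves consistently across z = 0 without splitting the range.
print(np.allclose(a_fit(-z), -a_fit(z)))
```

Because the basis contains only odd powers, one fit over the whole range already respects the symmetry, which is what the two separate half-range fits were reproducing by hand.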