I'm currently studying polynomial interpolation (Newton and Lagrange forms), and I've generated a polynomial approximation from the following data points:
| t    | -1   | -0.5  | -0.25 | 0.25 | 0.5  | 4   |
|------|------|-------|-------|------|------|-----|
| f(t) | -637 | -96.5 | -20.5 | 20.5 | 96.5 | 637 |
From these I've generated the following degree-5 interpolation polynomial:
p(x) = −29.8236x^5 − 29.8236x^4 + 601.3198x^3 + 9.3198x^2 + 44.5340x − 0.4659
I have plotted it using SciPy, and this is the graph p(x) generates:
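For reproducibility, here is a minimal sketch of my setup (I've used `scipy.interpolate.lagrange` here; my actual script may differ slightly):

```python
import numpy as np
from scipy.interpolate import lagrange

# Data points from the table above
t = np.array([-1.0, -0.5, -0.25, 0.25, 0.5, 4.0])
f = np.array([-637.0, -96.5, -20.5, 20.5, 96.5, 637.0])

# Degree-5 polynomial interpolating the 6 points (returns a numpy.poly1d)
p = lagrange(t, f)

# Evaluate on a fine grid to inspect the behavior between t = 0.5 and t = 4
x = np.linspace(-1.0, 4.0, 500)
y = p(x)
```

The wide gap between the nodes t = 0.5 and t = 4 is where the interpolant is unconstrained, which is why the plot looks strange there.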

Why does that behavior happen from t = 0.5 to t = 4?
Is it due to my polynomial being of a high degree? Should I try curve fitting instead?
EDIT: I swapped the table rows, as I had mistakenly been treating f(t) as t before.