I'm looking for an efficient way to solve this problem: finding the period of a pendulum using interpolation.
The pendulum's behavior is given by the equation $\phi''+\frac{g}{L}\sin(\phi)=0$, with $\phi(0)=\frac{6\pi}{7}$, $\phi'(0)=0.8$, and $0\leq t \leq T$. I've rewritten this as a system of first-order differential equations and solved it with Runge-Kutta 4.
You can find an image of the plotted solution below, where the blue curve shows the pendulum's motion: [graph]
I want to fit an interpolant over a subset of the two plotted periods to find the period, but I'm stuck on two choices: which points to interpolate over, and what polynomial degree to pick. I've heard that you can get away with a low degree (1st or 2nd), but I really cannot figure out how.
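For reference, a minimal sketch of my setup (the value of $g/L$ and all names here are illustrative, not my actual code):

```python
import numpy as np

def rhs(t, y, g_over_L=9.81):
    """Pendulum as a first-order system: y = (phi, phi')."""
    phi, omega = y
    return np.array([omega, -g_over_L * np.sin(phi)])

def rk4(f, y0, t0, t1, n):
    """Classical Runge-Kutta 4 with n fixed steps on [t0, t1]."""
    h = (t1 - t0) / n
    t = t0 + h * np.arange(n + 1)
    y = np.empty((n + 1, len(y0)))
    y[0] = y0
    for k in range(n):
        k1 = f(t[k], y[k])
        k2 = f(t[k] + h / 2, y[k] + h / 2 * k1)
        k3 = f(t[k] + h / 2, y[k] + h / 2 * k2)
        k4 = f(t[k] + h, y[k] + h * k3)
        y[k + 1] = y[k] + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
    return t, y

# Initial conditions from the problem statement
t, y = rk4(rhs, np.array([6 * np.pi / 7, 0.8]), 0.0, 10.0, 1000)
```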

If I understand correctly, you want to compute the zero crossings from known data points on the curve.
An easy solution is to identify the two points on either side of a sign change and interpolate linearly between them.
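A minimal sketch of that bracketing-plus-linear-interpolation step (names are illustrative):

```python
import numpy as np

def zero_crossings_linear(t, phi):
    """Approximate times where phi changes sign, by linear
    interpolation between the two bracketing samples."""
    crossings = []
    for k in range(len(phi) - 1):
        if phi[k] == 0.0:
            crossings.append(t[k])
        elif phi[k] * phi[k + 1] < 0.0:
            # Line through (t[k], phi[k]) and (t[k+1], phi[k+1]) hits zero at:
            tc = t[k] - phi[k] * (t[k + 1] - t[k]) / (phi[k + 1] - phi[k])
            crossings.append(tc)
    return crossings
```

Note that successive crossings are half a period apart only for symmetric motion; taking the difference between every second crossing (same sign of $\phi'$) always gives a full period.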
For better precision, as the derivatives are also available at these points, you can resort to inverse Hermite interpolation (interpolate $t$ as a cubic function of $\phi$).
An example with a much exaggerated step:
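The inverse Hermite variant can be sketched as follows (illustrative names; it assumes the two samples bracket a sign change and $\phi' \neq 0$ at both, so that $t$ is locally a function of $\phi$ with slope $dt/d\phi = 1/\phi'$):

```python
def zero_crossing_hermite(t0, t1, phi0, phi1, dphi0, dphi1):
    """Inverse cubic Hermite: treat t as a function of phi on
    [phi0, phi1], with endpoint slopes dt/dphi = 1/phi',
    and evaluate the interpolant at phi = 0."""
    h = phi1 - phi0
    # Normalized coordinate along the phi axis, at phi = 0
    s = (0.0 - phi0) / h
    # Standard cubic Hermite basis functions
    h00 = (1 + 2 * s) * (1 - s)**2
    h10 = s * (1 - s)**2
    h01 = s**2 * (3 - 2 * s)
    h11 = s**2 * (s - 1)
    return h00 * t0 + h10 * h / dphi0 + h01 * t1 + h11 * h / dphi1
```

Since $\phi'$ is already part of the RK4 state vector, the endpoint slopes come for free, and the crossing times converge like $O(h^4)$ instead of $O(h^2)$ for the linear version.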