I am stuck looking for a solution of the second-order ODE
$$ x''(t) + \omega (t) x(t) =0$$
with $ \omega(t)=-1+3(1-e) \, \operatorname{nd}\!\left( \sqrt{\frac{1+e}{2}}\, t, \ \frac{2e}{1+e} \right)^{2}$, where $\operatorname{nd}$ is a Jacobi elliptic function and $0 \leq e \leq 1$.
I have tried an exponential ansatz
$$ x(t)=\mathrm{e}^{\int_0^t \mathrm{d}t' f(t') } \; ,$$
but this only reduces the problem to the first-order nonlinear (Riccati) equation
$$ f(t)^2+f'(t)=-\omega(t) \; .$$
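For reference, the reduction above can be checked symbolically: substituting $x = \mathrm{e}^{\int f}$ gives $x'' = (f' + f^2)\,x$, so the ODE becomes the Riccati equation. A quick sympy sketch of that check:

```python
import sympy as sp

t = sp.symbols('t')
f = sp.Function('f')

# Ansatz x(t) = exp(∫ f dt); then x''/x should equal f' + f^2,
# so x'' + ω x = 0 becomes f' + f^2 = -ω.
x = sp.exp(sp.Integral(f(t), t))
expr = sp.simplify(sp.diff(x, t, 2) / x)
print(expr)
```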
I have solved it numerically, but I know an analytical solution exists (it has been done in a paper, though not given there explicitly), and I don't know how to find it.
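For concreteness, here is a minimal sketch of the numerical integration I mean, using scipy. Note that `scipy.special.ellipj` takes the *parameter* $m = k^2$ as its second argument, and I am assuming the $\frac{2e}{1+e}$ in the definition of $\omega$ is already that parameter; if your source uses the modulus $k$ instead, square it first. The value `e = 0.5` is just a sample choice.

```python
import numpy as np
from scipy.special import ellipj
from scipy.integrate import solve_ivp

e = 0.5  # sample value in [0, 1]

def omega(t):
    # nd(u, m) = 1/dn(u, m); ellipj returns (sn, cn, dn, ph) for parameter m.
    # Assumption: 2e/(1+e) is the parameter m = k^2, not the modulus k.
    u = np.sqrt((1 + e) / 2) * t
    m = 2 * e / (1 + e)
    _, _, dn, _ = ellipj(u, m)
    return -1 + 3 * (1 - e) / dn**2

def rhs(t, y):
    x, v = y
    return [v, -omega(t) * x]

# Integrate x'' + ω(t) x = 0 with x(0) = 1, x'(0) = 0.
sol = solve_ivp(rhs, (0, 20), [1.0, 0.0], rtol=1e-10, atol=1e-12)
print(sol.success, sol.y[0, -1])
```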
I have also checked identities for squares and second derivatives of Jacobi elliptic functions, but no luck.
Does anyone have specific tricks or pointers?