Imagine you have an object that travels along a trajectory parameterized by time in the form:
$$\vec{p} = \begin{pmatrix} x_0 \\ y_0 \end{pmatrix} + \begin{pmatrix} v_x \\ v_y \end{pmatrix} t + \frac{1}{2} \begin{pmatrix} a_x \\ a_y \end{pmatrix} t^2 + \frac{1}{6} \begin{pmatrix} j_x \\ j_y \end{pmatrix} t^3,$$
where $\vec{v}$, $\vec{a}$, and $\vec{j}$ are the velocity, acceleration and jerk, respectively.
Now, given a start time $t_0$, I would like to know how long ($\Delta t$) it takes the object to travel a certain distance $d$ along the trajectory. I assume I would need to solve the following integral for $\Delta t$ (or at least something along those lines, pun intended), but I'm not really sure how to do that:
$$d = \int_{t_0}^{t_0 + \Delta t} |\vec{p}|\,\text{d}t$$
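For concreteness, this is what the trajectory looks like in code (the coefficient values are just made-up examples):

```python
import numpy as np

# Made-up example coefficients for the trajectory.
p0 = np.array([0.0, 0.0])   # initial position (x0, y0)
v = np.array([1.0, 0.5])    # velocity (vx, vy)
a = np.array([0.2, -0.1])   # acceleration (ax, ay)
j = np.array([0.05, 0.02])  # jerk (jx, jy)

def position(t):
    """p(t) = p0 + v*t + (1/2)*a*t^2 + (1/6)*j*t^3."""
    return p0 + v * t + 0.5 * a * t**2 + (1.0 / 6.0) * j * t**3
```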
I would like to do this in Python, and I would think this is a very basic problem, but I haven't found the right tools for it yet. NumPy, for example, offers the `roots` function, which I guess might help here, but I haven't figured out how yet.
Also, except for this similar-ish post, I didn't manage to find much on this, probably because I haven't figured out the correct key terms yet.
The correct formula for the distance is $$ d=\int_{t_0}^{t_0+\Delta t}\Big|\frac{d}{dt}\vec p\Big|\,dt $$ (add up infinitesimal distances to see this). Note that the integrand is the speed, which is non-negative, so $d$ is a non-decreasing function of $\Delta t$ and the inversion is well-defined. To invert this in Python, I would calculate a grid of $d$-values from a grid of $\Delta t$-values and then interpolate in reverse: pass the time grid as the $y$-values and the $d$-grid as the $x$-values, and interpolate to get $\Delta t$ for an arbitrary $d$.
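A minimal NumPy sketch of that grid-and-interpolate approach (the trajectory coefficients below are made-up placeholders):

```python
import numpy as np

# Made-up example coefficients for the trajectory.
p0 = np.array([0.0, 0.0])   # initial position (x0, y0)
v = np.array([1.0, 0.5])    # velocity (vx, vy)
a = np.array([0.2, -0.1])   # acceleration (ax, ay)
j = np.array([0.05, 0.02])  # jerk (jx, jy)

def speed(t):
    """|dp/dt| = |v + a*t + (1/2)*j*t^2| evaluated at time(s) t."""
    t = np.asarray(t, dtype=float)[..., None]
    return np.linalg.norm(v + a * t + 0.5 * j * t**2, axis=-1)

t0 = 0.0

# Forward direction: cumulative arc length d as a function of elapsed time dt.
dt_grid = np.linspace(0.0, 10.0, 1001)
s = speed(t0 + dt_grid)
d_grid = np.concatenate(
    ([0.0], np.cumsum(0.5 * (s[1:] + s[:-1]) * np.diff(dt_grid)))
)  # trapezoidal rule

# Reverse direction: interpolate time as a function of distance.
# Valid because d_grid is non-decreasing (speed >= 0).
d = 3.0
dt = np.interp(d, d_grid, dt_grid)
```

Here `dt` is the elapsed time after which the object has covered arc length `d`. One caveat: for distances beyond the end of the grid, `np.interp` clamps to the last grid value, so the time grid needs to extend far enough to cover the distances you care about.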