I'm using centripetal Catmull-Rom to interpolate keyframe values for animation. Each keyframe represents an animation time $x$ and a channel value $y$.
I am trying to write a function that is supplied:
- four points $P_0$, $P_1$, $P_2$, $P_3$ whose $x$ values are guaranteed to monotonically increase,
- a knot parameterization exponent $\alpha$ in the range $[0, 1]$ ($\alpha = 0.5$ being the centripetal case), and
- a time $x=T$ (guaranteed to be in the range $[P_{1x},P_{2x}]$),
and returns the $y$ value for the given $x=T$.
My attempted approach was to determine the fraction of the way from $P_1$ to $P_2$ as $$pct = \frac{T-P_{1x}}{P_{2x}-P_{1x}},$$ then use the fact that the knots $t_1$ and $t_2$ correspond to the points $P_1$ and $P_2$ and lerp between them to find the parameter $t$: $$t = (t_2-t_1)*pct + t_1,$$ and finally evaluate the curve at that $t$.
However, after coding it I realized that there is, of course, a nonlinear relationship between the fraction across $x$ and the curve parameter; as a result, the $x$ returned for that value of $t$ does not match the $x$ passed to my function.
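For concreteness, here is a minimal Python sketch of the approach described above, using the Barry-Goldman recursive evaluation of the spline; the sample points, `knots`, and `catmull_rom` are all my own illustration, not part of any particular library:

```python
import math

def knots(pts, alpha):
    """Knot values t0..t3 with t_{i+1} = t_i + |P_{i+1} - P_i|^alpha."""
    t = [0.0]
    for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
        t.append(t[-1] + math.hypot(x1 - x0, y1 - y0) ** alpha)
    return t

def catmull_rom(pts, tk, t):
    """Barry-Goldman pyramid: evaluate the spline at parameter t in [t1, t2]."""
    def lerp(pa, pb, ta, tb):
        w = (t - ta) / (tb - ta)
        return tuple((1 - w) * a + w * b for a, b in zip(pa, pb))
    p0, p1, p2, p3 = pts
    t0, t1, t2, t3 = tk
    a1 = lerp(p0, p1, t0, t1)
    a2 = lerp(p1, p2, t1, t2)
    a3 = lerp(p2, p3, t2, t3)
    b1 = lerp(a1, a2, t0, t2)
    b2 = lerp(a2, a3, t1, t3)
    return lerp(b1, b2, t1, t2)

# Hypothetical keyframes with monotonically increasing x.
pts = [(0.0, 0.0), (1.0, 2.0), (2.0, -1.0), (3.0, 0.0)]
alpha = 0.5  # centripetal
tk = knots(pts, alpha)

T = 1.25  # target time, inside [P1x, P2x]
pct = (T - pts[1][0]) / (pts[2][0] - pts[1][0])
t = tk[1] + pct * (tk[2] - tk[1])  # the naive lerp of the knots

x, y = catmull_rom(pts, tk, t)
print(f"x(t) = {x:.6f}, but T = {T}")  # x(t) generally differs from T
```

Running this shows the mismatch: the spline interpolates the keyframes exactly at $t_1$ and $t_2$, but at the lerped $t$ in between, $x(t) \ne T$.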
What is the math I need to derive either $t$ or (ideally) $y$ given an $x$ value?
