I work with so-called "animation curves" in Unity, which are basically Bézier curves. For an algorithm I need to know the slope of the curve at a parameter t. I have already found some solutions on the web, but all of them only seem to work as long as t is between 0 and 1.
Animation curves are not bound to these values; they can, for example, look like this: [Example of an animation curve]
All points can have x- and y-positions greater than 1 or even negative values.
Is there a way/algorithm for finding the slope?
Note: I don't speak "mathematics", so I was not able to follow many of the formulas I found. The code will ultimately be in C#. It would be very helpful if you could add a solution that is readable for people without deep knowledge of mathematics. :)
Bonus: In the next step I will have to find the maximum and minimum height of the curve. I can ask this in a new question, but if you feel capable of answering it as well, feel free to add it here.
Thank you!
A Bézier curve is basically a different representation of a polynomial curve. Therefore, you can certainly evaluate the curve at $t$ where $t < 0.0$ or $t > 1.0$. The reason we typically confine $t$ to $[0,1]$ is that the curve is "well behaved" within this range: it stays within the convex hull of its control polygon and does not oscillate more than the control polygon does. Once you consider the portion outside $[0,1]$, these nice properties no longer hold.
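To make this concrete, here is a minimal sketch of evaluating a cubic Bézier curve and its derivative in the Bernstein form; nothing in these formulas restricts $t$ to $[0,1]$. The slope at $t$ is then $\frac{dy/dt}{dx/dt}$. It is written in Python for brevity (you mentioned your final code will be C#; the arithmetic translates one-to-one), and the control-point values at the bottom are made up purely for illustration:

```python
def cubic_bezier(p0, p1, p2, p3, t):
    """Point on a cubic Bézier curve; t may also lie outside [0, 1]."""
    u = 1.0 - t
    x = u**3 * p0[0] + 3 * u**2 * t * p1[0] + 3 * u * t**2 * p2[0] + t**3 * p3[0]
    y = u**3 * p0[1] + 3 * u**2 * t * p1[1] + 3 * u * t**2 * p2[1] + t**3 * p3[1]
    return x, y

def cubic_bezier_derivative(p0, p1, p2, p3, t):
    """Tangent vector (dx/dt, dy/dt) of a cubic Bézier curve at t."""
    u = 1.0 - t
    dx = 3 * u**2 * (p1[0] - p0[0]) + 6 * u * t * (p2[0] - p1[0]) + 3 * t**2 * (p3[0] - p2[0])
    dy = 3 * u**2 * (p1[1] - p0[1]) + 6 * u * t * (p2[1] - p1[1]) + 3 * t**2 * (p3[1] - p2[1])
    return dx, dy

def slope(p0, p1, p2, p3, t):
    """Slope dy/dx at parameter t (undefined where dx/dt == 0)."""
    dx, dy = cubic_bezier_derivative(p0, p1, p2, p3, t)
    return dy / dx

# Illustrative control points (not taken from the question):
P0, P1, P2, P3 = (0, 0), (1, 2), (2, 2), (3, 0)
print(cubic_bezier(P0, P1, P2, P3, 0.5))  # → (1.5, 1.5)
print(slope(P0, P1, P2, P3, 0.5))         # → 0.0 (apex of this symmetric curve)
```

Note that the slope is with respect to x and y, not the parameter $t$; if the curve is vertical at some point ($dx/dt = 0$), the slope is undefined and you should work with the tangent vector instead.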
Having said this, even if your animation curve has x-y values greater than 1 or negative values, you do not need to evaluate outside $[0,1]$. You just need to place the control points properly: the curve can then go beyond the $[0,1]$ range in its x and y values even though you still evaluate at $t$ within $[0,1]$.