I'm writing a program whose animations are computed with linear interpolation.
So the function in question is
$x_1 = x_0 + (1 - x_0) \cdot c \cdot dt$
Where $dt$ is the duration of the current frame in seconds and $c$ is some constant. This function causes $x$ to tend toward $1$; the target is $1$ here for simplicity, but it could be any value.
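To make the setup concrete, here is a minimal sketch of the per-frame update (the function and variable names are just illustrative):

```python
def smooth_step(x, target, c, dt):
    # One frame of the update: move x a fraction c*dt
    # of the remaining distance toward the target.
    return x + (target - x) * c * dt

# Example: simulate 2 seconds at 60 fps with a guessed c
x, c, dt = 0.0, 3.0, 1.0 / 60.0
for _ in range(120):
    x = smooth_step(x, 1.0, c, dt)
# x is now close to, but not exactly, 1
```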
The point is to tune $c$ so that $x \approx 1$ after some set duration. Ideally I'd have a function $f(T) = c$, where $T$ is the desired duration.
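One thing I've noticed (which may matter for the answer): in the limit $dt \to 0$ the update looks like the differential equation

$$\frac{dx}{dt} = c \, (1 - x), \qquad x(t) = 1 - (1 - x_0) e^{-ct},$$

so $x$ only approaches $1$ asymptotically and never reaches it exactly. I assume "$x \approx 1$ at time $T$" therefore has to mean $|1 - x(T)| \le \varepsilon$ for some tolerance $\varepsilon$ of my choosing.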
How would you go about solving for $c$?