I've been trying to learn a bit about dynamical systems and stability. Many definitions center on the stability of an equilibrium point $x^*$ of the system $x' = f(x)$. Intuitively this makes sense to me: if $x(0) = x^*$ then $x(t) = x^*$ for every $t \geq 0$, and if $x^*$ is asymptotically stable then all trajectories starting in a neighborhood of $x^*$ converge to it in due time. This setting also has the nice benefit that we may use Lyapunov's work to show stability.
Can this definition be extended to the case where the "attracting" point depends on time, so that $x^*(t)$ is now an arbitrary solution? For example, in the figure below the red line depicts $x^*(t)$, the attracting point at time $t$, while the blue line shows $x(t)$ approaching $x^*(t)$ over time (in the sense that the gap between the red and blue curves never grows).
To be a bit more formal, I want $x(t) = x^*(t)$ for all $t \geq 0$ whenever $x(0) = x^*(0)$, and if instead $\left| x(0) - x^*(0) \right| < \delta$, then for all sufficiently large $t \geq t_0$ we have $\left| x(t) - x^*(t) \right| < \epsilon$.
Does such a concept exist in the literature? Can it be reduced to the well-studied case $x^* = 0$ via some appropriate transformation? Can a Lyapunov-type result (possibly time-dependent) exist for such a situation?
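Regarding the transformation question, here is the change of variables I had in mind, written out as a sketch (this is my own working, so please correct me if the reduction is off). Passing to the error coordinate $e(t) = x(t) - x^*(t)$ gives

$$e' = x' - (x^*)' = f(x) - f(x^*) = f(e + x^*(t)) - f(x^*(t)) =: g(t, e), \qquad g(t, 0) \equiv 0,$$

so $e = 0$ is an equilibrium of the (generally nonautonomous) system $e' = g(t, e)$, and stability of the solution $x^*(\cdot)$ would become stability of that equilibrium, to which time-varying Lyapunov-type results could then be applied.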

As @Aphyd points out, this could be done by a coordinate transformation under which a point moving along the trajectory $x(t)$ becomes stationary, while other points (such as $x^*(t)$) move with the same velocity relative to $x(t)$. Interestingly, some systems remain topologically unchanged under such a transformation, e.g. $x' = x,\ y' = y$. In many cases a system's stability or topological structure is completely determined by its nonwandering set, which is a generalization of the set of stationary points.
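A quick numerical sanity check of the "shrinking gap" picture (a toy example of my own, not from any reference): for the linear system $x' = -x + \cos t$, the gap $e = x - x^*$ between any two solutions satisfies $e' = -e$, so every solution tracks every other one exponentially, regardless of which one we call the reference.

```python
import math

def f(t, x):
    # Right-hand side of the illustrative system x' = -x + cos(t).
    return -x + math.cos(t)

def euler(x0, t_end=10.0, dt=1e-3):
    # Explicit Euler integration of x' = f(t, x) from t = 0 to t_end.
    x, t = x0, 0.0
    for _ in range(int(round(t_end / dt))):
        x += dt * f(t, x)
        t += dt
    return x

x_star = euler(0.5)   # "reference" solution x*(t), evaluated at t = 10
x_pert = euler(2.0)   # perturbed solution x(t), same final time
gap = abs(x_pert - x_star)
print(gap)  # initial gap was 1.5; it should have shrunk roughly like e^{-10}
```

The same experiment in the error coordinate would integrate $e' = -e$ directly and give the same decay, which is exactly the reduction to an equilibrium at $0$.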