Writing the position and velocity of one point relative to the other, this becomes finding the distance of a point from the origin, where the point's position is parameterized by a line:
$$s(t) = (x(t), y(t)) = (x_0 + Xt, y_0 + Yt)$$
$$||s||^2 = l^2 = x^2 + y^2$$
$$2ll' = 2xx' + 2yy'$$
$$l' = \frac{xx' + yy'}{\sqrt{x^2 + y^2}}$$
$$l' = \frac{(x_0 + Xt)X + (y_0 + Yt)Y}{\sqrt{(x_0 + Xt)^2 + (y_0 + Yt)^2}}$$
where $x_0, y_0, X, Y$ are all constants.
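As a sanity check on the closed-form $l'$, here is a minimal sketch (function names and test values are hypothetical) comparing it against a central finite difference of $l$:

```python
import math

def l(t, x0, y0, X, Y):
    """Distance from the parameterized point (x0 + X*t, y0 + Y*t) to the origin."""
    return math.hypot(x0 + X * t, y0 + Y * t)

def l_prime(t, x0, y0, X, Y):
    """Closed-form derivative l'(t) = (x*x' + y*y') / sqrt(x^2 + y^2)."""
    x, y = x0 + X * t, y0 + Y * t
    return (x * X + y * Y) / math.hypot(x, y)

# Hypothetical parameters: point starts at (1, 0), moves straight up.
p = (1.0, 0.0, 0.0, 1.0)   # x0, y0, X, Y
t, h = 1.0, 1e-6
fd = (l(t + h, *p) - l(t - h, *p)) / (2 * h)
print(l_prime(t, *p), fd)   # both ≈ 0.7071 (= 1/sqrt(2))
```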
I think my mind is going to mush.
- How do I know this function is defined at the origin?
- How can I evaluate this function at the origin on a computer, without it blowing up ($0/0 = \text{NaN}$)?
The path $s$ passes through the origin exactly when the two moving points coincide at some time $t_0$. In that case, the distance function $l$ has the form $k|t - t_0|$ for some constant $k$. This is not differentiable at $t_0$, so there's nothing strange going on here: the function genuinely has no value at that point.
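Numerically, this suggests the fix: guard the denominator and handle the coincidence time as a special case, rather than letting the computer evaluate $0/0$. A hedged sketch (the guard threshold `eps` is an arbitrary choice, and returning NaN at the kink is one of several reasonable conventions):

```python
import math

def l_prime_safe(t, x0, y0, X, Y, eps=1e-12):
    """Derivative of the distance to the origin, guarding the 0/0 case."""
    x, y = x0 + X * t, y0 + Y * t
    d = math.hypot(x, y)
    if d < eps:
        # The points coincide: near t0, l(t) = k|t - t0| with k = hypot(X, Y),
        # so the derivative does not exist here. Signal the kink explicitly
        # instead of computing 0/0.
        return float('nan')
    return (x * X + y * Y) / d

# Hypothetical parameters chosen so the point hits the origin at t0 = 2:
# x0 = -2X, y0 = -2Y with X = 1, Y = 3.
print(l_prime_safe(2.0, -2.0, -6.0, 1.0, 3.0))   # nan (the kink at t0)
print(l_prime_safe(3.0, -2.0, -6.0, 1.0, 3.0))   # ≈ hypot(1, 3) for t > t0
```

Away from $t_0$ the one-sided slopes are $\pm k = \pm\sqrt{X^2 + Y^2}$, so a caller who wants a number at $t_0$ anyway could return that magnitude, but NaN makes the non-differentiability visible.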