Suppose we have a straight line in Cartesian $xyz$-space such that $$ x = x_0 + v_x t, \quad \quad y = y_0 + v_y t, \quad \quad z = z_0 + v_z t$$ The projection of this line onto the $(r,z)$ plane in cylindrical polar coords $(r, \phi, z)$ where $$ r = \sqrt{ x^{2} + y^{2} } $$ yields a parametric curve $r(t), z(t)$.
The distance $L$ between a point on this curve and some fixed point $(r_p, z_p)$ in the plane is given by $$ L^2 = (r(t) - r_p)^2 + (z(t) - z_p)^2 $$ I'm interested in finding the $t$ which minimises $L$, or the minimum value of $L$ itself.
I think we could solve $$\frac{\partial L^2}{\partial t} = 0$$ for $t$, but as far as I can tell this will get messy. Does anyone know of a nice way to solve this problem?
Thanks!
Hint:
Your "projection" is the rotation about the $z$ axis that cancels the azimuth. When you rotate an arbitrary line around an axis, you generate a hyperboloid of one sheet, and its section by a vertical plane is a hyperbola.
So you are looking for the distance from a point to a hyperbola, which leads to a quartic equation. Don't expect an easier solution.
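To see where the quartic comes from, write $r(t)^2 = At^2 + Bt + C$ with $A = v_x^2 + v_y^2$, $B = 2(x_0 v_x + y_0 v_y)$, $C = x_0^2 + y_0^2$. Setting $\frac{d(L^2)}{dt} = 0$ and clearing the square root gives $r(t)\,\big[(2A + 2v_z^2)t + B + 2v_z(z_0 - z_p)\big] = r_p\,(2At + B)$; squaring both sides yields a degree-4 polynomial in $t$. The sketch below (a hypothetical helper, assuming $A \neq 0$, i.e. the line is not parallel to the $z$ axis) builds that quartic numerically and picks the real root with the smallest $L^2$, since squaring can introduce spurious roots:

```python
import numpy as np

def closest_t(x0, y0, z0, vx, vy, vz, rp, zp):
    """Return the t minimising L^2 = (r(t)-rp)^2 + (z(t)-zp)^2
    by solving the quartic obtained from d(L^2)/dt = 0."""
    A = vx**2 + vy**2
    B = 2 * (x0 * vx + y0 * vy)
    C = x0**2 + y0**2                     # r(t)^2 = A t^2 + B t + C

    # d(L^2)/dt = 0  <=>  r(t) * q(t) = rp * (2A t + B), where
    # q(t) = (2A + 2 vz^2) t + (B + 2 vz (z0 - zp)).
    p = np.array([A, B, C])               # r(t)^2 coefficients
    q = np.array([2*A + 2*vz**2, B + 2*vz*(z0 - zp)])
    lin = np.array([2*A, B])

    # Square both sides: p * q^2 - rp^2 * lin^2 = 0  (quartic in t).
    quartic = np.polysub(np.polymul(p, np.polymul(q, q)),
                         rp**2 * np.polymul(lin, lin))

    def L2(t):
        r = np.sqrt(A*t**2 + B*t + C)
        return (r - rp)**2 + (z0 + vz*t - zp)**2

    # Keep real roots; squaring may add spurious ones, so take the
    # candidate that actually gives the smallest distance.
    cands = [t.real for t in np.roots(quartic) if abs(t.imag) < 1e-9]
    return min(cands, key=L2)
```

For example, the line $x = 1 + t$, $y = 0$, $z = t$ has $r(t) = |1 + t|$, and the point $(r_p, z_p) = (3, 0)$ is closest at $t = 1$, where $L^2 = 2$; `closest_t(1, 0, 0, 1, 0, 1, 3, 0)` recovers that root.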