Let $a, b, c, d$ be vectors and let $t, s$ be scalars.
Let the two line segments be $y(t) = a + bt,\ t \in [0,1]$ and $z(s) = c + ds,\ s \in [0,1]$.
The distance between points on the segments is $\lVert z(s) - y(t) \rVert = \sqrt{(z(s) - y(t)) \cdot (z(s) - y(t))}$. How do I minimize this distance over $(t, s) \in [0,1]^2$ and output two points, one on each segment?
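One route I can see, assuming I differentiate the squared distance $\lVert z(s) - y(t) \rVert^2$ (same minimizers as the distance itself) and set both partial derivatives to zero, is the $2 \times 2$ linear system
$$
(b \cdot b)\,t - (b \cdot d)\,s = b \cdot (c - a), \qquad
(b \cdot d)\,t - (d \cdot d)\,s = d \cdot (c - a),
$$
whose determinant $(b \cdot b)(d \cdot d) - (b \cdot d)^2$ is nonnegative by Cauchy–Schwarz and vanishes exactly when the segments are parallel. But this only minimizes over the infinite lines; I still need to handle the constraint $t, s \in [0,1]$.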
If there are infinitely many minimizers, I want a "nice-looking" connecting line between the segments: for example, if the segments are parallel and form two opposite sides of a rectangle, the connector should be the line that splits the rectangle in half. The output points should then be the endpoints of this connector.
Is there a standard algorithm? Can this be solved with calculus?
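For concreteness, here is a sketch in Python/NumPy of the kind of procedure I am asking about: minimize the quadratic over the infinite lines, clamp to $[0,1]$, and take the midpoint of the overlap in the parallel case. The function name, tolerance, and clamping scheme are my own guesses, not a known-correct implementation.

```python
import numpy as np

def clamp01(x):
    """Clamp a scalar to the parameter range [0, 1]."""
    return min(1.0, max(0.0, x))

def closest_segment_points(a, b, c, d, eps=1e-12):
    """Closest pair of points between y(t) = a + t*b and z(s) = c + s*d,
    with t, s in [0, 1]. Returns (point on y, point on z).

    Parallel overlapping segments use the midpoint of the overlap, so the
    connector "splits the rectangle" in the degenerate case.
    """
    a, b, c, d = (np.asarray(v, dtype=float) for v in (a, b, c, d))
    r = a - c
    A, B, C = b @ b, b @ d, d @ d      # quadratic coefficients
    p, q = b @ r, d @ r

    # Degenerate (zero-length) segments reduce to point-segment problems.
    if A < eps and C < eps:
        return a, c
    if A < eps:
        return a, c + clamp01(q / C) * d
    if C < eps:
        return a + clamp01(-p / A) * b, c

    denom = A * C - B * B              # >= 0; zero iff parallel (Cauchy-Schwarz)
    if denom > eps * A * C:
        # Unconstrained minimizer of the squared distance, clamped in t.
        t = clamp01((B * q - C * p) / denom)
    else:
        # Parallel: project z's endpoints onto y's parameter axis and take
        # the midpoint of the overlap with [0, 1] (the "nice" connector).
        t0, t1 = -p / A, (B - p) / A
        lo, hi = max(0.0, min(t0, t1)), min(1.0, max(t0, t1))
        t = clamp01(0.5 * (lo + hi))

    s = clamp01((q + B * t) / C)       # best s for this t, clamped
    t = clamp01((B * s - p) / A)       # re-optimize t for the clamped s
    return a + t * b, c + s * d
```

For the rectangle example above, $a=(0,0),\ b=(1,0)$ and $c=(0,1),\ d=(1,0)$ return the midpoints $(0.5, 0)$ and $(0.5, 1)$, as desired.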