Given a ray and a line segment, (efficiently) compute the radius of the smallest circle satisfying the following criteria:
- The circle contains the origin of the ray.
- The center of the circle lies on the ray.
- The circle contains an endpoint of the line segment or is tangent to the line segment.
How I understand the problem (2 cases):
A ray has the equation

    P(t) = P0 + t*d,  t >= 0,

where P0 is the starting point, d is a unit direction vector, and t is the parameter.
The needed circle has its center at

    C = P0 + R*d,

where R is the circle's radius (yet unknown). This follows because the center lies on the ray and the circle passes through P0, so the distance from C back to P0 along the ray must equal R.
Now we need to determine the smallest distance from C to segment AB.
The shortest distance from C to the line AB is the length of the perpendicular from C to that line. Using the cross product of vectors:

    dist(C, AB) = |(B - A) × (C - A)| / |B - A|
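The cross-product distance formula above can be sketched as follows (a minimal example assuming 2D points represented as `(x, y)` tuples; the function name is my own):

```python
import math

def point_line_distance(c, a, b):
    """Distance from point c to the infinite line through a and b,
    computed as |(b - a) x (c - a)| / |b - a| with the scalar 2D cross
    product.  Assumes a != b."""
    ux, uy = b[0] - a[0], b[1] - a[1]   # u = B - A
    wx, wy = c[0] - a[0], c[1] - a[1]   # w = C - A
    return abs(ux * wy - uy * wx) / math.hypot(ux, uy)
```

For example, the distance from `(0, 2)` to the line through `(-1, 0)` and `(1, 0)` comes out as `2.0`.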
Setting dist(C, AB) = R, substituting C = P0 + R*d, and squaring gives (writing u = B - A, w = P0 - A)

    ((u × w) + R*(u × d))^2 = R^2 * |u|^2,

which is a quadratic equation for R:

    ((u × d)^2 - |u|^2) * R^2 + 2*(u × w)*(u × d) * R + (u × w)^2 = 0
Also consider the extra cases: in the first case, roots that are too large or too small (e.g. negative) must be rejected, and in the second case the projection of C onto the line AB may lie outside the segment AB, in which case the tangency solution is invalid and the circle must pass through the nearest endpoint instead.
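Putting both cases together, here is a sketch of the whole computation in Python (my own naming and tolerances; points are `(x, y)` tuples, `d` is assumed to be a unit vector, and this is an illustrative sketch rather than a hardened implementation):

```python
import math

def smallest_circle_radius(p0, d, a, b):
    """Smallest R such that the circle centered at C = p0 + R*d passes
    through p0 and touches segment ab, either through an endpoint or
    tangentially.  Returns math.inf if no candidate circle exists."""
    cross = lambda p, q: p[0] * q[1] - p[1] * q[0]
    dot   = lambda p, q: p[0] * q[0] + p[1] * q[1]
    sub   = lambda p, q: (p[0] - q[0], p[1] - q[1])

    candidates = []

    # Endpoint case: |p0 + R*d - q| = R.  Squaring and simplifying gives
    # the linear equation |w|^2 + 2*R*(d . w) = 0 with w = p0 - q, i.e.
    # R = |w|^2 / (2 * d . (q - p0)), valid only when the denominator > 0.
    for q in (a, b):
        w = sub(p0, q)
        denom = 2.0 * dot(d, sub(q, p0))
        if dot(w, w) == 0.0:
            candidates.append(0.0)        # ray origin coincides with endpoint
        elif denom > 0.0:
            candidates.append(dot(w, w) / denom)

    # Tangency case: dist(C, AB) = R.  Squaring
    # (u x (w + R*d))^2 = R^2 * |u|^2 with u = b - a, w = p0 - a
    # yields a quadratic (or degenerate linear) equation in R.
    u, w = sub(b, a), sub(p0, a)
    cuw, cud = cross(u, w), cross(u, d)
    a2 = cud * cud - dot(u, u)
    a1 = 2.0 * cuw * cud
    a0 = cuw * cuw
    roots = []
    if abs(a2) < 1e-12:                   # degenerate: d perpendicular to u
        if abs(a1) > 1e-12:
            roots.append(-a0 / a1)
    else:
        disc = a1 * a1 - 4.0 * a2 * a0
        if disc >= 0.0:
            s = math.sqrt(disc)
            roots += [(-a1 - s) / (2.0 * a2), (-a1 + s) / (2.0 * a2)]
    for r in roots:
        if r < 0.0:
            continue                      # reject radii behind the ray origin
        # Keep the root only if the tangent point (projection of C onto
        # the AB line) actually lies inside the segment.
        c = (p0[0] + r * d[0], p0[1] + r * d[1])
        t = dot(sub(c, a), u) / dot(u, u)
        if 0.0 <= t <= 1.0:
            candidates.append(r)

    return min(candidates) if candidates else math.inf
```

As a sanity check: for the ray from `(0, 0)` along `(1, 0)` and the segment from `(2, -1)` to `(2, 1)`, the answer is `1.0` (the circle centered at `(1, 0)` with radius 1 passes through the origin and is tangent to the segment at `(2, 0)`).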