Given a ray and line segment, compute radius of smallest circle satisfying certain criteria

Given a ray and a line segment, (efficiently) compute the radius of the smallest circle satisfying the following criteria:

  1. The circle contains the origin of the ray.
  2. The center of the circle lies on the ray.
  3. The circle contains an endpoint of the line segment or is tangent to the line segment.

How I understand the problem (2 cases):

(figure: sketch of the two cases — circle through a segment endpoint, circle tangent to the segment)

A ray has the equation

 P = P0 + t * d

where P0 is the starting point, d is a unit direction vector, and t >= 0 is the parameter.

The required circle has center

 C = P0 + r * d

where r is the (yet unknown) circle radius. This satisfies criteria 1 and 2: the center lies on the ray, and it is at distance r from P0, so the circle passes through the ray origin.
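With this parameterization, the branch of criterion 3 where the circle must pass through a fixed point Q (a segment endpoint) already has a closed form: from |P0 + r*d - Q| = r and |d| = 1 one gets r = |Q - P0|^2 / (2 * d . (Q - P0)). A small sketch of that formula (function name and return convention are my own, not from the answer):

```python
def radius_through_point(p0, d, q):
    """Radius r such that the circle with center C = p0 + r*d passes
    through both p0 and q, derived from |p0 + r*d - q| = r with |d| = 1:
        r = |q - p0|^2 / (2 * d . (q - p0)).
    Returns None when q lies behind or perpendicular to the ray,
    since then no center on the ray works."""
    wx, wy = q[0] - p0[0], q[1] - p0[1]
    dot = d[0] * wx + d[1] * wy
    if dot <= 0:
        return None  # |C - q| > r for every center on the ray
    return (wx * wx + wy * wy) / (2.0 * dot)
```

For example, with P0 = (0,0), d = (1,0) and Q = (2,0) this gives r = 1: the circle centered at (1,0) with radius 1 passes through both points.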

Now we need to determine the smallest distance from C to segment AB.

The shortest distance from C to the line AB is the length of the perpendicular from C to AB. Using the cross product of vectors:

  Len = |AC| * sin(angle) = |AC| * |AC x AB| / (|AC| * |AB|) = |AC x AB| / |AB|
  SquaredLen = (AC x AB)^2 / AB^2

 AC x AB = (P0x+r*dx-Ax)*(By-Ay) - (P0y+r*dy-Ay)*(Bx-Ax) = 
           r*(dx*(By-Ay) - dy*(Bx-Ax)) + ((P0x-Ax)*(By-Ay) - (P0y-Ay)*(Bx-Ax)) = 
           r * M + N
 For tangency SquaredLen must equal r^2, so
 (r * M + N)^2 = r^2 * AB^2
 r^2 * (M^2 - AB^2) + 2 * M * N * r + N^2 = 0

This is a quadratic equation for r; the smallest positive root gives the radius. (When M^2 = AB^2 the equation degenerates to a linear one.)
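The quadratic above can be solved directly. A sketch under the same notation (M, N, AB as defined in the derivation; the degenerate-coefficient tolerance is my own choice), tangent to the infinite line only — the segment check comes later:

```python
import math

def tangent_radius(p0, d, a, b):
    """Smallest positive r for which the circle centered at C = p0 + r*d
    is tangent to the infinite line AB, by solving
        r^2*(M^2 - AB^2) + 2*M*N*r + N^2 = 0."""
    abx, aby = b[0] - a[0], b[1] - a[1]
    ab2 = abx * abx + aby * aby
    m = d[0] * aby - d[1] * abx
    n = (p0[0] - a[0]) * aby - (p0[1] - a[1]) * abx
    ca = m * m - ab2          # coefficient of r^2
    cb = 2.0 * m * n          # coefficient of r
    cc = n * n                # constant term
    roots = []
    if abs(ca) < 1e-12:       # degenerate: equation is linear in r
        if abs(cb) > 1e-12:
            roots.append(-cc / cb)
    else:
        disc = cb * cb - 4.0 * ca * cc
        if disc >= 0.0:
            s = math.sqrt(disc)
            roots += [(-cb - s) / (2.0 * ca), (-cb + s) / (2.0 * ca)]
    pos = [r for r in roots if r > 0.0]
    return min(pos) if pos else None
```

For instance, with P0 = (0,0), d = (1,0) and the vertical segment A = (2,-1), B = (2,1), the equation degenerates to a linear one and yields r = 1, matching the geometric picture (center (1,0), tangent at (2,0)).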

Also handle the extra cases: in the first case the distance to the chosen endpoint may be too large or too small, and in the second case the projection of C onto the line AB may lie outside the segment AB, in which case tangency to the line does not imply tangency to the segment.
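Putting the cases together: collect the endpoint radii for A and B, add the tangent radius only when the tangent point (the projection of C onto line AB) falls inside the segment, and take the minimum. A sketch combining the formulas derived above (structure and names are my own, not the answerer's code):

```python
def smallest_radius(p0, d, a, b):
    """Smallest r such that the circle with center p0 + r*d contains p0
    and touches segment AB (endpoint or tangency). Assumes |d| = 1."""
    candidates = []
    # endpoint cases: r = |q - p0|^2 / (2 * d . (q - p0))
    for q in (a, b):
        wx, wy = q[0] - p0[0], q[1] - p0[1]
        dot = d[0] * wx + d[1] * wy
        if dot > 0.0:
            candidates.append((wx * wx + wy * wy) / (2.0 * dot))
    # tangent case: r^2*(M^2 - AB^2) + 2*M*N*r + N^2 = 0
    abx, aby = b[0] - a[0], b[1] - a[1]
    ab2 = abx * abx + aby * aby
    m = d[0] * aby - d[1] * abx
    n = (p0[0] - a[0]) * aby - (p0[1] - a[1]) * abx
    ca, cb, cc = m * m - ab2, 2.0 * m * n, n * n
    roots = []
    if abs(ca) < 1e-12:            # degenerate linear equation
        if abs(cb) > 1e-12:
            roots.append(-cc / cb)
    else:
        disc = cb * cb - 4.0 * ca * cc
        if disc >= 0.0:
            s = disc ** 0.5
            roots += [(-cb - s) / (2.0 * ca), (-cb + s) / (2.0 * ca)]
    for r in roots:
        if r <= 0.0:
            continue
        cx, cy = p0[0] + r * d[0], p0[1] + r * d[1]
        # parameter of the projection of C onto line AB; keep the root
        # only if the tangent point lies within the segment
        t = ((cx - a[0]) * abx + (cy - a[1]) * aby) / ab2
        if 0.0 <= t <= 1.0:
            candidates.append(r)
    return min(candidates) if candidates else None
```

With P0 = (0,0), d = (1,0), A = (2,-1), B = (2,1), the tangent candidate r = 1 beats the endpoint candidates (both 1.25). With the segment moved to A = (5,1), B = (7,1), the tangent point falls outside the segment, so the endpoint circle through A wins with r = 2.6.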