Given a ray (Origin(x,y) and Direction(x,y)) and a circle (Center(x,y) and Radius), determine whether the ray intersects the circle (touches the circle at one or two points).
I've followed this solution (both geometric and analytic) but I can't get it to work in my Python code. Can anyone see my mistake?
The code is the following:
import math

def dot(x1, y1, x2, y2):
    return (x1*x2) + (y1*y2)

def intersects(x, y, vx, vy, cx, cy, r):
    # r is the radius, (x, y) is the ray origin, (vx, vy) is the ray
    # direction and (cx, cy) is the circle center
    dirx = vx - x
    diry = vy - y
    Lx = cx - x
    Ly = cy - y
    tca = dot(Lx, Ly, dirx, diry)
    if tca < 0:
        return False
    Lsq = dot(Lx, Ly, Lx, Ly)
    tcasq = tca*tca
    dsq = Lsq - tcasq
    rsq = r*r
    if dsq > rsq:
        return False
    thc = math.sqrt(rsq - dsq)
    t0 = tca - thc
    t1 = tca + thc
    if t0 < 0:
        t0 = t1
        if t0 < 0:
            return False
    return True
I would suggest using a simpler approach than the one you're following: analytically find the minimum distance between the points on the line and the centre of the circle.
Define the variable $\lambda$ by:
$$ \lambda = - \dfrac{(x-c_x)v_x + (y-c_y)v_y}{v_x^2 + v_y^2} $$
Now use $\lambda$ to calculate the quantity below:
$$ r^* = \sqrt{(x + \lambda v_x - c_x)^2 + (y + \lambda v_y - c_y)^2} $$
$r^{*}$ is the minimum distance between the "ray" and the centre of the circle. (You can derive the expression I gave for $\lambda$ by differentiating this squared distance with respect to $\lambda$ and setting the derivative to $0$.) Clearly, if $r^* > r$, the line and the circle do not intersect; otherwise they intersect, and if $r^* < r$ they intersect at two points. Since you're using Python, the chance of getting exactly $r^* = r$ is vanishingly small because of floating-point rounding, so the downside of this method is that you likely won't be able to tell whether the line is exactly tangent to the circle.
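A minimal sketch of this test, using the same parameter names as the question. The function name is my own, and I've added one step the formulas above don't cover: clamping $\lambda$ to $0$ so the closest point can't lie behind the ray's origin (the formulas describe an infinite line, but the question asks about a ray).

    import math

    def ray_intersects_circle(x, y, vx, vy, cx, cy, r):
        # lambda minimizes the squared distance from a point on the line
        # (x + t*vx, y + t*vy) to the circle centre (cx, cy)
        lam = -((x - cx) * vx + (y - cy) * vy) / (vx * vx + vy * vy)
        # a ray only extends forward from its origin, so clamp to t >= 0
        lam = max(lam, 0.0)
        # r* is the distance from that closest point to the centre
        r_star = math.hypot(x + lam * vx - cx, y + lam * vy - cy)
        return r_star <= r

For example, a ray from the origin along the positive x-axis hits a unit circle centred at (5, 0) but misses one centred at (5, 3), and never reaches one centred at (-5, 0) behind it.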