Calculate distance between a point and a line in polar coordinates


In 2D space, given a point in polar coordinates and a line in polar coordinates, how do you calculate the distance between them?

The single somewhat related question I found was this, which assumes a point in Cartesian coordinates. The solution there is to transform the line to Cartesian coordinates. I'm hoping there's a solution that doesn't require transforming everything to Cartesian coordinates.


I solved the problem by just assuming a right triangle.

Given a point $P(\rho, \phi)$ and a line $L(r, \theta)$, we are looking for the distance $d$.

Put vertex $A$ at the origin, vertex $B$ at the point $P$, and vertex $C$ at the foot of the perpendicular from $P$ onto the line, so $\gamma = 90°$. The hypotenuse is then $c = \rho$ (the point's distance from the origin), the angle at the origin is $\alpha = |\phi - \theta|$, hence $\beta = 90° - \alpha$, and the wanted distance $d$ is the side $a$ between $B$ and $C$:

$a = c \cdot \cos(\beta) = c \cdot \sin(\alpha)$

or using our initial variables:

$d = \rho \cdot \sin(|\phi - \theta|)$

EDIT: I just noticed this only works for lines through the origin (which is my use case, but it isn't necessarily a complete answer to the question).
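For what it's worth, the triangle argument above can be sketched in a few lines of Python. The function name is my own; taking the absolute value of the sine also covers angle differences beyond the acute case the derivation assumes, since a line through the origin extends in both directions:

```python
import math

def point_to_origin_line_distance(rho, phi, theta):
    """Distance from the polar point (rho, phi) to a line through the
    origin with direction angle theta, via d = rho * sin(|phi - theta|).

    abs() around the sine handles any angle difference, not only the
    acute case drawn in the right-triangle derivation above.
    """
    return rho * abs(math.sin(phi - theta))
```

For example, the point $(\rho, \phi) = (\sqrt{2}, 45°)$ lies at distance $1$ from the horizontal line through the origin ($\theta = 0$).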