In 2D, I have a circular arc with center point $A$ and radius $r$ spanning the angles $\theta_1$ to $\theta_2$. I want to find the minimum distance between that arc and an arbitrary point $P$.
Currently what I am doing is:
- Translate the system so that $A$ lies at the origin, i.e. $A \rightarrow \bar 0$, $P \rightarrow P'$.
- Let $\phi$ be the angle of $P'$. If $\theta_1 \le \phi \le \theta_2$, the closest point of the circle to $P'$ lies on the arc, so the answer is the distance to the circle, $\bigl|\,|P'| - r\,\bigr|$ (corresponds to $P2$ in the drawing).
- If $P'$ is not between the opposite rays (the rays from the origin at angles $\theta_1 + \pi$ and $\theta_2 + \pi$), it is closest to one of the endpoints of the arc (corresponds to $P3$ in the drawing).
What I don't know is what to do in the last case, where $P'$ lies between the opposite rays, corresponding to $P1$ in the drawing. I'd expect the answer to be $|PA| + r$, i.e. the distance from $P$ to $A$ plus the radius, but I cannot prove this result to myself.
Am I right and just missing an obvious proof?
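For reference, here is a minimal sketch of the procedure so far (Python, with hypothetical names). For the unresolved $P1$ case it simply falls back to the same endpoint distance used in the $P3$ case, which is exactly the step I am unsure about:

```python
import math

def arc_distance(A, r, th1, th2, P):
    """Distance from P to the arc of radius r about A spanning [th1, th2].

    Handles the P2 and P3 cases described above; the P1 case currently
    falls through to the endpoint rule, pending a proof either way.
    """
    # Step 1: translate so the arc's center sits at the origin.
    px, py = P[0] - A[0], P[1] - A[1]
    d = math.hypot(px, py)        # |P'| = |PA|
    phi = math.atan2(py, px)

    # Normalize phi into [th1, th1 + 2*pi) so the range test is well defined.
    phi = th1 + (phi - th1) % (2 * math.pi)

    if phi <= th2:
        # Case P2: the radial projection of P' onto the circle lies on
        # the arc, so the answer is the plain distance to the circle.
        return abs(d - r)

    # Cases P3 (and, tentatively, P1): take the nearer arc endpoint.
    return min(math.hypot(px - r * math.cos(t), py - r * math.sin(t))
               for t in (th1, th2))
```

For a quick sanity check one can compare, for a point in the $P1$ wedge, the value returned here against the conjectured $|PA| + r$.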
