Assume that I have a scalar variable which takes values on $\mathbb{R}$ and prescribes a polar angle (in radians) on, say, a unit circle $r=1$. Assume now that I have two such (possibly different) variables $a$ and $b$ on the same circle. I would like to find their distance $d$ in Euclidean space.
The translation into Euclidean coordinates $A,B \in \mathbb{R}^2$ could be done like this:
$$A = [\cos(a)r,\sin(a)r]$$ $$B = [\cos(b)r,\sin(b)r]$$
The distance could then be calculated according to the Pythagorean theorem:
$$d=\sqrt{(\cos(a)r-\cos(b)r)^2+(\sin(a)r-\sin(b)r)^2}$$
Which can further be simplified by pulling out the $r$:
$$d=r\sqrt{(\cos(a)-\cos(b))^2+(\sin(a)-\sin(b))^2}$$
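The conversion and distance computation above can be sketched in a few lines of Python (the function name `chord_distance` is just for illustration):

```python
import math

def chord_distance(a, b, r=1.0):
    """Euclidean distance between the points at polar angles a and b
    (in radians) on a circle of radius r, via explicit coordinates."""
    ax, ay = r * math.cos(a), r * math.sin(a)
    bx, by = r * math.cos(b), r * math.sin(b)
    # Pythagorean theorem: sqrt((ax-bx)^2 + (ay-by)^2)
    return math.hypot(ax - bx, ay - by)
```

For example, diametrically opposite angles give the diameter: `chord_distance(0, math.pi)` is `2.0`.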
Assuming I made no mistake, this solution seems workable, but not particularly elegant. Is it possible to simplify this further?
The expression $(\cos(a)r-\cos(b)r)^2-(\sin(a)r-\sin(b)r)^2$ inside the square root sign is not correct. It should be $(\cos(a)r-\cos(b)r)^2+(\sin(a)r-\sin(b)r)^2$.
Expanding the squares and using the identity $\cos^{2} x+\sin ^{2}x=1$ together with $\cos (a-b)=\cos a \cos b+\sin a \sin b$, we get $d=r\sqrt {2-2\cos (a-b)}$. You can simplify this further using the formula $1-\cos t=2\sin^{2}(\frac t 2)$. You can now finish the job.
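Carrying the hint through (the last step is left to the reader above), the half-angle formula gives $d = 2r\,\bigl|\sin\frac{a-b}{2}\bigr|$. A quick numerical sketch checking the simplified form against the original coordinate formula (function names are illustrative):

```python
import math
import random

def d_explicit(a, b, r=1.0):
    # original formula: distance via Euclidean coordinates
    return r * math.hypot(math.cos(a) - math.cos(b),
                          math.sin(a) - math.sin(b))

def d_simplified(a, b, r=1.0):
    # fully simplified form: d = 2 r |sin((a - b) / 2)|
    return 2.0 * r * abs(math.sin((a - b) / 2.0))

# spot-check agreement on random angles and radii
random.seed(0)
for _ in range(1000):
    a = random.uniform(-10.0, 10.0)
    b = random.uniform(-10.0, 10.0)
    r = random.uniform(0.1, 5.0)
    assert math.isclose(d_explicit(a, b, r), d_simplified(a, b, r),
                        abs_tol=1e-12)
```

The absolute value is needed because a chord length is non-negative while $\sin\frac{a-b}{2}$ changes sign with the order of $a$ and $b$.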