This isn't terribly different from a regular conversion to polar coordinates, but I'm having trouble adapting it. The results I'm looking for are angles in the range [0, 360) that match the following example data.
  x |  y | Deg
----+----+-------------
  0 |  1 |   0 (North)
  1 |  0 |  90 (East)
  0 | -1 | 180 (South)
 -1 |  0 | 270 (West)
Here x and y are the result of subtracting two coordinates. This is for a program I'm writing, so feel free to break it down into "if this, then that" calculations.
This is the same as the usual definition of polar coordinates, except with the roles of $x$ and $y$ interchanged. So: $$ x = r \sin \theta\\ y = r \cos \theta $$ and in the opposite direction: $$ r = \sqrt{x^2 + y^2}\\ \theta = \operatorname{atan2}(x,y) $$ I think that in most programming languages $\operatorname{atan2}$ produces a value in the range $(-\pi,\pi]$, so to convert to degrees you should multiply by $180/\pi$ and add $360$ if the result is negative to get into the range $[0,360)$.
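As a minimal sketch of the steps above (swapped-argument $\operatorname{atan2}$, conversion to degrees, and the shift into $[0,360)$), here is one way it could look in Python; the function name `bearing` is just an illustrative choice:

```python
import math

def bearing(x, y):
    """Compass bearing in degrees, in [0, 360): 0 = North (+y), 90 = East (+x)."""
    # Passing (x, y) instead of the usual (y, x) makes 0 point along +y (North)
    # and angles increase clockwise toward +x (East).
    deg = math.degrees(math.atan2(x, y))
    # atan2 returns a value in (-pi, pi], i.e. (-180, 180] degrees;
    # shift negative results up by 360 to land in [0, 360).
    if deg < 0:
        deg += 360.0
    return deg
```

For the table in the question, `bearing(0, 1)`, `bearing(1, 0)`, `bearing(0, -1)`, and `bearing(-1, 0)` give 0, 90, 180, and 270 respectively.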