How to correct the azimuth obtained from coordinates of two points.


Suppose we have points $A(-0.4664, 0.7104)$ and $B(-1.8112, 2.8032)$. To calculate the azimuth from $A$ to $B$, the following equation may be used: $$\text{Azimuth}=\tan^{-1}\frac{X_B-X_A}{Y_B-Y_A}=\tan^{-1}\frac{-1.3448}{2.0928}=-32.7242^\circ$$ However, since $X_B-X_A<0$ and $Y_B-Y_A>0$, the angle must lie in quadrant 4 and should therefore range between $270$ and $360$ degrees. (The azimuth is measured clockwise from north, i.e., the positive $y$-axis.) To correct this angle, we can add or subtract $180$ or $360$ degrees. In this case, adding $360$ solves the issue, since $360 - 32.7242 = 327.2758$ degrees.
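The quadrant correction described above can be avoided entirely with `atan2`, which resolves the quadrant from the signs of both arguments; here is a minimal sketch using the example coordinates:

```python
import math

# Coordinates from the example above
xa, ya = -0.4664, 0.7104
xb, yb = -1.8112, 2.8032

# atan2(dx, dy): arguments in this order because the azimuth is
# measured clockwise from north (+y), not counter-clockwise from +x
az = math.degrees(math.atan2(xb - xa, yb - ya))
if az < 0:
    az += 360.0
print(az)  # about 327.2758 degrees
```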

My question is the following: if, for the same example, the calculated azimuth had come out as $32$ degrees, how would we correct that sort of angle? Adding or subtracting $180$ or $360$, in that case, will not help.
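For context, one way to handle every case with a plain arctan is to branch on the sign of the denominator. The helper below is my own sketch (it assumes $Y_B \neq Y_A$): adding $180^\circ$ whenever $\Delta Y<0$ covers the situation where the raw result comes out positive (e.g., about $+32.72$ degrees) even though the direction actually points into quadrant 3:

```python
import math

def azimuth_from_arctan(dx, dy):
    # Plain arctan cannot tell opposite directions apart, so the sign
    # of dy decides whether 180 degrees must be added (quadrants 2, 3).
    # Assumes dy != 0; atan2 avoids this restriction entirely.
    a = math.degrees(math.atan(dx / dy))
    if dy < 0:
        a += 180.0
    return a % 360.0

print(azimuth_from_arctan(-1.3448, 2.0928))   # about 327.28
print(azimuth_from_arctan(-1.3448, -2.0928))  # about 212.72 (raw arctan gives +32.72)
```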




I guess you are looking for the function arctan2(y, x):

from numpy import arctan2, pi

# arctan2(y, x) returns the counter-clockwise angle from +x, in (-pi, pi];
# negating it and adding pi/2 converts to an azimuth clockwise from +y (north)
azimuth = -arctan2(-1, 0) + pi / 2
if azimuth < 0:
    azimuth = azimuth + 2 * pi

This will give you $\pi$ (for $x=0$, $y=-1$, i.e., due south), and it returns the correct angles over the full range of $0$ to $2\pi$ radians ($0$ to $360$ degrees).


Mathematical libraries define the polar angle $A$ to be $0$ along $+x$ and increasing counter-clockwise, so that it is $90$ deg at $+y$. If you're defining your own azimuth convention with $A'=0$ along $+y$ and $A'=90$ deg along $+x$, the relation between the two angles is $A' = -A + 90$ deg. So you need to negate the angle obtained from the usual programming languages and then add $90$ or subtract $270$ degrees; the two choices differ by $360$ degrees and are therefore equivalent.
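The conversion can be written as a one-liner; the helper name below is my own, and the modulo keeps the result in $[0, 360)$:

```python
def math_angle_to_azimuth(a_deg):
    # Counter-clockwise angle from +x  ->  clockwise azimuth from +y (north)
    return (90.0 - a_deg) % 360.0

print(math_angle_to_azimuth(90.0))  # +y direction -> azimuth 0.0
print(math_angle_to_azimuth(0.0))   # +x direction -> azimuth 90.0
```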