I have been using the inverse solution of Vincenty's formulae (which I've implemented in Python) to calculate distance and azimuth between sets of coordinates. (The paper I've followed can be found here: http://www.ngs.noaa.gov/PUBS_LIB/inverse.pdf)
This approach works fantastically for distances and is computationally very quick. However, I'm not quite sure how to normalise the azimuth angles to the range 0–360 deg.
The forward azimuth is calculated by
α1 = atan( (cos U2 · sin λ) / (cos U1 · sin U2 − sin U1 · cos U2 · cos λ) )
which I presumed could be converted to degrees and normalised by mod(a, 360).
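As a minimal sketch of that normalisation step (assuming the angle comes out of math.atan in radians):

```python
import math

def normalize_deg(alpha_rad):
    """Convert an angle in radians to degrees in the range [0, 360)."""
    return math.degrees(alpha_rad) % 360.0

# e.g. the first raw azimuth listed further down, -0.87720423 rad,
# normalises to roughly 309.7399 deg this way
```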
However, I am trying to achieve the same functionality as the MATLAB function distance, and my output differs. For example, a sample output of azimuths (after converting to degrees and taking mod(a, 360)) is:
[ 309.73989975, 311.54662774, 3.0496471, 279.48361114, 59.50721001, 26.86586812, 28.29347098]
whilst MATLAB (for the same input) gives:
[50.2601 228.4534 176.9504 260.5164 300.4928 153.1341 331.7065]
So is MATLAB computing 360 − az?
Not always, it seems; computing 360 − az on my output gives:
[50.26010025, 48.45337226, 356.9503529, 80.51638886, 300.49278999, 333.13413188, 331.70652902]
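Comparing the two lists entry by entry suggests this is a quadrant problem: wherever my 360 − az values and MATLAB's disagree, they differ by exactly 180°. A quick check over the two lists above:

```python
# my values after the 360 - az step
mine = [50.26010025, 48.45337226, 356.9503529, 80.51638886,
        300.49278999, 333.13413188, 331.70652902]
# MATLAB's output for the same input
matlab = [50.2601, 228.4534, 176.9504, 260.5164,
          300.4928, 153.1341, 331.7065]

for a, b in zip(mine, matlab):
    d = abs(a - b) % 360.0
    d = min(d, 360.0 - d)   # shortest angular separation
    print(round(d, 3))      # every entry is 0.0 or 180.0
```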
What procedure do I need to apply to make my output match MATLAB's? For the above output, the originally calculated azimuths, in radians and before any normalisation steps, are:
[-0.87720423, -0.84567088, 0.05322638, -1.40527609, 1.03859674, 0.46889786, 0.49381423]
An atan2-type function appears to solve this, since evaluating the signs of the two arguments separately allows the correct quadrant to be selected:
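A sketch of what that looks like for the forward azimuth (forward_azimuth_deg is a hypothetical helper name; U1 and U2 are the reduced latitudes and lam the longitude difference from the converged Vincenty iteration, all in radians):

```python
import math

def forward_azimuth_deg(U1, U2, lam):
    """Forward azimuth in degrees, normalised to [0, 360)."""
    num = math.cos(U2) * math.sin(lam)
    den = math.cos(U1) * math.sin(U2) - math.sin(U1) * math.cos(U2) * math.cos(lam)
    # atan2 inspects the signs of num and den separately, so the result
    # lands in the correct quadrant of (-pi, pi] before normalisation
    return math.degrees(math.atan2(num, den)) % 360.0
```

Sanity checks on the equator: a point due east gives 90°, a point due north gives 0°, and a point due south gives 180° — the last two are indistinguishable to plain atan, since it collapses opposite quadrants together.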