How to convert spherical coordinate system to Euler angles?


I have a point at the origin of a $3D$ environment and a second point which is free to move along the surface of a sphere. Obviously, the best way to represent the direction of the vector created by those two points (with the origin being the tail) would be to use an azimuth angle and an altitude angle, as shown in the following picture (taken from here):

My question is, how do I convert that to Euler angles?


There are 2 answers below.

BEST ANSWER

Usually in spherical coordinates there are two angles, $\theta$ and $\phi$. Start with a point on the $z$-axis. Rotate about the current $z$-axis by $\phi$. Then, rotate about the new $y$-axis by $\theta$. This should be directly convertible to some convention of Euler angles.

If you're working with a Z-X'-Z'' convention, then the only subtlety involved is to line up the $y$-axis with the first rotation instead. This should correspond to a rotation about the $z$-axis by $\phi - \pi/2$ as your first rotation.
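A quick numerical check of the construction above, as a sketch in NumPy (the rotation-matrix helpers are mine, not from the answer): an intrinsic rotation about $z$ by $\phi$ followed by a rotation about the new $y$-axis by $\theta$ equals the extrinsic product $R_z(\phi)\,R_y(\theta)$, and applying it to a point on the $z$-axis should land on the usual spherical-coordinate direction.

```python
import numpy as np

def Rz(a):
    """Rotation matrix about the z-axis by angle a (radians)."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0.0],
                     [s,  c, 0.0],
                     [0.0, 0.0, 1.0]])

def Ry(a):
    """Rotation matrix about the y-axis by angle a (radians)."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[ c,  0.0, s],
                     [0.0, 1.0, 0.0],
                     [-s,  0.0, c]])

# Intrinsic z-then-new-y rotation = extrinsic product Rz(phi) @ Ry(theta).
theta, phi = 0.7, 1.2   # arbitrary test angles
v = Rz(phi) @ Ry(theta) @ np.array([0.0, 0.0, 1.0])

# The rotated point should match the standard spherical direction
# (sin(theta)cos(phi), sin(theta)sin(phi), cos(theta)).
expected = np.array([np.sin(theta) * np.cos(phi),
                     np.sin(theta) * np.sin(phi),
                     np.cos(theta)])
```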


I have been working on a similar project, designing a dual-axis solar-tracking parallel manipulator. I was looking for a conversion from azimuth and altitude to Euler X and Y angles, since I only needed those two rotations. I found the equations to be

$$\theta_x = \arctan\left(\frac{y}{\sqrt{x^2 + z^2}}\right)$$

and

$$\theta_y = \arctan\left(\frac{-x}{z}\right)$$

where $x$, $y$, $z$ are the coordinates of a point on the line pointing in the direction of the given azimuth and altitude angles. (The magnitude of this vector does not matter, since only ratios of the coordinates appear inside the arctangents.)

These coordinates can be easily obtained from the given azimuth and altitude angles.

x = cos(azimuthAngle)

y = sin(azimuthAngle)

z = tan(altitudeAngle)
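Putting these pieces together, a minimal sketch in Python (the function name is my own; I also use `atan2` instead of plain `arctan` to keep quadrant information and avoid dividing by zero when $z = 0$, which goes slightly beyond the formulas as written):

```python
import math

def azalt_to_euler_xy(azimuth, altitude):
    """Convert azimuth/altitude (radians) to Euler X and Y angles."""
    # A vector pointing in the given direction; its magnitude is
    # irrelevant because only ratios enter the arctangents below.
    x = math.cos(azimuth)
    y = math.sin(azimuth)
    z = math.tan(altitude)
    # theta_x = arctan(y / sqrt(x^2 + z^2)); atan2 with a non-negative
    # second argument is identical to the plain arctan here.
    theta_x = math.atan2(y, math.hypot(x, z))
    # theta_y = arctan(-x / z); atan2 avoids division by zero at z = 0
    # and resolves the quadrant (an assumption on my part).
    theta_y = math.atan2(-x, z)
    return theta_x, theta_y
```

For example, an azimuth of $0$ and an altitude of $\pi/4$ give $\theta_x = 0$ and $\theta_y = -\pi/4$.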