Calculating the angle between a position and momentum vector in spherical polar coordinates


I have some point $a$ with coordinates $(r,\theta, \phi)$. I can define the position vector from the origin to $a$, simply as $a^i = (r,\theta,\phi)$.

At the same point $a$, there is also some momentum vector $p_i = (p_r, p_{\theta}, p_{\phi})$.

I want to calculate the angle between these two vectors. If everything were in Cartesian coordinates, I could simply take the dot product and divide by the magnitudes of $a^i$ and $p_i$.

My question is how do I do that in spherical polars?

Clearly, I can convert $a^i$ to Cartesian coordinates simply enough, but how would I do the same with the momentum vector? Alternatively, can it be done just staying in spherical polar coordinates?

Best answer

Note that assuming

$$\vec p=p_r\vec e_r+p_{\theta}\vec e_{\theta}+p_{\phi}\vec e_{\phi}$$

we have that the position vector is $\vec a = r\,\vec e_r$, i.e. it is aligned with $\vec e_r$. Therefore the angle $\alpha$ between $\vec a$ and $\vec p$ equals the angle between $\vec p$ and $\vec e_r$. Since the spherical basis $(\vec e_r, \vec e_{\theta}, \vec e_{\phi})$ is orthonormal, $\vec p\cdot\vec e_r = p_r$ and $|\vec p| = \sqrt{p_r^2+p_{\theta}^2+p_{\phi}^2}$, so

$$\cos\alpha=\frac{p_r}{\sqrt{p_r^2+p_{\theta}^2+p_{\phi}^2}}$$
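The result can be checked numerically by converting both vectors to Cartesian components and comparing the Cartesian dot-product angle with the closed-form expression. The point and momentum values below are arbitrary illustrative choices:

```python
import numpy as np

# Arbitrary example values (any point and momentum components would do)
r, theta, phi = 2.0, 0.7, 1.2          # spherical coordinates of the point a
p_r, p_theta, p_phi = 0.5, -1.0, 0.3   # momentum components in the spherical basis

# Orthonormal spherical basis vectors, expressed in Cartesian components
e_r     = np.array([np.sin(theta)*np.cos(phi), np.sin(theta)*np.sin(phi),  np.cos(theta)])
e_theta = np.array([np.cos(theta)*np.cos(phi), np.cos(theta)*np.sin(phi), -np.sin(theta)])
e_phi   = np.array([-np.sin(phi),              np.cos(phi),                0.0])

a = r * e_r                                       # position vector of the point
p = p_r*e_r + p_theta*e_theta + p_phi*e_phi       # momentum vector in Cartesian components

# Angle from the Cartesian dot product ...
angle_cartesian = np.arccos(np.dot(a, p) / (np.linalg.norm(a) * np.linalg.norm(p)))
# ... and from the closed-form expression cos(alpha) = p_r / |p|
angle_formula = np.arccos(p_r / np.sqrt(p_r**2 + p_theta**2 + p_phi**2))

print(angle_cartesian, angle_formula)
```

The two printed angles agree, confirming that only the radial component $p_r$ and the magnitude of $\vec p$ enter the result.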