Subtracting vectors in spherical coordinates


If I have a point with spherical coordinates $(r, \theta, \phi)$, I can give its location relative to the origin with the position vector

$$\vec{r} = r\sin{\theta}\cos{\phi}\, \hat{x} + r\sin{\theta}\sin{\phi}\, \hat{y} + r\cos{\theta}\, \hat{z}$$
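This spherical-to-Cartesian conversion is easy to sketch in code (physics convention assumed: $\theta$ is the polar angle from $+z$, $\phi$ the azimuthal angle from $+x$):

```python
import math

def spherical_to_cartesian(r, theta, phi):
    """Convert spherical coordinates (theta = polar angle from +z,
    phi = azimuthal angle from +x) to Cartesian (x, y, z)."""
    x = r * math.sin(theta) * math.cos(phi)
    y = r * math.sin(theta) * math.sin(phi)
    z = r * math.cos(theta)
    return (x, y, z)

# A point on the +x axis: r = 1, theta = pi/2, phi = 0
print(spherical_to_cartesian(1.0, math.pi / 2, 0.0))  # ~ (1, 0, 0)
```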

If I substitute in the Cartesian unit vectors written in terms of spherical unit vectors,

$$\hat{x} = \sin{\theta}\cos{\phi}\,\hat{r} + \cos{\theta}\cos{\phi}\,\hat{\theta} - \sin{\phi}\,\hat{\phi}$$
$$\hat{y} = \sin{\theta}\sin{\phi}\,\hat{r} + \cos{\theta}\sin{\phi}\,\hat{\theta} + \cos{\phi}\,\hat{\phi}$$
$$\hat{z} = \cos{\theta}\,\hat{r} - \sin{\theta}\,\hat{\theta}$$

I find that the vector reduces to

$$\vec{r} = r\hat{r}$$
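Numerically the reduction does check out; a quick sketch (using the standard physics-convention basis vectors) shows that projecting the Cartesian position vector onto the local spherical basis at $(\theta, \phi)$ gives components $(r, 0, 0)$:

```python
import math

def spherical_basis(theta, phi):
    """Local orthonormal basis (r_hat, theta_hat, phi_hat) at angles
    (theta, phi), written in Cartesian components."""
    r_hat = (math.sin(theta) * math.cos(phi),
             math.sin(theta) * math.sin(phi),
             math.cos(theta))
    theta_hat = (math.cos(theta) * math.cos(phi),
                 math.cos(theta) * math.sin(phi),
                 -math.sin(theta))
    phi_hat = (-math.sin(phi), math.cos(phi), 0.0)
    return r_hat, theta_hat, phi_hat

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

r, theta, phi = 2.0, 0.7, 1.3
r_hat, theta_hat, phi_hat = spherical_basis(theta, phi)
pos = tuple(r * c for c in r_hat)  # position vector r * r_hat in Cartesian form

# Components of the position vector in the local spherical basis:
print(dot(pos, r_hat), dot(pos, theta_hat), dot(pos, phi_hat))  # ~ (2, 0, 0)
```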

It seems to me that we've lost the information about the $\theta$ and $\phi$ coordinates in this transformation. Specifically, if I have two vectors with coordinates $(r, \theta_1, \phi_1)$ and $(r, \theta_2, \phi_2)$, this result would seem to suggest that

$$\vec{r_1} - \vec{r_2} = r\hat{r} - r\hat{r} = 0$$

which is obviously wrong. How do you do vector subtraction in spherical coordinates, and where am I going wrong here?
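For what it's worth, a quick numerical check (converting each point to Cartesian, subtracting componentwise, and taking the length) confirms that the difference is not zero even when the two radii are equal:

```python
import math

def to_cartesian(r, theta, phi):
    # Physics convention: theta = polar angle, phi = azimuthal angle
    return (r * math.sin(theta) * math.cos(phi),
            r * math.sin(theta) * math.sin(phi),
            r * math.cos(theta))

p1 = to_cartesian(1.0, math.pi / 2, 0.0)      # point on the +x axis
p2 = to_cartesian(1.0, math.pi / 2, math.pi)  # point on the -x axis
diff = tuple(a - b for a, b in zip(p1, p2))

print(math.sqrt(sum(d * d for d in diff)))  # ~ 2.0, not 0
```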