There are two observers, both initially facing the same direction (assume the $+y$-axis of a 3D coordinate system), so their observation vectors start out parallel. The second observer spots the target: in terms of spherical coordinates, it tilts by an angle $\phi$, pans by an angle $\theta$, and measures the range from its own position to the target as $R$. The first observer knows only the following information: the position of the second observer relative to itself, $(\Delta x, \Delta y, \Delta z)$, and the second observer's measurement $(R, \theta, \phi)$.
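To pin down the convention I am assuming (the sign and axis choices here are my own): panning by $\theta$ rotates the view about the vertical $z$-axis from $+y$ toward $+x$, and tilting by $\phi$ raises the view above the horizontal plane, so a measurement $(R, \theta, \phi)$ corresponds to a Cartesian offset in the shared orientation frame of

$$x = R\cos\phi\sin\theta, \qquad y = R\cos\phi\cos\theta, \qquad z = R\sin\phi.$$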
For example, suppose the first observer is at the origin $(0, 0, 0)$, the second observer is at $(2, 0, 2)$, and the target is at $(1, 1, 1)$. Initially, both observers are looking along the $+y$-axis.
The second observer tilts down by $\arctan(1/\sqrt2) \approx 35.26^\circ$ ($\phi = -\arctan(1/\sqrt2)$) and pans left by $45^\circ$ ($\theta = -\pi/4$), and measures the range to the target as $R = \sqrt3$. (Note the down-tilt is $\arctan(1/\sqrt2)$ rather than $45^\circ$: the target's offset from the second observer is $(-1, 1, -1)$, so its vertical drop is $1$ over a horizontal distance of $\sqrt2$.)
The first observer is given the following inputs:
- $(\Delta x, \Delta y, \Delta z) = (2, 0, 2)$
- $(R, \theta, \phi) = (\sqrt3, -\pi/4, -\arctan(1/\sqrt2))$
Based on this information alone, how can I calculate the required pan ($\theta^F$) and tilt ($\phi^F$) of the first observer's observation vector so that it points directly at the target found by the second observer?
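For reference, here is a sketch of my current approach in Python. It assumes the convention described above (pan $\theta$ about the vertical $z$-axis from $+y$ toward $+x$, tilt $\phi$ as elevation above the horizontal plane); the function name and argument layout are my own:

```python
import math

def look_at_target(delta, R, theta, phi):
    """Compute the first observer's pan/tilt toward the second observer's target.

    delta -- (dx, dy, dz), second observer's position relative to the first
    R, theta, phi -- second observer's range, pan, and tilt measurements
    Assumed convention: forward = +y, pan about z toward +x, tilt = elevation.
    """
    # Spherical -> Cartesian offset in the second observer's frame.
    dx = R * math.cos(phi) * math.sin(theta)
    dy = R * math.cos(phi) * math.cos(theta)
    dz = R * math.sin(phi)
    # Target position relative to the first observer: offset of the second
    # observer plus the measured offset (both frames share orientation).
    tx = delta[0] + dx
    ty = delta[1] + dy
    tz = delta[2] + dz
    # Cartesian -> pan/tilt for the first observer.
    theta_f = math.atan2(tx, ty)                 # pan from +y toward +x
    phi_f = math.atan2(tz, math.hypot(tx, ty))   # elevation above horizontal
    return theta_f, phi_f
```

With the example inputs, `look_at_target((2, 0, 2), math.sqrt(3), -math.pi/4, -math.atan(1/math.sqrt(2)))` gives $\theta^F = \pi/4$ and $\phi^F = \arctan(1/\sqrt2) \approx 35.26^\circ$, consistent with a target at $(1, 1, 1)$ seen from the origin.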