I basically have two coordinate systems that share the same origin, and I can measure the coordinates of a single vector with respect to both of them. I need to calculate the angle between the y axis of the first coordinate system and the y axis of the second.
I was thinking that I should be able to express the axes of the second system in the first one, and then calculate the angle there. I don't have much knowledge of matrices, so the only part I've been able to work out is the last step: if I have the coordinates of the unit vector of the second system's y axis expressed in the first system, I can use the scalar product to find the angle. But I don't know how to find that unit vector. Hopefully this makes sense.
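For the last step, the scalar-product part, here is a minimal sketch of what I mean (the vectors and the helper name are just illustrative):

```python
import math

def angle_between(u, v):
    """Angle (in radians) between two 3D vectors via the scalar product:
    cos(theta) = (u . v) / (|u| |v|)."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(a * a for a in v))
    # Clamp to guard against floating-point values slightly outside [-1, 1].
    return math.acos(max(-1.0, min(1.0, dot / (norm_u * norm_v))))
```

So once I have the second y axis as a unit vector in the first system, something like `angle_between((0, 1, 0), y_axis_2)` would give the angle I want. It's finding `y_axis_2` that I'm stuck on.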
Edit: since the question is actually quite generic, I'll try to explain in more depth what I'm trying to do.
I have to measure Earth's radius using a camera. Alongside the camera I have a magnetometer, which measures Earth's magnetic field as a vector. The camera points exactly along the z axis of the magnetometer, and the y axis is the vertical one.
Before running my experiment I measure a reference vector for the magnetic field. Then I raise my camera to a certain (known) height and take a picture of Earth. I don't know the orientation of the camera; all I know is the magnetic field vector (which I assume, for simplicity, has changed neither in intensity nor in direction with respect to the first measurement), expressed in the new, rotated xyz coordinate system. So I have the same vector measured in two different coordinate systems.
[Figure: the 3D model of my experiment, where the blue line is the true horizon line] [Figure: another view of the same representation]
I then analyze the picture. Knowing the FOV of the camera, I can work out the angle between the z axis of the rotated (and translated) system and the true horizon line. This is the line the camera "sees", and I can measure the angle by a proportion between the tangent of the FOV, the pixel distance between the edges of the picture, and the pixel distance between the center of the image and the horizon line.

All I need now is the angle between the z axis of the rototranslated system and the astronomical horizon (the one lying in the plane perpendicular to the radius that passes through my camera). Once I find that, I can work out the angle between the true horizon and the astronomical horizon, and use some simple trigonometry to find the radius of Earth.

What I was thinking is that the angle between the z axis of the new system and the astronomical horizon stays the same if I rotate about the z axis. I do so to find the theoretical measured vector as if the picture were perfectly level. Logic tells me that, after this rotation, that angle is exactly the angle between the original y axis and the newly found y axis. But I'm not completely sure, and I don't know how to calculate it.
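To be concrete about the picture-analysis part, here is a sketch of the proportion and the final trigonometry, under the simplifying assumption that the camera's z axis lies exactly in the astronomical-horizon plane (so the measured angle is directly the dip of the true horizon). All the numbers are made up for illustration:

```python
import math

# Hypothetical inputs, purely illustrative.
fov = math.radians(60.0)   # full horizontal field of view of the camera
width_px = 4000.0          # pixel distance between the edges of the picture
offset_px = 120.0          # pixel distance from image center to the horizon line
h = 10_000.0               # known camera height above the surface, in meters

# Pinhole-camera proportion: a line offset_px from the center subtends
# atan( (offset_px / (width_px / 2)) * tan(fov / 2) ) from the optical axis.
horizon_angle = math.atan(offset_px / (width_px / 2) * math.tan(fov / 2))

# Assume the z axis sits in the astronomical-horizon plane, so the dip of
# the true horizon below the astronomical horizon equals the measured angle.
dip = horizon_angle

# Geometry of the tangent line to a sphere from height h:
# cos(dip) = R / (R + h)  =>  R = h * cos(dip) / (1 - cos(dip)).
R = h * math.cos(dip) / (1 - math.cos(dip))
```

In the real setup the camera is not level, which is exactly why I need the angle between the two y axes: to correct `dip` before plugging it into the last formula.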