Nonlinear distortion that maintains unit length of vectors


I have a head-mounted eye tracker that supplies unit-length vectors giving the gaze direction, but without calibration their reference rotation is arbitrary. To align them to the orientation of the head, I have used an optimization to find a rotation matrix $R$ that maps the gaze vectors into the head's coordinate frame. In that routine the subject fixates a point while moving their head around a bit, which gives me a set of head positions, head rotations, and gaze vectors. I then minimize the angular error of the gaze vectors to find the best $R$.
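For reference, here is a minimal sketch of such an alignment step, assuming paired unit vectors (gaze in the tracker frame, fixation direction in the head frame). It uses the Kabsch/SVD solution, which minimizes squared chord distance rather than the angle directly; for unit vectors the two are monotonically related, so it gives the same optimum. The actual routine may of course differ:

```python
import numpy as np

def fit_rotation(gaze, target):
    """Least-squares rotation R with R @ gaze[i] ≈ target[i].

    gaze, target: (N, 3) arrays of unit vectors in the tracker
    and head frames, respectively. Kabsch/SVD solution.
    """
    H = gaze.T @ target                 # 3x3 cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    D = np.diag([1.0, 1.0, d])          # guard against a reflection
    return Vt.T @ D @ U.T
```

Given noisy data, this returns the proper rotation that best aligns the vector pairs in the least-squares sense, which should be close to the result of minimizing the angular error directly.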

That has worked relatively well, but when plotting the data I've found a distortion in the gaze vectors: they get less accurate towards the periphery of the field of view. For example, looking straight ahead the error might be low, but when looking further to the left the estimate might be off towards the right, when looking right it might be off towards the left, and so on. This seems to be a problem of the gaze estimation algorithm in the eye tracker.

I therefore want to apply a nonlinear correction to reduce the gaze error even further. I can't just apply a quadratic transform to the $x$, $y$ and $z$ components of my vectors, because then they wouldn't be unit length anymore. The operation I'm looking for is more like a rotation outwards (or inwards) towards the visual periphery whose magnitude grows nonlinearly with eccentricity, applied to the two angular components, azimuth and elevation. But I don't know how to formulate that mathematically so I can add it to the optimization.
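One way to formulate this, which I'll hedge as just a sketch: convert each unit vector to azimuth/elevation angles, apply a parametric correction to the angles (here an odd polynomial, analogous to radial distortion models in camera calibration), and convert back, which preserves unit length by construction. The coefficients `k_az` and `k_el` are hypothetical fit parameters that would be added to the optimization:

```python
import numpy as np

def vec_to_angles(v):
    """Unit vector -> (azimuth, elevation), assuming z forward, y up."""
    az = np.arctan2(v[0], v[2])
    el = np.arcsin(np.clip(v[1], -1.0, 1.0))
    return az, el

def angles_to_vec(az, el):
    """(azimuth, elevation) -> unit vector; norm is 1 by construction."""
    return np.array([np.cos(el) * np.sin(az),
                     np.sin(el),
                     np.cos(el) * np.cos(az)])

def correct_gaze(v, k_az, k_el):
    """Nonlinear angular correction that keeps the vector unit length.

    Odd-polynomial model (hypothetical): az' = az + k_az * az^3,
    el' = el + k_el * el^3. Zero coefficients leave v unchanged;
    the cubic term grows towards the periphery, matching the
    eccentricity-dependent error described above.
    """
    az, el = vec_to_angles(v)
    return angles_to_vec(az + k_az * az**3, el + k_el * el**3)
```

With this parametrization the calibration could jointly optimize $R$, `k_az` and `k_el` against the same angular-error objective; higher-order odd terms ($az^5$, cross terms, etc.) could be added if a cubic isn't flexible enough.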

I hope this is halfway understandable, otherwise I'll be glad to supply more details.