Calculate rotation of a point set


I am trying to solve the following problem:

I have a set of 2D coordinates of fingers on the screen (e.g. $x_1$;$y_1$, $x_2$;$y_2$, etc.) before the rotation is performed. The fingers then move to new coordinates ($x'_1$;$y'_1$, $x'_2$;$y'_2$). I'd like to calculate the angle of the performed rotation in the range $[0;2\pi)$.

The best I have is the following algorithm:

  1. Calculate the coordinates of the center of the point set (e.g. $\frac{x_1+x_2+\dots}{count}$;$\frac{y_1+y_2+\dots}{count}$)
  2. Calculate the angle between the x-axis and the vector from the center found in the previous step to every point, so I have a 'starting' angle for every point in the range $[0;2\pi)$
  3. Every time the fingers move, recalculate the angles and subtract them from the original angles to get the 'deltas'
  4. Sum all the deltas from the previous step and treat the sum as the angle of the rotation
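For concreteness, the four steps above can be sketched as follows (a minimal illustration, assuming the points are passed as lists of (x, y) tuples; the function name `rotation_angle` is hypothetical). As described in the question, it exhibits the over-counting problem:

```python
import math

def rotation_angle(before, after):
    """Sketch of the four steps above.

    `before` and `after` are lists of (x, y) finger positions.
    Returns the summed per-point angle deltas, which over-counts
    the rotation by a factor of len(before).
    """
    n = len(before)

    # Step 1: centroid of the point set, before and after the move
    cx = sum(x for x, _ in before) / n
    cy = sum(y for _, y in before) / n
    cx2 = sum(x for x, _ in after) / n
    cy2 = sum(y for _, y in after) / n

    # Step 2: starting angle of each point, mapped into [0, 2*pi)
    start = [math.atan2(y - cy, x - cx) % (2 * math.pi) for x, y in before]

    # Step 3: recompute the angles after the move and take the deltas
    now = [math.atan2(y - cy2, x - cx2) % (2 * math.pi) for x, y in after]
    deltas = [b - a for a, b in zip(start, now)]

    # Step 4: sum the deltas (this is where the factor-of-n error appears)
    return sum(deltas)
```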

The problem is that the computed rotation comes out count times too large, where count is the number of points I have.

I cannot simply divide each delta by count because the deltas range from $0$ to $2\pi$ independently of each other. If I divide $0$ by count, I get $0$, but when I divide $2\pi$ by count, I get, say, $\pi$ (for count = 2), which is not correct, since $2\pi$ and $0$ represent the same angle.

I suppose I need to normalize the angles to $(-\pi;+\pi)$ at some point to avoid that behavior, but I cannot find the correct way of doing it.
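One common way to do that normalization is to wrap each delta into $(-\pi;+\pi]$ before averaging, e.g. via `atan2(sin, cos)`, which handles the seam automatically. A minimal sketch (the function names are mine, not an established API):

```python
import math

def wrap_to_pi(a):
    # Map any angle to (-pi, pi]; atan2 of (sin, cos) handles the seam.
    return math.atan2(math.sin(a), math.cos(a))

def average_rotation(start_angles, new_angles):
    # Wrap each delta before averaging, so a jump across the 0 / 2*pi
    # seam becomes a small signed angle instead of a value near 2*pi.
    deltas = [wrap_to_pi(b - a) for a, b in zip(start_angles, new_angles)]
    mean = sum(deltas) / len(deltas)
    # Map the averaged delta back into the desired [0, 2*pi) range
    return mean % (2 * math.pi)
```

With wrapped deltas, dividing by the number of points is safe: a point whose angle moves from just below $2\pi$ to just above $0$ contributes a small positive delta rather than a large negative one.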

In this particular case I can easily sum the angle deltas and divide by the number of points, but in the general case that division is not possible because of the seam between $2\pi$ and $0$.