I have $n$ data points, all of which are measurements of angles that I know to be increasing linearly with equal spacing. I am trying to find the best-fit line for this data. From my research, the von Mises distribution seems to be the one most commonly used for linear-circular data, but I can't find any papers that derive a best-fit line, so I am attempting it myself. This is what I have so far:
We want to fit the data to the equation:
$\theta = \alpha + \beta i$
Since the data are evenly spaced, we use the index $i$ as the x-value; this makes the spacing equal to $1$, and we centre the indices so the points are symmetric about $0$ (i.e. $i$ runs from $-\frac{n-1}{2}$ to $\frac{n-1}{2}$).
So we want to minimise the mean square error:
$$Q'\left( {\alpha ,\beta } \right) = \sum\limits_{i = - \frac{{n - 1}}{2}}^{\frac{{n - 1}}{2}} {{{\left( {\cos \left( {\alpha + \beta i} \right) - \cos \left( {{\theta _i}} \right)} \right)}^2} + {{\left( {\sin \left( {\alpha + \beta i} \right) - \sin \left( {{\theta _i}} \right)} \right)}^2}}$$
Using the identity
$${\left( {\cos \left( {\alpha + \beta i} \right) - \cos \left( {{\theta _i}} \right)} \right)^2} + {\left( {\sin \left( {\alpha + \beta i} \right) - \sin \left( {{\theta _i}} \right)} \right)^2} = 4{\sin ^2}\frac{{\left( {\alpha + i\beta - {\theta _i}} \right)}}{2},$$
it is equivalent to minimise:
$$Q\left( {\alpha ,\beta } \right) = \sum\limits_{i = - \frac{{n - 1}}{2}}^{\frac{{n - 1}}{2}} {{{\sin }^2}\frac{{\left( {\alpha + i\beta - {\theta _i}} \right)}}{2}}$$
So we can set $$\frac{{\partial Q}}{{\partial \alpha }} = \frac{1}{2}\sum\limits_{i = - \frac{{n - 1}}{2}}^{\frac{{n - 1}}{2}} {\sin {\left( {\alpha + i\beta - {\theta _i}} \right)}}=0$$
and
$$\frac{{\partial Q}}{{\partial \beta }} = \frac{1}{2}\sum\limits_{i = - \frac{{n - 1}}{2}}^{\frac{{n - 1}}{2}} i {\sin {\left( {\alpha + i\beta - {\theta _i}} \right)}}=0$$
and solve for $\alpha$ and $\beta$.
But that is as far as I get.
How can we solve these equations given the measurements $\theta_i$?
Note: simple linear regression actually works very well when the difference between successive angles is less than $\pi$, but it gets trickier when the difference exceeds $\pi$, especially if the standard deviation of the measurement error is large.
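This wrap-around issue can be seen in a small numerical sketch (the data below are made up for illustration; `np.unwrap` assumes exactly the condition stated here, successive differences below $\pi$):

```python
import numpy as np

# Made-up example: a line of angles measured modulo 2*pi. A naive
# least-squares fit on the wrapped values is thrown off by the 2*pi jumps,
# while unwrapping first (valid because successive true differences stay
# below pi, the regime described in the note) recovers the slope.
rng = np.random.default_rng(0)
n = 50
alpha_true, beta_true = 0.3, 0.5              # radians; slope well below pi
i = np.arange(n)
theta_meas = np.mod(alpha_true + beta_true * i + rng.normal(0, 0.05, n),
                    2 * np.pi)                # wrapped measurements

beta_naive = np.polyfit(i, theta_meas, 1)[0]                 # fooled by wraps
beta_unwrapped = np.polyfit(i, np.unwrap(theta_meas), 1)[0]  # close to 0.5
```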
Considering the position error as
$$ \delta_k^2 = \|e^{i(\alpha+(k-1)\beta)}-e^{i\theta_k}\|^2 = (\cos(\alpha+(k-1)\beta)-\cos\theta_k)^2+(\sin(\alpha+(k-1)\beta)-\sin\theta_k)^2 $$
we have
$$ E(\alpha,\beta) = \sum_{k=1}^n(\cos(\alpha+(k-1)\beta)-\cos\theta_k)^2+(\sin(\alpha+(k-1)\beta)-\sin\theta_k)^2 $$
the conditions for a minimum are
$$ \cases{\frac{\partial E}{\partial\alpha} = f_1(\alpha,\beta) = 0\\ \frac{\partial E}{\partial\beta} = f_2(\alpha,\beta) = 0} $$
Now, writing $F = (f_1,f_2)$, we use a Newton-like iteration procedure to solve $F=0$:
$$ \delta X = X-X_0 = -M^{-1}(X_0)F(X_0) $$
where $M$ is the Jacobian matrix of $F$, with $X = (\alpha,\beta)$ and initial guess $X_0=\left(\theta_1,\frac{2\pi}{n}\right)$. A MATHEMATICA script implementing this procedure follows.
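The script itself is not reproduced in this excerpt; as an illustration only, the same Newton-like iteration can be sketched in Python/NumPy (the function name and test data below are my own; the gradient and Jacobian come from rewriting $E(\alpha,\beta)=\sum_k \left(2-2\cos(\alpha+(k-1)\beta-\theta_k)\right)$):

```python
import numpy as np

def fit_newton(theta, beta0=None, iters=50):
    # Newton iteration for F = (dE/d_alpha, dE/d_beta) = 0, with
    # E = sum_k 2 - 2*cos(alpha + (k-1)*beta - theta_k) and M the
    # Jacobian of F (Hessian of E). Python sketch of the procedure;
    # the original answer used a MATHEMATICA script.
    theta = np.asarray(theta, dtype=float)
    n = len(theta)
    k = np.arange(n)                       # plays the role of k-1, k = 1..n
    if beta0 is None:
        beta0 = 2 * np.pi / n              # initial guess from the text
    x = np.array([theta[0], beta0])        # X0 = (theta_1, 2*pi/n)
    for _ in range(iters):
        r = x[0] + k * x[1] - theta        # residual angles
        s, c = np.sin(r), np.cos(r)
        F = 2 * np.array([s.sum(), (k * s).sum()])
        M = 2 * np.array([[c.sum(),       (k * c).sum()],
                          [(k * c).sum(), (k * k * c).sum()]])
        step = np.linalg.solve(M, F)
        x = x - step
        if np.abs(step).max() < 1e-12:
            break
    return x

# usage on made-up data winding once around the circle, measured mod 2*pi;
# the objective only sees angles through sin/cos, so the wrapping is harmless
rng = np.random.default_rng(1)
n = 30
alpha_t, beta_t = 0.4, 2 * np.pi / 30
theta = np.mod(alpha_t + beta_t * np.arange(n) + rng.normal(0, 0.02, n),
               2 * np.pi)
alpha_hat, beta_hat = fit_newton(theta)
```

Note that because the residuals enter only through $\sin$ and $\cos$, this objective is unchanged if the measurements are wrapped modulo $2\pi$, which is the main advantage over plain regression on the raw angles.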
NOTE
According to an observation made by @Eric, simple regression on the angles themselves suffices, since the fitted values agree. With
$$ E(\alpha,\beta) = \sum_k(\alpha+(k-1)\beta-\theta_k)^2 $$
$$ \cases{\frac{\partial E}{\partial\alpha} = n\alpha + \frac{(n-1)n}{2}\beta -\sum_k\theta_k = 0\\ \frac{\partial E}{\partial\beta} = \frac{(n-1)n}{2}\alpha+\left(\sum_k (k-1)^2\right)\beta - \sum_k (k-1)\theta_k = 0} $$
NOTE-2
After understanding that the data $\theta_k$ can wind around the circle many times, this final version should solve the problem (I hope).
Defining $$E(\alpha,\beta) = \sum_{k=1}^n\left(\alpha+(k-1)\beta -\theta_k\right)^2$$
we have
$$ \cases{\frac{\partial E}{\partial\alpha} = \sum_{k=1}^n\left(\alpha+(k-1)\beta -\theta_k\right)=0\\ \frac{\partial E}{\partial\beta} = \sum_{k=1}^n(k-1)\left(\alpha+(k-1)\beta -\theta_k\right)=0} $$
or
$$ \left(\matrix{n&\sum_{k=1}^n(k-1)\\ \sum_{k=1}^n(k-1)& \sum_{k=1}^n(k-1)^2}\right)\left(\matrix{\alpha\\ \beta}\right)=\left(\matrix{\sum_{k=1}^n\theta_k\\ \sum_{k=1}^n(k-1)\theta_k}\right) $$
which is solved in the script below.
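The script is not shown in this excerpt; here is a Python sketch of the same $2\times 2$ normal-equation solve, assuming (per NOTE-2) that the measured angles are unwrapped first so that winding around the circle is handled. Note that `np.unwrap` requires successive differences below $\pi$, matching the caveat in the question:

```python
import numpy as np

def fit_linear(theta):
    # Solve the 2x2 normal equations above for (alpha, beta). The angles
    # are unwrapped first so that data winding the circle many times is
    # handled (sketch; assumes successive true differences stay below pi).
    theta = np.unwrap(np.asarray(theta, dtype=float))
    n = len(theta)
    k = np.arange(n)                       # plays the role of k-1, k = 1..n
    A = np.array([[n,       k.sum()],
                  [k.sum(), (k * k).sum()]])
    b = np.array([theta.sum(), (k * theta).sum()])
    return np.linalg.solve(A, b)           # (alpha, beta)

# usage on made-up data winding several times, measured mod 2*pi
rng = np.random.default_rng(2)
n = 40
alpha_t, beta_t = 1.0, 0.9
theta_meas = np.mod(alpha_t + beta_t * np.arange(n) + rng.normal(0, 0.05, n),
                    2 * np.pi)
alpha_hat, beta_hat = fit_linear(theta_meas)
```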