Why doesn't line fitting seem to work in polar coordinates


I have two points, $(r_1, \theta_1)$ and $(r_2, \theta_2)$, plotted in polar coordinates, and I'm trying to connect them with a curve of the form $r=\beta_1\theta+\beta_2$. This is just a line fit in the $(\theta, r)$ plane: with exactly two points, we can solve for $\beta_1$ and $\beta_2$ directly from the following system, without anything fancy such as projecting a solution back onto the solution space:

$$ \begin{bmatrix} \theta_1 & 1 \\ \theta_2 & 1 \end{bmatrix} \begin{bmatrix} \beta_1 \\ \beta_2 \end{bmatrix} = \begin{bmatrix} r_1 \\ r_2 \end{bmatrix} $$

The solution I get is

$$ \begin{align} \beta_1 &= \frac{r_1-r_2}{\theta_1-\theta_2} \\ \beta_2 &= \frac{\theta_1r_2-\theta_2r_1}{\theta_1-\theta_2} \end{align} $$
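As a sanity check on the algebra, the closed-form expressions above can be compared against a direct linear solve of the $2\times 2$ system (the sample values below are made up for illustration):

```python
import numpy as np

# Hypothetical sample points (r, theta), with theta in radians.
r1, t1 = 1.0, 0.5
r2, t2 = 2.0, 1.5

# Closed-form solution of the 2x2 system.
b1 = (r1 - r2) / (t1 - t2)
b2 = (t1 * r2 - t2 * r1) / (t1 - t2)

# Cross-check against a direct linear solve of [[t1, 1], [t2, 1]] @ [b1, b2] = [r1, r2].
A = np.array([[t1, 1.0], [t2, 1.0]])
sol = np.linalg.solve(A, np.array([r1, r2]))
assert np.allclose([b1, b2], sol)

# Both points lie exactly on r = b1*theta + b2.
assert np.isclose(b1 * t1 + b2, r1)
assert np.isclose(b1 * t2 + b2, r2)
```

So the coefficients themselves are not the problem: the curve passes through both $(\theta, r)$ pairs by construction.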

Everything seems fine when both points lie in the right-hand quadrants. When one of them lies in a left-hand quadrant, the curve looks off.
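One concrete way to reproduce the discrepancy (the point values here are hypothetical): a left-quadrant point has more than one usable angle representative, e.g. $\theta$ from `atan2` in $(-\pi, \pi]$ versus the equivalent $\theta - 2\pi$, and the fitted coefficients change with that choice even though both describe the same point:

```python
import math

r1, t1 = 1.0, math.pi / 4      # a right-quadrant point (hypothetical)
x2, y2 = -1.0, 1.0             # a left-quadrant point (hypothetical)
r2 = math.hypot(x2, y2)
t2a = math.atan2(y2, x2)       # 3*pi/4, the principal representative
t2b = t2a - 2 * math.pi        # -5*pi/4, the same point, shifted by 2*pi

def fit(r1, t1, r2, t2):
    """Closed-form solution of the 2x2 system for r = b1*theta + b2."""
    b1 = (r1 - r2) / (t1 - t2)
    b2 = (t1 * r2 - t2 * r1) / (t1 - t2)
    return b1, b2

# The two angle representatives of the same left-quadrant point
# produce different curves through the same pair of points.
assert fit(r1, t1, r2, t2a) != fit(r1, t1, r2, t2b)
```

This at least shows that the result depends on which angle representative the plotting tool uses for the left-quadrant point.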

[Plot of the two points and the fitted curve.]

This is the tool I'm using to plot the graph.