Coordinate System transformations as Least Squares based on measured point coordinates


Given two coordinate systems, $CS_1$ and $CS_2$, and $n$ points $P_i$ located arbitrarily, where $n$ can be anywhere between 8 and 20 for this use case. The coordinates $(x, y, z)$ of all points $P_i$ are known in both the $CS_1$ and $CS_2$ coordinate systems, but all contain measurement errors.

How can the coordinate system transformation ($x, y, z, \alpha, \beta, \gamma$; translations and Euler angles) be calculated between $CS_1$ and $CS_2$? More explicitly, how can a least squares or similar problem be formulated in this case, and how can it be solved? Explicit, algorithmic answers would be greatly appreciated.

Best Answer

I suggest the following method (not tested, and certainly not the best, but it is clear that it should work).

This method allows us to find the position and orientation of the second coordinate system with respect to the first one. It's quite straightforward to obtain the Euler angles from this data.
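On that last point, here is a brief sketch of extracting Euler angles from a rotation matrix whose columns are the axis unit vectors defined below. I use the common Z-Y-X (yaw-pitch-roll) convention; other conventions give different formulas, so treat this as one illustration rather than the only option:

```python
import math

def euler_zyx(R):
    """Extract Z-Y-X (yaw, pitch, roll) Euler angles, in radians, from a
    3x3 rotation matrix R given as nested lists (rows).  Assumes the
    non-degenerate case |R[2][0]| < 1 (pitch away from +/-90 degrees)."""
    pitch = -math.asin(R[2][0])
    yaw = math.atan2(R[1][0], R[0][0])
    roll = math.atan2(R[2][1], R[2][2])
    return yaw, pitch, roll

# Example: pure rotation about the z axis by 30 degrees
th = math.radians(30.0)
c, s = math.cos(th), math.sin(th)
R = [[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]]
yaw, pitch, roll = euler_zyx(R)
```

For a pure rotation about $z$ this returns yaw $=30°$ and zero pitch and roll, as expected.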

Let the origin point of CS2 have the following coordinates in CS1:

$$O = (x_0,y_0,z_0)$$

Let the unit vectors of each axis of CS2 have the coordinates in CS1:

$$\vec{u}=(x_u,y_u,z_u)$$

$$\vec{v}=(x_v,y_v,z_v)$$

$$\vec{w}=(x_w,y_w,z_w)$$

Thus, we have $12$ unknowns we need to find. However, obviously, they are connected by the following $6$ equations, reducing the number of unknowns:

$$\begin{cases} x_u^2+y_u^2+z_u^2=1 \\ x_v^2+y_v^2+z_v^2=1 \\ x_w^2+y_w^2+z_w^2=1 \\ x_u x_v+y_u y_v+z_u z_v=0 \\ x_v x_w+y_v y_w+z_v z_w=0 \\ x_w x_u+y_w y_u+z_w z_u=0 \end{cases} \tag{1} $$
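As a sanity check, the six constraints $(1)$ simply say that $\vec{u}, \vec{v}, \vec{w}$ form an orthonormal frame. A minimal sketch in Python (the function name is my own, not from the answer):

```python
import math

def orthonormality_residuals(u, v, w):
    """Residuals of the six constraints (1) for axis vectors u, v, w.

    All residuals are ~0 when u, v, w form an orthonormal frame."""
    def dot(a, b):
        return sum(x * y for x, y in zip(a, b))
    return [dot(u, u) - 1.0,  # |u|^2 = 1
            dot(v, v) - 1.0,  # |v|^2 = 1
            dot(w, w) - 1.0,  # |w|^2 = 1
            dot(u, v),        # u . v = 0
            dot(v, w),        # v . w = 0
            dot(w, u)]        # w . u = 0

# Example: axes of a CS2 rotated about z by 30 degrees relative to CS1
th = math.radians(30.0)
u = (math.cos(th), math.sin(th), 0.0)
v = (-math.sin(th), math.cos(th), 0.0)
w = (0.0, 0.0, 1.0)
res = orthonormality_residuals(u, v, w)
```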

Now let each point $P_i$ have the following coordinates in the system CS1:

$$P_i = (x_i, y_i, z_i)$$

And in the system CS2:

$$P_i = (u_i, v_i, w_i)$$

Since this is the same point, and using the definitions of $O, \vec{u}, \vec{v}, \vec{w}$ in CS1, we can write for each $i$:

$$\begin{cases} u_i x_u+v_i x_v+w_i x_w+x_0=x_i \\ u_i y_u+v_i y_v+w_i y_w+y_0=y_i \\ u_i z_u+v_i z_v+w_i z_w+z_0=z_i \end{cases} \tag{2}$$

We obtain a system of $3n$ equations for $12$ unknowns. Because of the measurement errors, they are not consistent. Thus, we can choose to minimize the sum of their squares, i.e. make it a least squares problem.
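Before tackling the constrained problem, it can help to solve system $(2)$ by ordinary (unconstrained) least squares: the $3n$ equations decouple into three independent problems with $4$ unknowns each, one per CS1 coordinate, and the result makes a good initial guess. A rough pure-Python sketch (function names are mine); with real measurements one would normally use NumPy/SciPy instead:

```python
import math

def fit_unconstrained(pts_cs2, pts_cs1):
    """Ordinary least squares for system (2), ignoring constraints (1).

    pts_cs2: list of (u_i, v_i, w_i); pts_cs1: list of (x_i, y_i, z_i).
    Returns three rows [x_u, x_v, x_w, x_0], [y_u, ...], [z_u, ...]."""
    def solve_normal(rows, rhs):
        # Solve the 4x4 normal equations A^T A t = A^T b by Gauss-Jordan
        # elimination with partial pivoting.
        m = 4
        ata = [[sum(r[i] * r[j] for r in rows) for j in range(m)]
               for i in range(m)]
        atb = [sum(r[i] * b for r, b in zip(rows, rhs)) for i in range(m)]
        aug = [ata[i] + [atb[i]] for i in range(m)]
        for col in range(m):
            piv = max(range(col, m), key=lambda r: abs(aug[r][col]))
            aug[col], aug[piv] = aug[piv], aug[col]
            for r in range(m):
                if r != col:
                    f = aug[r][col] / aug[col][col]
                    aug[r] = [a - f * c for a, c in zip(aug[r], aug[col])]
        return [aug[i][4] / aug[i][i] for i in range(m)]

    design = [[u, v, w, 1.0] for (u, v, w) in pts_cs2]
    return [solve_normal(design, [p[k] for p in pts_cs1]) for k in range(3)]

# Synthetic check: CS2 rotated about z by 30 degrees, origin at (1, 2, 3)
th = math.radians(30.0)
c, s = math.cos(th), math.sin(th)
R = [[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]]   # columns are u, v, w
O = (1.0, 2.0, 3.0)
pts2 = [(0, 0, 0), (1, 0, 0), (0, 1, 0), (0, 0, 1), (1, 1, 1)]
pts1 = [tuple(sum(R[k][j] * p[j] for j in range(3)) + O[k] for k in range(3))
        for p in pts2]
rows = fit_unconstrained(pts2, pts1)
```

With noise-free data the fit recovers the parameters exactly; with noisy data the recovered $3\times 3$ part will be only approximately orthonormal, which is exactly why the constraints $(1)$ are needed.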

Also, let's not forget about the $6$ hard constraints $(1)$, which should be fulfilled exactly.

This becomes a constrained optimization problem, which can be solved by the method of Lagrange multipliers. The Lagrange function will have the following form:

$$\begin{aligned} L={}&\sum_i (u_i x_u+v_i x_v+w_i x_w+x_0-x_i)^2 \\ &+\sum_i (u_i y_u+v_i y_v+w_i y_w+y_0-y_i)^2 \\ &+\sum_i (u_i z_u+v_i z_v+w_i z_w+z_0-z_i)^2 \\ &-\lambda_1 (x_u^2+y_u^2+z_u^2-1)-\lambda_2 (x_v^2+y_v^2+z_v^2-1) \\ &-\lambda_3 (x_w^2+y_w^2+z_w^2-1)-\lambda_4 (x_u x_v+y_u y_v+z_u z_v) \\ &-\lambda_5 (x_v x_w+y_v y_w+z_v z_w)-\lambda_6 (x_w x_u+y_w y_u+z_w z_u) \end{aligned}$$


Since the gradient of $L$ should be zero, we obtain $18$ equations by taking partial derivatives w.r.t. each unknown, including $6$ Lagrange multipliers $\lambda_k$.

$$ \begin{cases} \frac{\partial L}{\partial x_0}=0 \\ \frac{\partial L}{\partial y_0}=0 \\ \frac{\partial L}{\partial z_0}=0 \\ \frac{\partial L}{\partial x_u}=0 \\ \frac{\partial L}{\partial y_u}=0 \\ \frac{\partial L}{\partial z_u}=0 \\ \color{blue}{\cdots} \\ \frac{\partial L}{\partial \lambda_6}=0 \end{cases} $$

The system is non-linear, but it shouldn't be too hard to find the correct solution.
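Solving the stationarity system symbolically is tedious. A common practical alternative (my substitution, not what the answer literally describes) is to replace the Lagrange multipliers with a quadratic penalty on the constraints $(1)$ and minimize numerically. A rough sketch using finite-difference gradient descent with backtracking, started from a perturbed guess; a real implementation would rather use e.g. `scipy.optimize.minimize` with the SLSQP method and exact equality constraints:

```python
import math, random

def penalty_cost(p, pts2, pts1, mu=10.0):
    """Sum of squared residuals of (2) plus mu * squared violations of (1).

    p holds the 12 unknowns: [x_u, x_v, x_w, x_0, y_u, ..., z_0]."""
    rows = [p[0:4], p[4:8], p[8:12]]
    cost = 0.0
    for (u, v, w), q in zip(pts2, pts1):
        for k in range(3):
            r = rows[k][0]*u + rows[k][1]*v + rows[k][2]*w + rows[k][3] - q[k]
            cost += r * r
    # Columns of the 3x3 part are the axis vectors u, v, w expressed in CS1.
    cols = [[rows[k][j] for k in range(3)] for j in range(3)]
    def dot(a, b):
        return sum(x * y for x, y in zip(a, b))
    for j in range(3):
        cost += mu * (dot(cols[j], cols[j]) - 1.0) ** 2   # unit length
    for a, b in ((0, 1), (1, 2), (2, 0)):
        cost += mu * dot(cols[a], cols[b]) ** 2           # orthogonality
    return cost

def descend(p, f, iters=300, h=1e-6):
    """Gradient descent with forward-difference gradients and backtracking."""
    step, fp = 1e-2, f(p)
    for _ in range(iters):
        grad = []
        for i in range(len(p)):
            q = list(p)
            q[i] += h
            grad.append((f(q) - fp) / h)
        while True:
            q = [pi - step * gi for pi, gi in zip(p, grad)]
            fq = f(q)
            if fq < fp or step < 1e-14:
                break
            step *= 0.5
        if fq < fp:
            p, fp = q, fq
            step *= 1.5   # cautiously re-grow the step after a success
    return p, fp

# Noisy synthetic data: identity rotation, origin (1, 2, 3), 8 cube corners
random.seed(0)
pts2 = [(i % 2, (i // 2) % 2, i // 4) for i in range(8)]
pts1 = [tuple(p[k] + [1.0, 2.0, 3.0][k] + random.gauss(0, 0.01)
              for k in range(3)) for p in pts2]
f = lambda p: penalty_cost(p, pts2, pts1)
p0 = [1, 0, 0, 1.1, 0, 1, 0, 2.1, 0, 0, 1, 3.1]   # perturbed initial guess
p_opt, f_opt = descend(p0, f)
```

The penalty weight `mu`, the step-size schedule, and the initial guess are all assumptions of this sketch; the unconstrained least squares fit above is a natural source for the initial guess.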

I hope this answer could prove useful in some way.


According to the above, after we find all the unknowns, the coordinate transformations will look the following way:

$$(u,v,w) \to (x,y,z)$$

$$\begin{cases} x=x_u u+x_v v+x_w w+x_0 \\ y=y_u u+y_v v+y_w w+y_0 \\ z=z_u u+z_v v+z_w w+z_0 \end{cases} \tag{3}$$

Of course, the $12$ parameters are not independent; they are connected by the $6$ equations $(1)$. But that doesn't matter: since we explicitly find every one of them, we can just use them directly.

The systems $(2)$ and $(3)$ simply state that every vector $\vec{r}$ in CS1 can be expressed as the sum:

$$\vec{r}=\vec{r_0}+u \vec{u}+v \vec{v}+w \vec{w}$$

$$\vec{r}=(x,y,z)$$

$$\vec{r_0}=(x_0,y_0,z_0)$$
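Putting it together, the forward map $(3)$ and its inverse can be sketched as follows. The inverse uses the fact that, thanks to constraints $(1)$, the $3\times 3$ part is orthogonal, so its inverse is its transpose; the parameter layout is my own convention:

```python
import math

def cs2_to_cs1(rows, origin, p):
    """Forward map (3): rows = [[x_u, x_v, x_w], [y_u, ...], [z_u, ...]]."""
    u, v, w = p
    return tuple(rows[k][0]*u + rows[k][1]*v + rows[k][2]*w + origin[k]
                 for k in range(3))

def cs1_to_cs2(rows, origin, p):
    """Inverse map: subtract the origin, then apply the transpose
    (valid because the 3x3 part satisfies constraints (1))."""
    d = [p[k] - origin[k] for k in range(3)]
    return tuple(sum(rows[k][j] * d[k] for k in range(3)) for j in range(3))

# Example: CS2 rotated about z by 30 degrees, origin (1, 2, 3) in CS1
th = math.radians(30.0)
c, s = math.cos(th), math.sin(th)
rows = [[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]]
origin = (1.0, 2.0, 3.0)
q = cs2_to_cs1(rows, origin, (1.0, 0.0, 0.0))
back = cs1_to_cs2(rows, origin, q)
```

Round-tripping a point through both maps returns the original coordinates, which is a cheap check that the fitted parameters do satisfy $(1)$.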