Levenberg-Marquardt for solving transformation matrix


Given a set $V$ of $N$ known vectors $\textrm{V} = \begin{bmatrix} x_1 & & x_N\\ y_1 & \cdots & y_N\\ z_1 & & z_N\\ 1 & & 1\\ \end{bmatrix}$ to which a transformation $M$ is applied, $ \textrm{M} = \begin{bmatrix} \cos{\psi}\cos{\phi} & -\sin{\psi}\cos{\theta} + \cos{\psi}\sin{\phi}\sin{\theta} & \sin{\psi}\sin{\theta} + \cos{\psi}\sin{\phi}\cos{\theta} & t_x\\ \sin{\psi}\cos{\phi} & \cos{\psi}\cos{\theta} + \sin{\psi}\sin{\phi}\sin{\theta} & -\cos{\psi}\sin{\theta} + \sin{\psi}\sin{\phi}\cos{\theta} & t_y\\ -\sin{\phi} & \cos{\phi}\sin{\theta} & \cos{\phi}\cos{\theta} & t_z \end{bmatrix}$, is it possible to solve for the parameters $\beta = \begin{bmatrix} \theta\\ \phi\\ \psi\\ t_x\\ t_y\\ t_z\end{bmatrix}$, given a noisy observation $V^\prime$ of the transformed vectors, by iteratively applying the Levenberg-Marquardt algorithm? What would be the form of the Jacobian $J$? Would the partial derivatives of each row of $M$ be taken independently, since each entry of $V^\prime$ is a linear combination of the entries of $V$ with coefficients taken from $M$?
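For concreteness, the setup above can be sketched in Python with NumPy (the function name `transform`, the sample points, and the parameter values are illustrative, not from the question):

```python
import numpy as np

def transform(beta):
    """3x4 matrix M for beta = [theta, phi, psi, tx, ty, tz],
    using the Euler-angle parametrization written in the question."""
    th, ph, ps, tx, ty, tz = beta
    ct, st = np.cos(th), np.sin(th)
    cp, sp = np.cos(ph), np.sin(ph)
    cs, ss = np.cos(ps), np.sin(ps)
    return np.array([
        [cs * cp, -ss * ct + cs * sp * st,  ss * st + cs * sp * ct, tx],
        [ss * cp,  cs * ct + ss * sp * st, -cs * st + ss * sp * ct, ty],
        [-sp,      cp * st,                 cp * ct,                tz]])

# V holds N points as homogeneous columns [x; y; z; 1]
V = np.array([[0.0, 1.0, 0.0, 0.0],
              [0.0, 0.0, 1.0, 0.0],
              [0.0, 0.0, 0.0, 1.0],
              [1.0, 1.0, 1.0, 1.0]])

beta_true = np.array([0.1, 0.2, 0.3, 1.0, 2.0, 3.0])
V_prime = transform(beta_true) @ V   # noiseless observation, shape 3 x N
```

With zero angles and zero translation, `transform` reduces to $[I \mid 0]$, which is a quick sanity check on the parametrization.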



BEST ANSWER

You have:

$$ \Lambda(\beta) =(V'-M(\beta)V) $$

With $\Lambda$ denoting the "error" matrix. Then $f(\beta)=\text{trace}(\Lambda^T \Lambda)$, the squared Frobenius norm of $\Lambda$ (i.e. the sum of squared residuals), is a scalar you want to minimize.

So you can just use: $$ \beta_{k+1} = \beta_{k}-\frac{\nabla f}{||\nabla f||^2}f(\beta_{k}) $$

In this case: $$ \nabla f = \begin{bmatrix} \frac{\partial f}{\partial\theta}\\ \frac{\partial f}{\partial\phi}\\ \frac{\partial f}{\partial \psi}\\ \frac{\partial f}{\partial t_x}\\ \frac{\partial f}{\partial t_y}\\ \frac{\partial f}{\partial t_z}\end{bmatrix} $$
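A minimal sketch of that update with a numerically estimated gradient (NumPy assumed; the sample points, the finite-difference step `h`, and the backtracking safeguard on the step length are all illustrative additions, not part of the formula above):

```python
import numpy as np

def transform(beta):
    # 3x4 matrix M from beta = [theta, phi, psi, tx, ty, tz] (as in the question)
    th, ph, ps, tx, ty, tz = beta
    ct, st = np.cos(th), np.sin(th)
    cp, sp = np.cos(ph), np.sin(ph)
    cs, ss = np.cos(ps), np.sin(ps)
    return np.array([
        [cs * cp, -ss * ct + cs * sp * st,  ss * st + cs * sp * ct, tx],
        [ss * cp,  cs * ct + ss * sp * st, -cs * st + ss * sp * ct, ty],
        [-sp,      cp * st,                 cp * ct,                tz]])

def f(beta, V, V_prime):
    Lam = V_prime - transform(beta) @ V   # error matrix Lambda
    return np.sum(Lam ** 2)               # equals trace(Lambda^T Lambda)

def grad_f(beta, V, V_prime, h=1e-6):
    # central-difference approximation of the 6-vector gradient of f
    g = np.empty(6)
    for j in range(6):
        d = np.zeros(6)
        d[j] = h
        g[j] = (f(beta + d, V, V_prime) - f(beta - d, V, V_prime)) / (2 * h)
    return g

# synthetic data: 4 non-degenerate points, noiseless observation
V = np.array([[0.0, 1.0, 0.0, 0.0],
              [0.0, 0.0, 1.0, 0.0],
              [0.0, 0.0, 0.0, 1.0],
              [1.0, 1.0, 1.0, 1.0]])
beta_true = np.array([0.1, 0.2, 0.3, 1.0, 2.0, 3.0])
V_prime = transform(beta_true) @ V

beta = np.zeros(6)
for _ in range(500):
    if f(beta, V, V_prime) < 1e-12:
        break
    g = grad_f(beta, V, V_prime)
    step = f(beta, V, V_prime) * g / (g @ g)   # the update from the answer
    t = 1.0                                     # backtracking safeguard (added)
    while f(beta - t * step, V, V_prime) > f(beta, V, V_prime) and t > 1e-8:
        t *= 0.5
    beta = beta - t * step
```

On this noiseless toy problem the iteration drives $f$ toward zero; with noisy observations it would instead settle at the least-squares residual.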

To use Levenberg-Marquardt you would need to propose a different function to minimize, which could be done with similar logic: stack the columns of $\Lambda$ into a residual vector $r(\beta) \in \mathbb{R}^{3N}$; the Jacobian $J$ is then the $3N \times 6$ matrix with entries $J_{ij} = \partial r_i / \partial \beta_j$, which answers the question about its form.
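A minimal Levenberg-Marquardt sketch along those lines, minimizing the stacked residual vector (NumPy assumed; the finite-difference Jacobian, the damping schedule, and all names are illustrative choices, not prescribed by the answer):

```python
import numpy as np

def transform(beta):
    # 3x4 matrix M from beta = [theta, phi, psi, tx, ty, tz] (as in the question)
    th, ph, ps, tx, ty, tz = beta
    ct, st = np.cos(th), np.sin(th)
    cp, sp = np.cos(ph), np.sin(ph)
    cs, ss = np.cos(ps), np.sin(ps)
    return np.array([
        [cs * cp, -ss * ct + cs * sp * st,  ss * st + cs * sp * ct, tx],
        [ss * cp,  cs * ct + ss * sp * st, -cs * st + ss * sp * ct, ty],
        [-sp,      cp * st,                 cp * ct,                tz]])

def residuals(beta, V, V_prime):
    # stack the error matrix Lambda = V' - M(beta) V into a 3N-vector
    return (V_prime - transform(beta) @ V).ravel()

def jacobian(beta, V, V_prime, h=1e-7):
    # forward-difference Jacobian, J[i, j] = d r_i / d beta_j  (3N x 6)
    r0 = residuals(beta, V, V_prime)
    J = np.empty((r0.size, 6))
    for j in range(6):
        d = np.zeros(6)
        d[j] = h
        J[:, j] = (residuals(beta + d, V, V_prime) - r0) / h
    return J

def levenberg_marquardt(V, V_prime, beta0, lam=1e-3, iters=50):
    beta = np.asarray(beta0, dtype=float)
    for _ in range(iters):
        r = residuals(beta, V, V_prime)
        J = jacobian(beta, V, V_prime)
        A = J.T @ J
        # damped normal equations: (J^T J + lam * diag(J^T J)) delta = -J^T r
        delta = np.linalg.solve(A + lam * np.diag(np.diag(A)), -J.T @ r)
        if np.sum(residuals(beta + delta, V, V_prime) ** 2) < r @ r:
            beta, lam = beta + delta, lam / 10   # accept step, relax damping
        else:
            lam *= 10                            # reject step, increase damping
    return beta

# noiseless synthetic data: recover beta_true from 4 non-degenerate points
V = np.array([[0.0, 1.0, 0.0, 0.0],
              [0.0, 0.0, 1.0, 0.0],
              [0.0, 0.0, 0.0, 1.0],
              [1.0, 1.0, 1.0, 1.0]])
beta_true = np.array([0.1, 0.2, 0.3, 1.0, 2.0, 3.0])
beta_hat = levenberg_marquardt(V, transform(beta_true) @ V, np.zeros(6))
```

Note that at least three non-collinear points are needed to make all six parameters identifiable; with $N$ points the $3N$ residuals comfortably overdetermine the 6 unknowns.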