Least squares fitting of vectors


I want to find:

$\operatorname{argmin}_{\lambda} \sum_{i\in I}\left \| \vec{P_{i}} - \lambda\vec{Q_{i}} \right \|^2$

where $P_{i} = (x^{'}, y^{'}, z^{'})$ and $Q_{i} = (x^{''}, y^{''}, z^{''})$ are 3-dimensional displacement vectors.

I get the general concept of least-squares fitting, and I can solve $Af=B$ in the least-squares sense via the normal equations $A^{T}A\hat{f}=A^{T}B$, but I have only done that in two dimensions.

How would it work if $\vec{P}$ and $\vec{Q}$ are three-dimensional vectors?


On BEST ANSWER

The method of least squares works in any dimension (assuming $A^TA$ is invertible):
$$
Ax = b\\
A^TA\hat{x} = A^Tb\\
\hat{x} = (A^TA)^{-1}A^Tb
$$
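As a minimal numpy sketch of the normal-equation solve above (the matrix sizes and random data here are stand-ins, not from the question):

```python
import numpy as np

# Least squares via the normal equations A^T A x_hat = A^T b.
# A is m x n with m >= n; the method is dimension-agnostic.
rng = np.random.default_rng(0)
A = rng.standard_normal((6, 3))
b = rng.standard_normal(6)

# Assumes A^T A is invertible (A has full column rank).
x_hat = np.linalg.solve(A.T @ A, A.T @ b)

# Cross-check against numpy's built-in least-squares solver.
x_ref, *_ = np.linalg.lstsq(A, b, rcond=None)
assert np.allclose(x_hat, x_ref)
```

In practice `np.linalg.lstsq` (or a QR factorization) is preferred over forming $A^TA$ explicitly, since the normal equations square the condition number of $A$.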

For your problem you might do the following:
$$
Q_x = \begin{bmatrix}Q_{1x} & Q_{2x} & \dots &Q_{nx}\end{bmatrix}_{1\times n}\\
Q_y = \begin{bmatrix}Q_{1y} & Q_{2y} & \dots &Q_{ny}\end{bmatrix}_{1\times n}\\
Q_z = \begin{bmatrix}Q_{1z} & Q_{2z} & \dots &Q_{nz}\end{bmatrix}_{1\times n}\\
P_x = \begin{bmatrix}P_{1x} & P_{2x} & \dots &P_{nx}\end{bmatrix}_{1\times n}\\
P_y = \begin{bmatrix}P_{1y} & P_{2y} & \dots &P_{ny}\end{bmatrix}_{1\times n}\\
P_z = \begin{bmatrix}P_{1z} & P_{2z} & \dots &P_{nz}\end{bmatrix}_{1\times n}\\
q = \begin{bmatrix}Q_x & Q_y & Q_z\end{bmatrix}^T\ _{3n\times1} \\
p = \begin{bmatrix}P_x & P_y & P_z\end{bmatrix}^T\ _{3n\times1}
$$
With these stacked vectors the problem becomes one-dimensional. We want:
$$
q\lambda = p
$$
An exact solution generally doesn't exist, but least squares gives the $\lambda$ that minimizes the squared error:
$$
\hat{\lambda} = (q^Tq)^{-1}q^Tp = \frac{q\cdot p}{q\cdot q}
$$
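A numpy sketch of this stacking trick, using hypothetical sample data where the true scale factor is 2.5 (the arrays `P` and `Q` hold the displacement vectors as rows):

```python
import numpy as np

# Hypothetical data: n = 5 displacement pairs with P roughly 2.5 * Q.
rng = np.random.default_rng(1)
Q = rng.standard_normal((5, 3))
P = 2.5 * Q + 0.01 * rng.standard_normal((5, 3))

# Stack each set into a single 3n-vector, as in the answer above.
q = Q.ravel()
p = P.ravel()

# Scalar least-squares solution: lambda = (q . p) / (q . q).
lam = q @ p / (q @ q)
```

Note that stacking by coordinate blocks or by interleaving rows gives the same $\hat{\lambda}$, since the dot products $q\cdot p$ and $q\cdot q$ are invariant under any common reordering of the entries.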