My question is really simple: given two paired sets of points $\{x_i\}$ and $\{y_i\}$ in an N-dimensional space, $\{(x_1,y_1), (x_2,y_2), ..., (x_n,y_n)\} \in {\rm I\!R}^N \times {\rm I\!R}^N$, one can find the best translation vector and rotation matrix that transform $\{x_i\}$ into $\{y_i\}$ using the Kabsch algorithm. To my understanding, this algorithm is purely geometrical and does not assume any distribution for the random variables $X$ and $Y$ generating $\{x_i\}$ and $\{y_i\}$ respectively.
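For context, here is a minimal NumPy sketch of the (geometrical) Kabsch algorithm as I understand it: center both point sets, take the SVD of their cross-covariance, and correct for a possible reflection. The function name `kabsch` and the row-per-point layout are just my conventions, not from any particular library.

```python
import numpy as np

def kabsch(X, Y):
    """Least-squares rotation R and translation t such that y_i ~ R x_i + t.

    X, Y: (n, N) arrays of paired points (one point per row).
    """
    # Center both point sets on their centroids
    xc, yc = X.mean(axis=0), Y.mean(axis=0)
    Xc, Yc = X - xc, Y - yc
    # Cross-covariance matrix and its SVD
    H = Xc.T @ Yc
    U, S, Vt = np.linalg.svd(H)
    # Flip the last singular direction if needed so det(R) = +1
    # (a proper rotation, not a reflection)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    D = np.diag([1.0] * (X.shape[1] - 1) + [d])
    R = Vt.T @ D @ U.T
    t = yc - R @ xc
    return R, t
```

On noiseless data this recovers the exact rotation and translation; with noise it gives the least-squares optimum, with no distributional assumption on the points.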
Now, suppose we add the condition that $\{x_i\}$ and $\{y_i\}$ both follow a multivariate Gaussian distribution. Is there a version of the Kabsch algorithm that takes this hypothesis into account? Or is there another technique to find the best translation / rotation between data points generated by multivariate Gaussian random variables?
Any pointer would be a big help. Thanks!
I think I found the answer: there is a paper called "Empirical Bayes hierarchical models for regularizing maximum likelihood estimation in the matrix Gaussian Procrustes problem" which reformulates the Kabsch algorithm (i.e., the solution to the Procrustes problem) as a maximum-likelihood estimator under the assumption that the data are generated by a Gaussian distribution.