I want to solve a complicated optimization problem that involves differentiating through the Kabsch algorithm. So I need to obtain the Jacobian-vector product and the vector-Jacobian product of the Kabsch algorithm with respect to the input vertices.
It seems pretty straightforward according to Wikipedia. The algorithm takes two matrices $P, Q$, which contain 3D vertices stacked row-wise: $P$ holds the source points and $Q$ the target points. It outputs a rotation $R$ that brings the vertices of $P$ onto $Q$ in the least-squares sense. I omit the translation optimization and the reflection-correction step because I don't need them. The rotation $R$ can be found as:
$$ H = P^T Q\\ U, S, V^T = \mathrm{SVD}(H)\\ R = V U^T $$
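For concreteness, here is a minimal NumPy sketch of the forward pass described above (my own illustration; it assumes `np.linalg.svd`'s convention of returning $U$, $S$, $V^T$, and omits translation and reflection correction as in the question):

```python
import numpy as np

def kabsch_rotation(P, Q):
    """Least-squares rotation R with R @ p_i ~ q_i.
    P, Q: (n, 3) arrays of points stacked row-wise.
    Translation and reflection correction are omitted."""
    H = P.T @ Q                    # 3x3 covariance matrix
    U, S, Vt = np.linalg.svd(H)    # H = U @ diag(S) @ Vt
    return Vt.T @ U.T              # R = V U^T

# Sanity check: recover a known rotation from exactly rotated points.
rng = np.random.default_rng(0)
P = rng.normal(size=(10, 3))
Rtrue, _ = np.linalg.qr(rng.normal(size=(3, 3)))
if np.linalg.det(Rtrue) < 0:      # make it a proper rotation
    Rtrue[:, 0] = -Rtrue[:, 0]
Q = P @ Rtrue.T                   # q_i = Rtrue @ p_i, row-wise
R = kabsch_rotation(P, Q)
assert np.allclose(R, Rtrue, atol=1e-6)
```

With noise-free data and a proper rotation, the recovered $R$ matches the ground truth exactly (up to floating-point error).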
According to *Differentiating the Singular Value Decomposition* by James Townsend, it is possible to find the derivative of the SVD, but $dU$ and $dV$ in expressions (16) and (18) require computing the matrix $F$ from expression (11):
$$F_{ij} = \frac{1}{s_j^2 - s_i^2}$$ for $i \neq j$, where the $s_i$ are the singular values, as far as I understand. Obviously, if our covariance matrix $H$ has equal singular values, we get a division by zero inside $F$.
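To make the failure concrete, here is a small NumPy sketch (my own, not from the paper) of this $F$ matrix; its off-diagonal entries blow up as two singular values approach each other and become infinite when they coincide:

```python
import numpy as np

def F_matrix(s):
    """F_ij = 1 / (s_j^2 - s_i^2) for i != j, 0 on the diagonal."""
    s2 = s ** 2
    denom = s2[None, :] - s2[:, None]      # s_j^2 - s_i^2
    F = np.zeros_like(denom)
    off = ~np.eye(len(s), dtype=bool)
    F[off] = 1.0 / denom[off]              # divides by zero if s_i == s_j
    return F

print(F_matrix(np.array([3.0, 2.0, 1.0])))        # well-separated: finite
print(F_matrix(np.array([2.0, 2.0 + 1e-9, 1.0]))) # nearly equal: huge entries
```

For exactly equal singular values the division yields `inf`, and any downstream multiplication by zero turns it into `NaN`.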
In practice, $H$ will have equal singular values fairly often, for example when $P = Q$. So my question is: how can the Kabsch algorithm be differentiated in such cases? Perhaps it is possible to differentiate some proxy function that is well-behaved in this situation?
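As a minimal instance of the degenerate case (my own example): take $P = Q$ with points on the unit axes, so the covariance matrix is the identity and all singular values coincide:

```python
import numpy as np

# Degenerate case: P = Q with a symmetric point configuration.
P = np.eye(3)          # three points on the unit axes
Q = P.copy()
H = P.T @ Q            # = identity matrix
s = np.linalg.svd(H, compute_uv=False)
print(s)               # all singular values equal, so every
                       # off-diagonal F_ij divides by zero
```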
I'm not very good at English and I lack math skills, so I would greatly appreciate it if somebody pointed out my mistakes.
I recently stumbled across this problem as well. This paper seems to address the problem of identical singular values (Section 2.3.1). However, I did not implement it, so I can't promise that it will work. Kind regards