I'm trying to apply nonlinear refinement to a homography matrix using Levenberg-Marquardt optimization. I'm using what's called the "symmetric transfer error", which is defined here for a set of at least four point correspondences $x_i \leftrightarrow x_i'$.
$$\sum_{i} \left( \|x_i - H^{-1}x_i'\|^2 + \|x_i' - Hx_i\|^2 \right)$$
To my understanding, I'm trying to find the 9 parameters of the 3x3 homography matrix $H$ that minimize the above error. To do this programmatically, I was planning on using scipy.optimize.least_squares, but I'm having trouble formulating this as a least-squares optimization problem, mainly because I seem to have two distinct sets of residuals, $\{x_i - H^{-1}x_i'\}$ and $\{x_i' - Hx_i\}$, one pair for each $i$.
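For concreteness, this is how I'm computing the error I want to minimize (a sketch; the function name is my own, and I'm assuming the points are stored as inhomogeneous 2D coordinates in `(n, 2)` NumPy arrays):

```python
import numpy as np

def symmetric_transfer_error(H, pts, pts_prime):
    """Symmetric transfer error for 2D correspondences pts[i] <-> pts_prime[i]."""
    n = len(pts)
    # Lift to homogeneous coordinates: (n, 3)
    X = np.hstack([pts, np.ones((n, 1))])
    Xp = np.hstack([pts_prime, np.ones((n, 1))])

    # Forward mapping H x_i and backward mapping H^{-1} x_i'
    HX = X @ H.T
    HinvXp = Xp @ np.linalg.inv(H).T

    # De-homogenize (divide by the third coordinate)
    fwd = HX[:, :2] / HX[:, 2:3]
    bwd = HinvXp[:, :2] / HinvXp[:, 2:3]

    # Sum of both squared transfer errors over all correspondences
    return np.sum((pts - bwd) ** 2) + np.sum((pts_prime - fwd) ** 2)
```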
According to the documentation for that function, I need to specify a function $f(x, a, b, \dots)$ that returns a vector of residuals for parameters $x$ and arguments $a, b$, etc., as well as a loss function $\rho$ that takes a vector and returns a scalar. I'm guessing my $\rho$ is just $\|f(x)\|^2$, but please correct me if I'm wrong.
So I guess I'm having trouble representing my loss function with a single $f$ and a single $\rho$.
The two separate residuals per data point are easily dealt with: if there are $n$ data points, return a vector of residuals of length $2n$, with two entries for each point. (Since each of those residuals is itself a 2D vector, in practice you stack its components too, giving $4n$ scalars.) It all works out correctly, because the overall objective is the sum of squares of all the residuals.
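A sketch of this stacking (assuming inhomogeneous 2D points in `(n, 2)` arrays and a flattened 9-parameter $H$; the function names are illustrative):

```python
import numpy as np
from scipy.optimize import least_squares

def residuals(h, pts, pts_prime):
    """Flat vector of 4n scalar residuals for the symmetric transfer error.

    h is the flattened 3x3 homography (9 parameters); each of the n
    correspondences contributes two 2D residual vectors, i.e. 4 scalars.
    """
    H = h.reshape(3, 3)
    n = len(pts)
    X = np.hstack([pts, np.ones((n, 1))])         # homogeneous x_i
    Xp = np.hstack([pts_prime, np.ones((n, 1))])  # homogeneous x_i'

    HX = X @ H.T                                  # H x_i
    HinvXp = Xp @ np.linalg.inv(H).T              # H^{-1} x_i'

    fwd = HX[:, :2] / HX[:, 2:3]                  # de-homogenize
    bwd = HinvXp[:, :2] / HinvXp[:, 2:3]

    # Stack both residual sets into one flat vector of length 4n
    return np.concatenate([(pts - bwd).ravel(), (pts_prime - fwd).ravel()])
```

Then refinement is a single call, starting from an initial estimate `H0` (e.g. from a DLT solve): `result = least_squares(residuals, H0.ravel(), args=(pts, pts_prime))` and `H_refined = result.x.reshape(3, 3)`.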
As for the loss function $\rho$, the default is `'linear'`, i.e. $\rho(z) = z$, which produces the standard least-squares solution. There is no need to change the default unless you want to use one of the optional robust loss functions instead of plain least squares, in which case follow the instructions in the documentation.
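To illustrate the effect of the `loss` parameter on a deliberately simple problem (a toy line fit with one outlier, not the homography case):

```python
import numpy as np
from scipy.optimize import least_squares

# Data on the line y = 2x + 1, with one gross outlier at the end.
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = 2.0 * x + 1.0
y[4] += 20.0  # outlier

def f(p):
    # Residual vector for slope p[0] and intercept p[1]
    return p[0] * x + p[1] - y

plain = least_squares(f, [1.0, 0.0])                            # loss='linear' (default)
robust = least_squares(f, [1.0, 0.0], loss='huber', f_scale=1.0)
# The Huber loss down-weights the outlier, so the robust slope
# stays much closer to the true value 2.0 than the plain fit does.
```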