There was a time when I could figure this out for myself, but my math skills are rustier than I thought, so I have to humbly beg for help. Thank you in advance.
I am solving a weighted nonlinear least-squares problem of the usual form:
$$ \mathbf{\theta}^* = \arg \min_\mathbf{\theta} \sum_i \left[ \frac{y_i-\hat{y_i}\left(\mathbf{\theta}\right)}{w_i} \right]^2 $$
I have programmed an algorithm that can solve this very well*. However, in my particular problem, it is easier** to deal with $y_i^2$ and $\hat{y_i}^2\left(\mathbf{\theta}\right)$.
So my question is: can I modify the weights $w_i \rightarrow w_i^\prime$ so that the following modified formulation gives the same result as before:
$$ \mathbf{\theta}^* = \arg \min_\mathbf{\theta} \sum_i \left[ \frac{y_i^2-\hat{y_i}^2\left(\mathbf{\theta}\right)}{w_i^\prime} \right]^2 $$
*It is not really relevant to the question, but I am using Levenberg-Marquardt.
**The reason is that my $y$'s are geometrical distances that come from a Euclidean norm. I'm programming a microcontroller, where a square root is a computationally expensive function.
It's not going to work, I think: $$ \mathbf{\theta}^* = \arg \min_\mathbf{\theta} \sum_i \left[ \frac{y_i^2-\hat{y_i}^2\left(\mathbf{\theta}\right)}{w_i^\prime} \right]^2 = \arg \min_\mathbf{\theta} \sum_i \left[ y_i+\hat{y_i}\left(\mathbf{\theta}\right)\right]^2\left[ \frac{y_i-\hat{y_i}\left(\mathbf{\theta}\right)}{w_i^\prime} \right]^2 \\= \arg \min_\mathbf{\theta} \sum_i \left[\frac{y_i-\hat{y_i}\left(\mathbf{\theta}\right)}{w_i} \right]^2 $$ Where: $$ w_i = \frac{w_i^\prime}{y_i+\hat{y_i}\left(\mathbf{\theta}\right)} $$ So the two formulations only agree if the weights $w_i$ are allowed to depend on $y_i$ and, worse, on $\hat{y_i}\left(\mathbf{\theta}\right)$, which changes at every iteration.