Fitting a function with relative uncertainties


I want to find the parameters of a function that best fit some measurements. Usually, I use least squares. In doing so I am assuming that the best-fit function is the most likely one and that the residuals follow a normal distribution which is the same for every measurement, independent of its value (i.e., all the measurements have the same absolute uncertainty). Is this ok?

What happens then when all the points have the same relative uncertainty?

I have derived that I should still use least squares and minimize the sum of squared (absolute) residuals. Am I right?
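To make the contrast concrete, here is a small sketch (the model $g(x)=a e^{bx}$, the 5% noise level, and all parameter values are my own choices, not from the question). With constant *relative* uncertainty, passing `sigma` proportional to the measured values to `scipy.optimize.curve_fit` weights each residual by $1/y_i$, which is the usual practical recipe, whereas plain least squares weights every point equally:

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(0)

# Hypothetical model: g(x; a, b) = a * exp(b * x)
def g(x, a, b):
    return a * np.exp(b * x)

a_true, b_true = 2.0, 0.5
x = np.linspace(0.0, 4.0, 50)
# Multiplicative noise: constant 5% relative uncertainty
y = g(x, a_true, b_true) * (1.0 + 0.05 * rng.standard_normal(x.size))

# Ordinary least squares: every point gets the same absolute weight
p_ols, _ = curve_fit(g, x, y, p0=(1.0, 1.0))

# Constant relative uncertainty: sigma proportional to y weights each
# residual by 1/y_i, so the largest values no longer dominate the fit
p_rel, _ = curve_fit(g, x, y, p0=(1.0, 1.0), sigma=0.05 * y,
                     absolute_sigma=True)

print(p_ols, p_rel)
```

With noise this mild both fits land near the true parameters, but as the dynamic range of $y$ grows, the unweighted fit is increasingly dominated by the largest measurements.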

BEST ANSWER

If the "relative error" is constant, then the coefficient of variation is constant for all values of your function.

A general way to model this is $Y = \varepsilon\, g(\theta,X)$, where $\varepsilon$ is a random variable with pdf $f_{\varepsilon}(x)$ and CDF $F_{\varepsilon}(x)$ satisfying $E[\varepsilon]=1$ and $\sigma[\varepsilon]=\gamma$.

If we let $e_i(Y) = \frac{Y}{g(\theta,X_i)}$, then the model above implies that $F(e_i(Y)\mid X_i,\theta) = F_{\varepsilon}(e_i(Y))$.

If you want to do maximum likelihood estimation on this model, then the general formulation is:

$L(\theta|(x_i,y_i),f_{\varepsilon})=\prod\limits_{i=1}^n f_{\varepsilon}(e_i(Y_i))$
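This likelihood can be maximized numerically. The sketch below follows the answer's formulation $L(\theta) = \prod_i f_{\varepsilon}(e_i(Y_i))$ with Gaussian $\varepsilon$; the power-law model $g(\theta,x)=\theta_0 x^{\theta_1}$ and all numbers are illustrative assumptions, not from the answer:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(1)

# Hypothetical model for Y = eps * g(theta, X):
# g(theta, x) = theta[0] * x ** theta[1]
def g(theta, x):
    return theta[0] * x ** theta[1]

theta_true, gamma = np.array([3.0, 1.5]), 0.05
x = np.linspace(0.5, 5.0, 80)
y = g(theta_true, x) * rng.normal(1.0, gamma, x.size)

# Negative log-likelihood of the answer's formulation:
# -log L(theta) = -sum_i log f_eps(e_i(Y_i)), e_i = Y_i / g(theta, X_i)
def nll(theta):
    e = y / g(theta, x)
    return -np.sum(norm.logpdf(e, loc=1.0, scale=gamma))

res = minimize(nll, x0=np.array([1.0, 1.0]), method="Nelder-Mead")
print(res.x)
```

Minimizing the negative log-likelihood this way avoids having to solve the score equation below in closed form.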

We can find the MLE by taking the derivative of the log-likelihood with respect to $\theta$ and setting it to zero:

$0=\sum\limits_{i=1}^n -\frac{Y_i \left(\frac{\partial f_{\varepsilon}(z)}{\partial z}|_{z=e_i(Y_i)}\right) \left( \frac{\partial g(\theta,X_i)}{\partial \theta} \right)}{f_{\varepsilon}(e_i(Y_i))g^2(\theta,X_i)}$

If you assume that $\varepsilon \sim \mathcal{N}(1,\gamma^2)$, then the above formula reduces to:

$0 = \frac{1}{\gamma^2}\sum\limits_{i=1}^n \left[ \frac{\frac{\partial g(\theta,X_i)}{\partial \theta}\cdot Y_i \cdot (e_i(Y_i)-1)}{g^2(\theta,X_i)} \right] $
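This Gaussian estimating equation is exactly the stationarity condition of $\sum_i \big(e_i(Y_i)-1\big)^2$, the sum of squared *relative* residuals, so in practice it can be solved by passing $Y_i/g(\theta,X_i)-1$ to an ordinary least-squares routine. A sketch (the exponential model and numbers are my own assumptions):

```python
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(2)

# Hypothetical model, in the same spirit as the answer's g(theta, X)
def g(theta, x):
    return theta[0] * np.exp(theta[1] * x)

theta_true, gamma = np.array([2.0, 0.5]), 0.05
x = np.linspace(0.0, 4.0, 60)
y = g(theta_true, x) * rng.normal(1.0, gamma, x.size)

# Relative residuals e_i(Y_i) - 1 = Y_i / g(theta, X_i) - 1;
# least_squares minimizes the sum of their squares, whose
# gradient-equals-zero condition is the Gaussian score equation above
res = least_squares(lambda th: y / g(th, x) - 1.0,
                    x0=np.array([1.0, 1.0]))
print(res.x)
```

Note how this differs from ordinary least squares: each absolute residual $Y_i - g(\theta,X_i)$ is effectively down-weighted by $g(\theta,X_i)$ before being squared.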

In the Gaussian case this is the first-order condition for minimizing the sum of squared relative residuals $\sum_i \left(\frac{Y_i - g(\theta,X_i)}{g(\theta,X_i)}\right)^2$, i.e., a least-squares fit in which each residual is weighted by $1/g(\theta,X_i)$; whether this further reduces to an ordinary least squares or least absolute deviation solution depends on the form of $g(\theta,X)$.