Using log data in parameter estimation


Say I have data $x_1, x_2, \dots, x_n$ and I am fitting a model to these data, for example $Y(t)=Y_0 e^{-\lambda t}$, in order to estimate the parameter $\lambda$.
In these estimation procedures I don't quite understand the role of using the logarithm of the data for parameter estimation.

Is there an advantage in using log data when the data vary over a large range, for example from $10^7$ to $1$?

  1. When using the data without transformation, I call the function as

         model = @(x,t) data(1)*exp(-x(1).*t);   % minus sign so x(1) estimates lambda

         [lambda,resnorm] = lsqcurvefit(model,initial,t,data);

  2. When using log data, I call the function as

         model = @(x,t) log10(data(1))*exp(-x(1).*t);

         [lambda,resnorm] = lsqcurvefit(model,initial,t,log10(data));
    

So, the model itself does not change and only the logarithm of the data is taken.

Is there a relationship between the $\lambda$ value that I obtain from the above 2 methods?

Also, which one is actually the correct $\lambda$ value?

When using the log data, should the entire model also be transformed, so that I am fitting $\log(Y)=\log(Y_0)-\lambda t$, or can I simply take the log of the data while the model remains unchanged?

Best answer

The model $$Y=Y_0 e^{-\lambda t}$$ is nonlinear in its parameters but, taking logarithms, $$z=\log(Y)=\log(Y_0)-\lambda t=\alpha+\beta t$$ is linear. A first, linear regression based on the transformed equation gives $(\alpha,\beta)$, from which you have the estimates $Y_0=e^{\alpha}$ and $\lambda=-\beta$. Use these as starting values for the nonlinear regression on the true model.
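The two-step procedure can be sketched in Python with NumPy/SciPy (the true parameter values and noise level below are assumptions chosen for illustration):

```python
import numpy as np
from scipy.optimize import curve_fit

# Synthetic data from Y = Y0*exp(-lambda*t); Y0, lambda, and the
# noise level are assumed values for this illustration.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 5.0, 50)
Y0_true, lam_true = 1.0e7, 2.0
y = Y0_true * np.exp(-lam_true * t) * (1 + 0.01 * rng.standard_normal(t.size))

# Step 1: linear regression on z = log(Y) = log(Y0) - lambda*t
beta, alpha = np.polyfit(t, np.log(y), 1)      # slope, intercept
Y0_init, lam_init = np.exp(alpha), -beta       # starting values

# Step 2: nonlinear least squares on the original model,
# started from the step-1 estimates
def model(t, Y0, lam):
    return Y0 * np.exp(-lam * t)

(Y0_hat, lam_hat), _ = curve_fit(model, t, y, p0=[Y0_init, lam_init])
```

The linear fit is cheap and gives a reliable starting point, so the nonlinear solver is unlikely to diverge or land in a poor local minimum.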

Do not skip the second step since what is measured is $Y$ and not $\log(Y)$.
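To see why the second step matters, here is a hedged sketch with an assumed setup where the noise is additive on $Y$ itself: taking logs then inflates the noisy tail of the data, biasing the log-fit slope, while the nonlinear refit on the measured $Y$ recovers $\lambda$ much more closely (all constants below are illustrative assumptions):

```python
import numpy as np
from scipy.optimize import curve_fit

# Assumed setup: additive measurement noise on Y, not on log(Y).
rng = np.random.default_rng(1)
t = np.linspace(0.0, 5.0, 200)
lam_true = 1.5
y = 100.0 * np.exp(-lam_true * t) + 0.5 * rng.standard_normal(t.size)
y = np.clip(y, 1e-3, None)          # keep values positive so the log exists

# Log-fit alone: the noise-dominated tail distorts the slope
beta, alpha = np.polyfit(t, np.log(y), 1)
lam_log = -beta

# Nonlinear refit on the measured Y, started from the log-fit values
def model(t, Y0, lam):
    return Y0 * np.exp(-lam * t)

(_, lam_nl), _ = curve_fit(model, t, y, p0=[np.exp(alpha), lam_log])
```

Here `lam_nl` ends up closer to the true decay rate than `lam_log`, because least squares on $Y$ weights the well-measured large values, whereas least squares on $\log(Y)$ gives the noise-dominated small values equal weight.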