Nonlinear regression standard error for two parameters?


I've been tasked with finding the standard error for two parameters in an equation, but I know almost nothing about statistics. If this question has already been answered somewhere, please point me in that direction. Otherwise, please help!

The equation is

$$\frac 1 V = \frac{1}{g\cdot A} + \frac 1k.$$

My experiment gives values for $V$ and $A$, and I used nonlinear regression in Excel to estimate $g$ and $k$. Now I need the standard errors for $g$ and $k$, but I have no idea how to get them: I need to report $g \pm \textrm{something}$, and I have $g$ but not its error. My lab uses black-box software that spits out values, but it does not use an equation of the form above. Instructions for doing this in Excel would be appreciated, if possible.

Thanks in advance!

There is 1 answer below.


This is really a linear regression $y = mx + c$ with $y := 1/V$, $x := 1/A$, $m := 1/g$, and $c := 1/k$. Standard formulae, or any regression software, will give you the standard errors $\sigma_m, \sigma_c$ of the fitted $m$ and $c$. The last step is first-order error propagation: for a differentiable function $f$ of a random variable $z$, $\sigma_{f(z)} \approx |\partial_z f|\,\sigma_z$. Since $g = 1/m$ and $|\partial_m (1/m)| = 1/m^2$, this gives $\sigma_g = \frac{\sigma_m}{m^2} = g^2\sigma_m$, and similarly $\sigma_k = \frac{\sigma_c}{c^2} = k^2\sigma_c$.
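For concreteness, here is a minimal sketch of the whole recipe in Python with NumPy. The arrays `V` and `A` are hypothetical placeholder data, not values from the question, so substitute your own measurements:

```python
import numpy as np

# Hypothetical measurements -- replace with your own data.
V = np.array([0.50, 0.91, 1.25, 1.52, 1.74])
A = np.array([1.0, 2.0, 3.0, 4.0, 5.0])

# Linearize: 1/V = (1/g)*(1/A) + 1/k, i.e. y = m*x + c.
y = 1.0 / V
x = 1.0 / A
n = len(x)

# Ordinary least-squares fit of y = m*x + c.
xbar, ybar = x.mean(), y.mean()
Sxx = np.sum((x - xbar) ** 2)
m = np.sum((x - xbar) * (y - ybar)) / Sxx
c = ybar - m * xbar

# Residual variance with n - 2 degrees of freedom.
resid = y - (m * x + c)
s2 = np.sum(resid ** 2) / (n - 2)

# Standard errors of the slope and the intercept.
se_m = np.sqrt(s2 / Sxx)
se_c = np.sqrt(s2 * (1.0 / n + xbar ** 2 / Sxx))

# Back-transform: g = 1/m, k = 1/c, with first-order error propagation.
g = 1.0 / m
k = 1.0 / c
se_g = se_m / m ** 2  # equals g**2 * se_m
se_k = se_c / c ** 2  # equals k**2 * se_c

print(f"g = {g:.4f} +/- {se_g:.4f}")
print(f"k = {k:.4f} +/- {se_k:.4f}")
```

Since you asked about Excel: the same steps work there, because `=LINEST(y_range, x_range, TRUE, TRUE)` returns the slope and intercept in its first row and their standard errors in its second row; apply the same $g = 1/m$, $\sigma_g = \sigma_m/m^2$ conversion to those cells.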