Showing that an increase in uncertainty is significant


I have a linear model $y = ax+b$ and I estimate the coefficients $a$ and $b$ by ordinary least squares.

I have since found that all of my values of $x$ were systematically overestimated, and also that they have known uncertainties that were not previously propagated.

Using total least squares (Deming regression), I find new estimates $\tilde a$ and $\tilde b$. These estimates have larger uncertainties (about $15\%$ greater), but their distributions overlap with those of the previous estimates of $a$ and $b$.
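For context, Deming regression with a known ratio $\delta$ of the $y$-error variance to the $x$-error variance has a closed-form solution. A minimal sketch (the function name and the choice $\delta = 1$ in the usage note are illustrative, not from the original post):

```python
import numpy as np

def deming(x, y, delta=1.0):
    """Deming regression slope and intercept, where
    delta = var(y errors) / var(x errors) is assumed known."""
    x = np.asarray(x, float)
    y = np.asarray(y, float)
    sxx = np.var(x)                                   # (1/n) * sum((x - xbar)^2)
    syy = np.var(y)
    sxy = np.mean((x - x.mean()) * (y - y.mean()))    # sample covariance (1/n)
    # Closed-form Deming slope for known error-variance ratio delta.
    a = (syy - delta * sxx
         + np.sqrt((syy - delta * sxx) ** 2 + 4 * delta * sxy ** 2)) / (2 * sxy)
    b = y.mean() - a * x.mean()
    return a, b
```

With $\delta = 1$ this reduces to orthogonal regression; for noise-free data exactly on a line, it recovers the true slope and intercept for any $\delta$.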

Since the distributions of the estimates overlap, a referee has said that the change is not significant, and has asked me if I can show that the increase in uncertainty is significant. How can I determine this?

Also, due to regression dilution, the estimated central value of $\tilde a$ is larger than that of $a$. Is there a one-sided test I can use to show that the central values differ, even though their distributions overlap?

Any advice appreciated. Thanks!


On BEST ANSWER

I think you can use the $F$ test for equality of variances, i.e. test $H_0: \sigma^2_{\tilde a} / \sigma^2_{a} \le 1$. Compare the estimated ratio $\hat\sigma^2_{\tilde a} / \hat\sigma^2_{a}$, where for the OLS fit $$ \hat{\sigma}^2_{a} = \frac{MSE}{\sum_i (x_i - \bar x)^2}, $$ against the $1-\alpha$ quantile of $F_{(n_1 - 2,\, n_0 - 2)}$, where $n_0$ and $n_1$ are the sample sizes of the two fits. Rejecting $H_0$ shows that the increase in variance is significant.