Is there an unbiased estimator of the reciprocal of the slope in linear regression?


I have a situation that is handled well by a simple linear regression model. That is, I have data points with known $x$ values, $y$ values subject to a known amount of error, and an ideal fit of the form $y = \alpha + \beta x$. It's easy to get unbiased (but correlated) estimators for $\alpha$ and $\beta$ through linear regression, but I have a case where it would be useful to have an unbiased estimator of $\beta^{-1}$, and I haven't been able to figure out whether this is possible.

Some things I've tried which don't work:

  1. Using the inverse of the estimator, $\hat{\beta}^{-1}$. A Taylor expansion shows this is biased: to lowest order, $E[\hat{\beta}^{-1}] \approx \beta^{-1} + \sigma^2(\hat{\beta})/\beta^3$, so e.g. if $\beta > 0$, it is biased high by an amount proportional to $\sigma^2(\hat{\beta})$.
  2. Performing the inverse linear regression, i.e. fitting $x = -\alpha/\beta + y/\beta$ to estimate $-\alpha/\beta$ and $\beta^{-1}$. The problem is that this violates the assumption underlying the linear regression model that the error is on the dependent variable only. Instead of getting an unbiased estimator of $\beta^{-1}$, you end up getting an unbiased estimator of $\beta\sigma^2(x)/\sigma^2(y)$.
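Both failure modes are easy to see in a quick Monte Carlo sketch. The parameters below ($\alpha = 1$, $\beta = 2$, Gaussian noise on $y$ only) are illustrative choices of mine, not from the question; the simulation compares the mean of $\hat{\beta}^{-1}$ against the true $\beta^{-1}$, and also shows the attenuated slope from the inverse regression:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative setup (my choice, not from the question):
# true line y = 1 + 2x, Gaussian noise on y only, fixed design x.
alpha, beta, sigma = 1.0, 2.0, 0.5
n, n_trials = 20, 100_000
x = np.linspace(0.0, 1.0, n)
xc = x - x.mean()
sxx = np.dot(xc, xc)

# Simulate many data sets; OLS slope per trial is sum((x - xbar) * y) / Sxx.
y = alpha + beta * x + sigma * rng.standard_normal((n_trials, n))
beta_hat = y @ xc / sxx

# Attempt 1: invert the usual slope estimate.
print("mean of 1/beta_hat:", np.mean(1.0 / beta_hat))   # biased high
print("true 1/beta:       ", 1.0 / beta)
print("lowest-order bias sigma^2(beta_hat)/beta^3:",
      sigma**2 / sxx / beta**3)

# Attempt 2: inverse regression of x on y, slope Sxy / Syy.
yc = y - y.mean(axis=1, keepdims=True)
d_hat = (yc @ xc) / (yc**2).sum(axis=1)
print("mean inverse-regression slope:", np.mean(d_hat))  # well below 1/beta
```

With these parameters the average of $\hat{\beta}^{-1}$ sits visibly above $1/\beta = 0.5$ by roughly the Taylor-predicted amount, while the inverse-regression slope is attenuated well below $0.5$.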

So, is there any known way to do this?