Assume a model $y = \beta_0 + \beta_1x + u$. Given a sample $(x_i, y_i)_{i=1}^n$, we can compute the OLS estimate of $\beta_1$, denoted $\hat{\beta}_1$. Now suppose we also posit the reverse model $x = \gamma_0 + \gamma_1y + \varepsilon$ and compute the OLS estimate of $\gamma_1$, denoted $\hat{\gamma}_1$. My question is whether $\frac{1}{\hat{\gamma}_1}$ is an unbiased estimator of $\beta_1$, assuming both models satisfy the Gauss–Markov assumptions, i.e.
1) $\mathbb{E}(u|x) = 0,\ \mathbb{E}(\varepsilon|y) = 0$
2) $(y_i, x_{i})_{i=1}^n$ are i.i.d.
3) $\mathbb{E}(y^4) < \infty,\ \mathbb{E}(x^4) < \infty$
4) $u, \varepsilon$ are homoskedastic
5) $u \sim \mathscr{N}(0, \sigma_u^2), \varepsilon \sim \mathscr{N}(0, \sigma_{\varepsilon}^2)$
What have I done:
$\hat{\gamma}_1 = \frac{\sum(y_i - \bar{y})(x_i - \bar{x})}{\sum (y_i - \bar{y})^2} = \frac{s^2_{xy}}{s^2_{yy}}$ (where $s^2_{xy}$ denotes the sample covariance of $x$ and $y$)
$\frac{1}{\hat{\gamma}_1} = \frac{s^2_{yy}}{s^2_{xy}} = \frac{\sum(y_i - \bar{y})(\beta_1x_i + u_i - \beta_1\bar{x} - \bar{u})}{s^2_{xy}} = \frac{\beta_1s^2_{xy} + s^2_{yu}}{s^2_{xy}} = \beta_1 + \frac{s^2_{yu}}{s^2_{xy}}$
And here I got stuck. I have no idea how to calculate the expectation of the second term (or how to prove that its expectation is zero or non-zero). Could you please give me any hints?
Thanks a lot in advance for any help!
This is a biased estimator. If $\beta_1 = g(\gamma) = 1/\gamma$ and $\mathbb{E}[\hat{\gamma}] = \gamma = 1/\beta_1$, then $\beta_1 = 1/\mathbb{E}[\hat{\gamma}]$. Because $1/x$ is convex for $x > 0$ and concave for $x < 0$, the estimator $1/\hat{\gamma}$ will be biased upward and downward, respectively. This follows from Jensen's inequality: for $\hat{\gamma} > 0$ you have $$ \mathbb{E}\,g(\hat{\gamma}) \ge g\left( \mathbb{E}\hat{\gamma} \right), $$
and the reversed inequality for $\hat{\gamma} < 0$.
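A quick Monte Carlo sketch makes the bias visible. The parameter values below ($\beta_1 = 2$, $\sigma_x = \sigma_u = 1$, $n = 100$) are arbitrary choices for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
beta0, beta1 = 1.0, 2.0        # true coefficients (illustrative choice)
sigma_x, sigma_u = 1.0, 1.0    # sd of x and of the error u
n, reps = 100, 5000

inv_gamma1_hat = np.empty(reps)
for r in range(reps):
    x = rng.normal(0.0, sigma_x, n)
    u = rng.normal(0.0, sigma_u, n)
    y = beta0 + beta1 * x + u
    # OLS slope of the reverse regression x = gamma0 + gamma1*y + eps
    gamma1_hat = np.cov(x, y)[0, 1] / np.var(y, ddof=1)
    inv_gamma1_hat[r] = 1.0 / gamma1_hat

# The sample mean of 1/gamma1_hat sits well above beta1 = 2:
# the population value is 1/gamma1 = beta1 + sigma_u^2/(beta1*sigma_x^2) = 2.5.
print(np.mean(inv_gamma1_hat))
```

Note that here the bias does not vanish as $n$ grows: $\gamma_1 = \beta_1\sigma_x^2/(\beta_1^2\sigma_x^2 + \sigma_u^2)$, so $1/\hat{\gamma}_1$ converges to $\beta_1 + \sigma_u^2/(\beta_1\sigma_x^2)$, not to $\beta_1$.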