Unbiasedness of OLS estimators


I'm trying to understand why the OLS estimators of the following model are unbiased:

$\mathbb{E}(y_i \mid x_i)=\beta_0+\beta_1(ax_i+b)$ and $\epsilon_i=y_i-\mathbb{E}(y_i \mid x_i)$ for $i=1,\ldots,n$, where the regressor $x$ is a random variable with mean $\mu_x$ and variance $\sigma^2_x$. Assume the random error is statistically independent of $x$ and has conditional mean zero and conditional variance $\sigma^2$. The terms $a$ and $b$ are real numbers and $a \ne 0$.

Definitions: $S_{xy}$ is the sample covariance of $x$ and $y$ and $S_{xx}$ is the sample variance of $x$.
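As a sanity check on these definitions, the ratio $S_{xy}/S_{xx}$ can be compared numerically against the slope returned by a standard least-squares routine. A minimal sketch (the data-generating values here are arbitrary, chosen only for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(size=30)
y = 1.0 + 2.0 * x + rng.normal(size=30)  # arbitrary intercept/slope for illustration

S_xy = np.cov(x, y, ddof=1)[0, 1]  # sample covariance of x and y
S_xx = np.var(x, ddof=1)           # sample variance of x
slope = S_xy / S_xx

# The ratio S_xy / S_xx coincides with the slope from a degree-1 least-squares fit:
np_slope = np.polyfit(x, y, 1)[0]
print(abs(slope - np_slope) < 1e-10)  # True
```

The choice of `ddof` cancels in the ratio, so any consistent convention for the sample covariance and variance gives the same slope.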

My attempt

Because the regressor $x$ is a random variable with mean $\mu_x$, the sample mean $\overline{x}$ is also a random variable, and it is an unbiased estimator of $\mu_x$. Under the transformation $x_i \to ax_i+b$, the sample mean transforms as $\overline{x} \to a\overline{x}+b$. In a simple linear model the OLS estimator for $\beta_1$ is $\hat{\beta}_1=\frac{S_{xy}}{S_{xx}}$. Substituting the transformed variable $ax_i+b$ for $x_i$ (and $a\overline{x}+b$ for $\overline{x}$) gives $S_{(ax+b)\,y}=aS_{xy}$ and $S_{(ax+b)(ax+b)}=a^2S_{xx}$, so the estimator becomes $\hat{\beta}_1=\frac{aS_{xy}}{a^2S_{xx}}=\frac{1}{a}\cdot\frac{S_{xy}}{S_{xx}}$. But then $\mathbb{E}(\hat{\beta}_1)=\mathbb{E}\!\left(\frac{S_{xy}}{aS_{xx}}\right)=\frac{1}{a}\beta_1$?
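The two slope estimators in this attempt — the regression of $y$ on the transformed regressor $z_i=ax_i+b$, and the raw ratio $S_{xy}/S_{xx}$ — can be checked by simulation. A Monte Carlo sketch (assuming normal $x$ and errors, with arbitrary illustrative values for $\beta_0$, $\beta_1$, $a$, $b$, $\mu_x$, $\sigma_x$, $\sigma$):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical parameter values, chosen only for illustration.
beta0, beta1, a, b = 1.0, 2.0, 3.0, 4.0
mu_x, sigma_x, sigma = 0.5, 1.0, 1.0
n, reps = 50, 20000

slopes_z = np.empty(reps)  # slope of y on z = a*x + b
slopes_x = np.empty(reps)  # slope of y on the untransformed x
for r in range(reps):
    x = rng.normal(mu_x, sigma_x, n)                  # random regressor
    z = a * x + b                                     # transformed regressor
    y = beta0 + beta1 * z + rng.normal(0, sigma, n)   # model for y
    S_xy = np.cov(x, y, ddof=1)[0, 1]
    S_xx = np.var(x, ddof=1)
    slopes_x[r] = S_xy / S_xx
    # By the attempt's algebra, S_zy / S_zz = (1/a) * S_xy / S_xx:
    slopes_z[r] = (np.cov(z, y, ddof=1)[0, 1]) / np.var(z, ddof=1)

print(slopes_z.mean())  # close to beta1
print(slopes_x.mean())  # close to a * beta1
```

Numerically, the mean of $S_{xy}/S_{xx}$ comes out near $a\beta_1$, not $\beta_1$, which is the quantity that the factor $1/a$ in the substituted estimator is being applied to.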