Linear Regression without X?


(Have been working in matrix algebra)

Given the model:

$$ y_i = \alpha + \epsilon_i, $$

that is, $y$ and the error term each carry the subscript $i$, where for each observation

$$ E(\epsilon_i) = 0, \qquad \operatorname{Var}(\epsilon_i) = \sigma^2 x_i^2. $$

Is it possible to have a regression model without $X$ in it, even though the variance of the error term involves $x$?


If $y_i=\alpha + \epsilon_i$ and $\operatorname{Var}(\epsilon_i) = \sigma^2 x_i^2$, then in order to estimate $\alpha$ efficiently (i.e., to restore homoskedastic noise terms), you should use GLS, which reduces to WLS in this case. Namely, define $y_i^* = y_i/x_i$ for $x_i \neq 0$. Then your model becomes $$ y_i^* = \frac{y_i}{x_i} = \frac{\alpha}{x_i} + \epsilon_i^*, $$ where $\epsilon_i^* = \epsilon_i/x_i$. Now $\operatorname{Var}(\epsilon_i^*) = \operatorname{Var}(\epsilon_i/x_i) = \sigma^2\frac{x_i^2}{x_i^2} = \sigma^2$, so you can apply OLS to the transformed model and obtain the BLUE of $\alpha$. Note, however, that the transformed model now does involve $g(x_i) = 1/x_i$ as a regressor.
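To make the transformation concrete, here is a minimal simulation sketch of the WLS step above. The particular values ($\alpha = 2$, $\sigma = 0.5$, $x_i$ drawn uniformly on $[1, 5]$) are illustrative assumptions, not part of the question:

```python
import numpy as np

rng = np.random.default_rng(0)
alpha, sigma = 2.0, 0.5
n = 10_000

x = rng.uniform(1.0, 5.0, size=n)      # x_i != 0, so we may divide by it
eps = rng.normal(0.0, sigma * x)       # Var(eps_i) = sigma^2 * x_i^2
y = alpha + eps                        # model: y_i = alpha + eps_i (no X term)

# Transformed model: y_i/x_i = alpha * (1/x_i) + eps_i/x_i,
# whose error eps_i/x_i is homoskedastic with variance sigma^2.
y_star = y / x
z = 1.0 / x                            # the induced regressor g(x_i) = 1/x_i

# OLS on the transformed model (regression through the origin on z)
alpha_hat = (z @ y_star) / (z @ z)

print(f"WLS estimate of alpha: {alpha_hat:.4f}")
```

The naive estimator `y.mean()` is still unbiased for $\alpha$, but the WLS estimator is the BLUE under this variance structure, which is exactly where the $x_i$'s re-enter the estimation.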

I don't see why models like the original one would be impossible; however, in many cases, fitting GLS means the $x_i$'s will ultimately be taken into consideration in the estimation procedure.