I am faced with the following problem: given the simple linear regression model $Y_{i} = \beta_{0} + \beta_{1}x_{i} + \epsilon_{i}$ for $i = 1,\dots,n$, with $\epsilon_{i} \sim N(0, w_{i}\sigma^{2})$, where the $w_{i}$ are known weights and the $(x_{i},Y_{i})$ are independent observations, derive the maximum likelihood estimators of $\beta_{0}, \beta_{1}, \sigma^{2}$. I was able to solve the problem in the homoscedastic case, but in the heteroscedastic case above I arrive at the solutions $$\tilde{\sigma^{2}} = \frac{1}{n}\sum_{i=1}^{n} \frac{\epsilon_{i}^{2}}{w_{i}},$$ $$\tilde{\beta_{0}} = \sum_{i=1}^{n}\frac{Y_{i}}{w_{i}} - \tilde{\beta_{1}} \sum_{i=1}^{n}\frac{x_{i}}{w_{i}},$$ $$ \tilde{\beta_{1}} = \frac{\sum_{i=1}^{n} \frac{Y_{i}x_{i} - \bar{Y}x_{i}}{w_{i}}}{\sum_{i=1}^{n}\frac{x_{i}^{2} - \bar{x}x_{i}}{w_{i}}}.$$ The answers for $\tilde{\beta_{0}}$ and $\tilde{\beta_{1}}$ seem especially messy. Is there any way to tidy the terms up?
Derivation of maximum likelihood estimators for the heteroskedastic case of simple linear regression
Asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail)
Note that $$ \frac{y_i}{\sqrt {w_i}} = \frac{\beta_0}{\sqrt {w_i}} + \beta_1\frac{x_i}{\sqrt {w_i}} + \frac{\epsilon_i}{\sqrt {w_i}} $$ is now homoskedastic.
OLS and MLE now coincide, so you can solve the rescaled model and read off the estimators. The MLE of $\sigma^2$ is that of the reduced model, namely $$\hat{\sigma}^2 = \frac{1}{n}\sum_{i=1}^{n} \frac{( y_i -\hat{\beta}_0 - \hat{\beta}_1 x_i)^2}{w_i} \,,$$ as stated in the other answer by Vancak.
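A quick numerical sketch of the rescaling trick (Python with NumPy; the data, seed, and sample size are made up for illustration): ordinary least squares on the variables divided by $\sqrt{w_i}$ reproduces the weighted least squares solution of the original model, and the MLE of $\sigma^2$ is the weighted residual sum of squares divided by $n$.

```python
import numpy as np

# Synthetic data (hypothetical, for illustration only)
rng = np.random.default_rng(0)
n = 200
x = rng.uniform(0.0, 10.0, n)
w = rng.uniform(0.5, 2.0, n)                     # known variance weights
y = 1.0 + 2.0 * x + np.sqrt(w) * 0.5 * rng.standard_normal(n)

# Divide everything by sqrt(w_i): the transformed errors are homoskedastic,
# so ordinary least squares on the rescaled variables is the MLE.
s = 1.0 / np.sqrt(w)
Xs = np.column_stack([s, s * x])                 # columns: 1/sqrt(w_i), x_i/sqrt(w_i)
b_hat, *_ = np.linalg.lstsq(Xs, s * y, rcond=None)

# The same answer falls out of the weighted normal equations on the
# original (untransformed) model.
X = np.column_stack([np.ones(n), x])
W = np.diag(1.0 / w)
b_wls = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)

# MLE of sigma^2 from the reduced model (divide by n, not n - 2)
resid = y - b_hat[0] - b_hat[1] * x
sigma2_hat = np.sum(resid**2 / w) / n
print(b_hat, b_wls, sigma2_hat)
```

The two solution vectors agree up to floating-point error, which is the whole point of the transformation: rescaling turns the weighted problem into an ordinary one without changing the optimum.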
Not sure about the tidying, but your estimators are wrong. They should be $$ \hat{\beta}_0 = \left( \sum\frac{y_i}{w_i} - \hat{\beta}_1\sum \frac{x_i}{w_i} \right)/\left( \sum \frac{1}{w_i} \right), $$ and $$ \hat{\sigma}^2 = \sum \frac{( y_i -\hat{\beta}_0 - \hat{\beta}_1 x_i)^2}{ n w_i} \, . $$
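As a sanity check on the corrected intercept (a sketch with made-up data; the weighted-mean form of the slope is an assumption consistent with the weighted normal equations, not something stated above), the closed-form estimators can be compared against the direct weighted least squares solution:

```python
import numpy as np

# Hypothetical synthetic data
rng = np.random.default_rng(1)
n = 100
x = rng.uniform(0.0, 5.0, n)
w = rng.uniform(0.5, 3.0, n)
y = 1.0 + 2.0 * x + np.sqrt(w) * rng.standard_normal(n)

# Weighted means with weights 1/w_i
sw = np.sum(1.0 / w)
xb = np.sum(x / w) / sw
yb = np.sum(y / w) / sw

# Closed-form weighted slope and the corrected intercept from the answer
b1 = np.sum((x - xb) * (y - yb) / w) / np.sum((x - xb) ** 2 / w)
b0 = (np.sum(y / w) - b1 * np.sum(x / w)) / sw

# Weighted least squares via the normal equations, for comparison
X = np.column_stack([np.ones(n), x])
W = np.diag(1.0 / w)
b = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)
print(b0, b1, b)
```

The closed forms match the normal-equation solution exactly (up to rounding), confirming that the intercept must carry the $\sum 1/w_i$ normalization.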