Regression when the variance of the residuals depends on the independent variable


When the residuals are normally distributed with constant variance, the maximum-likelihood fit to the data is found by least squares. In that case:

$y_i = f(x_i) + r_i, \quad r_i\sim\mathcal{N}(0, \sigma^2)$

What happens when $r_i\sim\mathcal{N}(0, \sigma(x_i)^2)$?


Then you have heteroscedasticity. The least-squares coefficient estimates remain unbiased, but the usual estimates of their standard errors are biased, so you can no longer rely on the t-distribution and the F-distribution to test the parameters.
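When $\sigma(x)$ is known (at least up to a constant factor), a standard remedy is weighted least squares with weights $1/\sigma(x_i)^2$, which recovers the maximum-likelihood estimate. A minimal sketch with simulated data (the linear model, the choice $\sigma(x) = 0.5 + x$, and all numbers below are illustrative assumptions, not from the question):

```python
import numpy as np

# Assumed example: y = 1 + 2*x with residual standard deviation
# sigma(x) = 0.5 + x, i.e. the noise grows with the regressor.
rng = np.random.default_rng(0)
n = 500
x = rng.uniform(0.0, 5.0, n)
sigma = 0.5 + x                       # sigma depends on x
y = 1.0 + 2.0 * x + rng.normal(0.0, sigma)

X = np.column_stack([np.ones(n), x])  # design matrix [1, x]

# Ordinary least squares: coefficients are still unbiased, but the
# usual standard-error formulas are wrong under heteroscedasticity.
beta_ols, *_ = np.linalg.lstsq(X, y, rcond=None)

# Weighted least squares: weight each observation by 1/sigma(x)^2,
# solving the normal equations (X' W X) beta = X' W y.
w = 1.0 / sigma**2
Xw = X * w[:, None]
beta_wls = np.linalg.solve(X.T @ Xw, Xw.T @ y)

print(beta_ols)   # both estimates should be near [1, 2]
print(beta_wls)   # WLS typically has the smaller variance
```

If $\sigma(x)$ is unknown, heteroscedasticity-robust (White) standard errors are the usual alternative: they leave the OLS coefficients unchanged but correct the standard errors so that the tests become valid again.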