Can adding a squared term to your regression increase your heteroskedasticity and Fit?


Let's say you have some data with a curvilinear relationship between your dependent and independent variables, so you decide to add a squared term to your regression in order to better fit your predictions to the data. I noticed that my heteroskedasticity tests (`hettest` in Stata) were worse after I added the polynomial term. I would have thought that fitting a curved line to curved data would, if anything, reduce heteroskedasticity. Can anyone give me some intuition as to why that may not be the case?

1 Answer

Assume that the data points are generated by $y_i = \beta x_i + \epsilon_i$, where $E[\epsilon_i] = 0$ and $E[\epsilon_i^2] = \sigma^2$; that is, the true relationship is linear with homoscedastic errors. If you fit $$ \hat{y} = \hat{\beta}_0 + \hat{\beta}_1 x + \hat{\beta}_2 x^2, $$ then you are imposing a quadratic model on linear data, and the misspecified curvature ends up in the residuals, whose variance then changes as a function of $x$. Fitting $\hat{y} = \hat{\beta}_0 + \hat{\beta}_1 x$ instead will yield homoscedastic residuals. In short, adding a polynomial term that the data do not support can itself make the residuals look heteroskedastic.
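You can check this kind of claim directly by simulation. The sketch below (assumptions: a linear data-generating process $y = 2x + \epsilon$ with homoscedastic $\epsilon \sim N(0,1)$, and a hand-rolled Breusch–Pagan LM statistic, which is what Stata's `hettest` computes by default) fits both the linear and the quadratic specification and reports the test statistic for each; the chosen coefficients and sample size are illustrative, not from the original post.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 500
x = rng.uniform(0.0, 10.0, n)
# Linear DGP with homoscedastic errors (an assumed example, as in the answer)
y = 2.0 * x + rng.normal(0.0, 1.0, n)

def bp_lm(X, y):
    """Breusch-Pagan LM statistic: fit OLS, then regress the squared
    residuals on the same regressors; the statistic is n * R^2 of that
    auxiliary regression."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    u2 = (y - X @ beta) ** 2
    gamma, *_ = np.linalg.lstsq(X, u2, rcond=None)
    ss_res = np.sum((u2 - X @ gamma) ** 2)
    ss_tot = np.sum((u2 - u2.mean()) ** 2)
    return len(y) * (1.0 - ss_res / ss_tot)

X_lin = np.column_stack([np.ones(n), x])            # y ~ x
X_quad = np.column_stack([np.ones(n), x, x ** 2])   # y ~ x + x^2
print("linear fit    LM:", bp_lm(X_lin, y))
print("quadratic fit LM:", bp_lm(X_quad, y))
# Compare each LM to the chi-squared critical value for the number of
# slope regressors (3.84 for 1 df, 5.99 for 2 df, at the 5% level).
```

Re-running with different seeds, or swapping in a genuinely heteroskedastic error (e.g. `rng.normal(0.0, 1.0, n) * x`), lets you see how each specification's test statistic behaves under each scenario.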