How bad is polynomial regression with OLS?


Consider the following polynomial regression: $$ y_{i} = \beta_{0} + \beta_{1}x_{i} + \beta_{2}x_{i}^{2} + \dots + \beta_{m}x_{i}^{m} + \varepsilon_{i}. $$ The OLS solution is given by $$ \hat{\boldsymbol{\beta}} = (\mathbf{X}^{T}\mathbf{X})^{-1}\mathbf{X}^{T}\mathbf{y}, $$ and the covariance matrix of the estimator is $$ \Sigma = \sigma^{2} (\mathbf{X}^{T}\mathbf{X})^{-1}. $$ Therefore, we have to invert $\mathbf{X}^{T}\mathbf{X}$, where $\mathbf{X}$ is a Vandermonde matrix.
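As a minimal numerical sketch of why this inverse is delicate (my own illustration; the sample points on $[0,1]$ and the chosen degrees are arbitrary), one can watch the condition number of $\mathbf{X}^{T}\mathbf{X}$ blow up as the degree $m$ grows:

```python
import numpy as np

# Illustration (not from the question): equispaced points on [0, 1].
x = np.linspace(0, 1, 50)

for m in (2, 5, 10):
    # Vandermonde design matrix with columns 1, x, x^2, ..., x^m.
    X = np.vander(x, m + 1, increasing=True)
    # Condition number of X^T X; large values mean (X^T X)^{-1}
    # amplifies rounding error and noise in the OLS estimate.
    print(m, np.linalg.cond(X.T @ X))
```

On this example the condition number grows by many orders of magnitude between $m = 2$ and $m = 10$, which is the usual reason OLS on the raw monomial basis is considered numerically fragile.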

I cannot see how this affects the estimator of $\beta$ and the estimator of the covariance matrix $\Sigma$. How big of a problem is it?