Hypothesis test concerning linear model specification.


Suppose we are given a set of data in which each observation can be expressed as $(x,y)$ for some $x,y\in\mathbb{R}$, and suppose an OLS linear model is fitted to this data set. Under the assumption that the true relationship is indeed linear, we can run hypothesis tests on the parameters $\hat{\beta}_i$ for all $i$, and we can also run a hypothesis test on the expected value of the dependent variable for a given value of the independent variable.

My question is: is there any way to do a hypothesis test of whether the data set fits a linear model at all? More specifically, can we do a hypothesis test of whether the data set fits the linear model $\mathbb{E}[Y]=\beta_0+\beta_1X$ or not?

Accepted answer

Use Ramsey's (1969) RESET test:

1. Run the linear regression and save the fitted values $\hat{y}$.
2. Regress $y$ on $x$ together with $\hat{y}^2$ and $\hat{y}^3$.
3. Use an F-test to test the joint significance of $\hat{y}^2$ and $\hat{y}^3$.
4. If they are jointly significant (i.e. the null hypothesis is rejected), the model is misspecified and the data do not fit the linear model.
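A minimal sketch of these steps in Python, using only NumPy and SciPy (the `reset_test` helper and the simulated data are illustrative, not part of the original answer):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 200
x = rng.uniform(0, 5, n)
y = 1.0 + 2.0 * x + rng.normal(0, 1, n)  # simulated data that truly is linear

def reset_test(x, y):
    """Ramsey RESET test using squared and cubed fitted values."""
    n = len(y)
    # Step 1: fit the linear model y = b0 + b1*x and save the fitted values.
    X1 = np.column_stack([np.ones(n), x])
    b1 = np.linalg.lstsq(X1, y, rcond=None)[0]
    yhat = X1 @ b1
    rss_r = np.sum((y - yhat) ** 2)       # restricted (linear) residual sum of squares
    # Step 2: regress y on x plus yhat^2 and yhat^3.
    X2 = np.column_stack([X1, yhat ** 2, yhat ** 3])
    b2 = np.linalg.lstsq(X2, y, rcond=None)[0]
    rss_u = np.sum((y - X2 @ b2) ** 2)    # unrestricted residual sum of squares
    # Step 3: F-test for joint significance of the two added terms.
    q = 2                                 # number of restrictions tested
    df = n - X2.shape[1]                  # residual degrees of freedom
    F = ((rss_r - rss_u) / q) / (rss_u / df)
    p = stats.f.sf(F, q, df)
    return F, p

F, p = reset_test(x, y)
print(f"F = {F:.3f}, p-value = {p:.3f}")
```

For truly linear data the p-value should usually be large (fail to reject), while data generated from, say, $y = \beta_0 + \beta_1 x^2 + \varepsilon$ would typically produce a small p-value. If you already use statsmodels, it ships a ready-made version of this test as `statsmodels.stats.diagnostic.linear_reset`.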