Statistical inference and t-stats?


I have a linear regression model with an intercept (b0) and 3 variables (b1, b2, b3). Then they drop b2 and b3 and give a new regression line with a new b0 and b1, and consequently new standard errors for each of them. Then they ask me whether b2 and b3 were jointly significant in the original equation at the 5% level, and also whether including them in the original equation greatly affects the relationship between y and the b1 variable. I think I have to run some t-tests as part of a hypothesis test, but I'm not sure how; any help would be very appreciated. If you need the numbers for the model and coefficients, let me know, although I'd prefer you just tell me how and I'll do it myself. :P Thanks a lot


You have nested models, so you can perform a likelihood ratio test or an F-test of the hypothesis that the coefficients of b2 and b3 are both zero. Statistical packages can easily do this for you. To compute the F-statistic by hand, use F = [(Residual Sum of Squares for Restricted Model - Residual Sum of Squares for Full Model) / number of parametric restrictions] / (Residual Sum of Squares for Full Model / residual degrees of freedom in full model). Under the null hypothesis, this statistic has an F-distribution with first parameter equal to the number of parametric restrictions and second parameter equal to the residual degrees of freedom in the full model.

For example,

. regress price mpg gear_ratio foreign

      Source |       SS       df       MS              Number of obs =      74
-------------+------------------------------           F(  3,    70) =   12.45
       Model |   220938241     3  73646080.4           Prob > F      =  0.0000
    Residual |   414127155    70  5916102.21           R-squared     =  0.3479
-------------+------------------------------           Adj R-squared =  0.3200
       Total |   635065396    73  8699525.97           Root MSE      =  2432.3

------------------------------------------------------------------------------
       price |      Coef.   Std. Err.      t    P>|t|     [95% Conf. Interval]
-------------+----------------------------------------------------------------
         mpg |   -208.786   62.65371    -3.33   0.001    -333.7449   -83.82716
  gear_ratio |  -2706.916   1032.335    -2.62   0.011    -4765.842   -647.9888
     foreign |   3241.702   876.8861     3.70   0.000     1492.808    4990.596
       _cons |   17809.07   2511.765     7.09   0.000     12799.51    22818.63
------------------------------------------------------------------------------

. regress  price mpg

      Source |       SS       df       MS              Number of obs =      74
-------------+------------------------------           F(  1,    72) =   20.26
       Model |   139449474     1   139449474           Prob > F      =  0.0000
    Residual |   495615923    72  6883554.48           R-squared     =  0.2196
-------------+------------------------------           Adj R-squared =  0.2087
       Total |   635065396    73  8699525.97           Root MSE      =  2623.7

------------------------------------------------------------------------------
       price |      Coef.   Std. Err.      t    P>|t|     [95% Conf. Interval]
-------------+----------------------------------------------------------------
         mpg |  -238.8943   53.07669    -4.50   0.000    -344.7008   -133.0879
       _cons |   11253.06   1170.813     9.61   0.000     8919.088    13587.03
------------------------------------------------------------------------------

F-statistic = [(495615923 - 414127155)/2] / (414127155/70) ≈ 6.89. Check for significance using the F(2,70) distribution; since 6.89 exceeds the 5% critical value (about 3.13), the two dropped coefficients are jointly significant at the 5% level.
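A minimal Python sketch of this calculation, using the RSS values and degrees of freedom from the Stata output above (the 5% critical value of roughly 3.13 for F(2,70) is taken from standard tables, not computed here):

```python
# Residual sums of squares from the two Stata regressions above
rss_restricted = 495615923.0  # price on mpg only
rss_full = 414127155.0        # price on mpg, gear_ratio, foreign

q = 2         # number of parametric restrictions (gear_ratio and foreign dropped)
df_full = 70  # residual degrees of freedom in the full model

# F = [(RSS_restricted - RSS_full) / q] / (RSS_full / df_full)
f_stat = ((rss_restricted - rss_full) / q) / (rss_full / df_full)
print(round(f_stat, 2))  # -> 6.89

# Roughly 3.13 is the tabulated 5% critical value of F(2, 70),
# so the two coefficients are jointly significant at 5%.
print(f_stat > 3.13)  # -> True
```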

One way of checking whether the relation with b1 is affected by b2 and b3 is to run the full regression and then test the hypothesis that the coefficient of b1 equals the value that coefficient took in the restricted model.
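For illustration, a rough version of that comparison using the mpg coefficients from the two Stata outputs above. This forms a t-type statistic treating the restricted-model coefficient as a fixed hypothesized value, which is a simplification (its own sampling error is ignored):

```python
# Coefficient and standard error of mpg in the full model
b_full = -208.786
se_full = 62.65371

# Coefficient of mpg in the restricted model, treated here as a
# fixed hypothesized value (a simplification)
b_restricted = -238.8943

t_stat = (b_full - b_restricted) / se_full
print(round(t_stat, 2))  # -> 0.48
```

A statistic of about 0.48 is far below conventional critical values (around 2), so there is no strong evidence that dropping gear_ratio and foreign materially changes the mpg coefficient.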