Hypothesis testing of linear regression model with variable and slope dummy variable


A researcher wants to explore how people perceive distances in comparison to the actual distance, and how wearing contact lenses affects their performance. The following model was fitted, where z = 1 for subjects with contact lenses and z = 0 without:

ŷ = (1.05 + 0.21z) · x

The standard errors for the estimated coefficients are 0.357 and 0.032, respectively.

Using this model, test the hypothesis that subjects who wear contact lenses would overestimate the distance between the objects.

I know I have to test the null hypothesis slope = 1 against the alternative slope > 1 (overestimation), but I don't know how to set this up with both a variable and a slope dummy variable, or how to take into account the standard errors of both coefficients. Thanks in advance for any help!
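
For concreteness, here is a minimal sketch of the one-sided test. It assumes the two coefficient estimates are independent — the covariance between them is not reported in the question, so the combined standard error below is only an approximation:

```python
import math

# Fitted coefficients and their standard errors (from the question)
b0, se0 = 1.05, 0.357   # slope for z = 0
b1, se1 = 0.21, 0.032   # additional slope with contact lenses (z = 1)

# Combined slope for contact-lens wearers
slope = b0 + b1

# Approximate SE of the sum, assuming Cov(b0, b1) = 0 (not reported)
se = math.sqrt(se0**2 + se1**2)

# One-sided test statistic for H0: slope = 1 vs Ha: slope > 1 (overestimation)
t = (slope - 1) / se
print(f"slope = {slope:.2f}, SE ≈ {se:.3f}, t ≈ {t:.2f}")
```

The statistic would then be compared against a t critical value with the appropriate degrees of freedom (the sample size is not given here).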

1 Answer

For subjects wearing contact lenses (z = 1), the fitted slope is 1.05 + 0.21 = 1.26, so the estimated distance is ŷ = 1.26x and the real distance is y = 1.26x + e = αx + e, where e is an error term. The reported standard error of the baseline coefficient is 0.357.

Under the assumptions of the classical linear regression model, you can use Fisher's F statistic to construct a confidence region for the coefficient α of the regression. The null hypothesis of overestimation corresponds to α ∈ [1, +∞), and the pivotal quantity is

$\frac{(\alpha-\hat{\alpha})^\top X^\top X(\alpha-\hat{\alpha})}{p\hat{\sigma}^2} \sim F(p,\,N-p),$

where p is the number of regression parameters, N the number of observations, and $\hat{\sigma}^2$ the estimated error variance.

So, in this case, check whether

$\frac{(\alpha-\hat{\alpha})^\top X^\top X(\alpha-\hat{\alpha})}{p\hat{\sigma}^2} < F_{1-\gamma}(p,\,N-p),$

where $F_{1-\gamma}(p, N-p)$ is the $(1-\gamma)$-quantile of the F distribution at your chosen confidence level $1-\gamma$.
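
To make the quadratic form concrete, here is a sketch of it in the scalar case (p = 1), where it reduces to $(\alpha-\hat\alpha)^2 \sum_i x_i^2 / \sigma^2$. The design values and error variance below are hypothetical, since neither is given in the question:

```python
# Scalar (p = 1) illustration of the quadratic form
# (alpha - alpha_hat)^T X^T X (alpha - alpha_hat) / (p * sigma^2).
# The x-values and sigma2 below are made up for illustration only.
xs = [2.0, 3.0, 5.0, 7.0]      # hypothetical design column X
alpha_hat = 1.26               # estimated with-lens slope
sigma2 = 0.5                   # hypothetical error variance
p = 1

xtx = sum(x * x for x in xs)   # X^T X for a single regressor

def f_stat(alpha):
    # Value to compare against an F(p, N - p) quantile
    return (alpha - alpha_hat) ** 2 * xtx / (p * sigma2)

print(f_stat(1.0))             # statistic under the null value alpha = 1
```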

This gives the confidence region for "with-lens distance overestimation" (let its mass be ${p_1}$). Then find the corresponding region for "with-lens distance underestimation" (let it be ${p_2}$). The weight in favor of your null hypothesis is then the fraction $\frac{p_1}{p_1+p_2}$.