Derivation of statistical test for equality of two regression slopes


I have observations $(y_{1i}, x_{1i})$, $i = 1,2,\dots,n_1$, and $(y_{2j}, x_{2j})$, $j = 1,2,\dots,n_2$. The covariates $x_{1i}$ and $x_{2j}$ are fixed; the $y_{1i}$ and $y_{2j}$ are realizations of independent, normally distributed r.v.'s $Y_{1i}$ and $Y_{2j}$. Simple linear regressions are appropriate models:

$y_{1i} = \alpha_0 + \alpha_1 x_{1i} + \epsilon_{1i}$

$y_{2j} = \beta_0 + \beta_1 x_{2j} + \epsilon_{2j}$

We also assume $Var(\epsilon_{1i}) = Var(\epsilon_{2j}) = \sigma^2$.

I would like to test $H_0: \alpha_1 - \beta_1 = 0$.

I found, for example here, that $ \frac{\hat{\alpha}_1-\hat{\beta}_1}{\sqrt{s_{b_1}^2+s_{b_2}^2}} \sim t(n_1+n_2-4), $ where $s_{b_1}$ and $s_{b_2}$ are the estimated standard errors of the two slopes.
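For concreteness (this code is not from the original post), the quoted statistic can be computed directly from the two samples. The helper names below are my own, and the standard errors are the usual OLS slope standard errors from each regression fitted separately:

```python
import numpy as np
from scipy import stats

def slope_and_se(x, y):
    """OLS slope and its estimated standard error for simple linear regression."""
    n = len(x)
    xc = x - x.mean()
    b1 = (xc @ (y - y.mean())) / (xc @ xc)   # slope estimate
    b0 = y.mean() - b1 * x.mean()            # intercept estimate
    resid = y - (b0 + b1 * x)
    s2 = (resid @ resid) / (n - 2)           # residual variance, df = n - 2
    se = np.sqrt(s2 / (xc @ xc))             # standard error of the slope
    return b1, se

def slope_equality_test(x1, y1, x2, y2):
    """t statistic and two-sided p-value for H0: alpha_1 = beta_1,
    referred to a t distribution with n1 + n2 - 4 degrees of freedom."""
    a1, se1 = slope_and_se(np.asarray(x1), np.asarray(y1))
    b1, se2 = slope_and_se(np.asarray(x2), np.asarray(y2))
    df = len(x1) + len(x2) - 4
    t = (a1 - b1) / np.sqrt(se1**2 + se2**2)
    p = 2 * stats.t.sf(abs(t), df)
    return t, p
```

Calling `slope_equality_test(x1, y1, x2, y2)` returns the statistic and a two-sided p-value under the quoted reference distribution.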

I don't get why, since for a t-distribution we need something like

$T = \frac{V}{\sqrt{Z/m}}$ with $V \sim \mathcal{N}(0,1)$ and $Z \sim \chi^2_m$ independent of $V$.

But $\hat{\alpha}_1-\hat{\beta}_1 \sim \mathcal{N}(0,1)$ is not given. Does the test make sense nonetheless?

Best answer:

Under $H_0$, $\alpha_1 = \beta_1$, and since the errors are normally distributed you have
$$ \frac{ \hat{\alpha}_1 - \hat{\beta}_1 }{\sigma_{\hat{\alpha}_1 - \hat{\beta}_1 }}\sim \mathcal{N}(0,1), \qquad \sigma_{\hat{\alpha}_1 - \hat{\beta}_1 } = \sigma\sqrt{ \frac{1}{S_{x_1x_1}} + \frac{1}{S_{x_2x_2}} }, $$ where $S_{x_kx_k} = \sum_i (x_{ki} - \bar{x}_k)^2$. This is the numerator $V$. For the denominator, the residual variance estimates $s_1^2$ and $s_2^2$ of the two regressions satisfy $$ \frac{(n_1-2)s_1^2}{\sigma^2} \sim \chi^2_{n_1-2}, \qquad \frac{(n_2-2)s_2^2}{\sigma^2} \sim \chi^2_{n_2-2}. $$ Recall that for independent r.v.'s $\chi^2_{n_1-2} + \chi^2_{n_2-2} \sim \chi ^2 _{n_1 + n_2 - 4}$, so the pooled estimate $s_p^2 = \frac{(n_1-2)s_1^2 + (n_2-2)s_2^2}{n_1+n_2-4}$ satisfies $(n_1+n_2-4)\,s_p^2/\sigma^2 \sim \chi^2_{n_1+n_2-4}$, independently of the numerator. Try to work out the algebra to show that $$ \frac{ \hat{\alpha}_1 - \hat{\beta}_1 }{\sigma_{\hat{\alpha}_1 - \hat{\beta}_1 }} \Big/ \sqrt{\frac{ \chi^2_{n_1 + n_2 - 4} } { n_1 + n_2 - 4} } \sim t_{n_1 + n_2 - 4}. $$
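Not part of the original answer, but the claimed null distribution is easy to check by simulation. The sketch below (my own code; function and variable names are mine) forms the pooled-variance statistic and draws it many times under $H_0$, where it should be indistinguishable from $t_{n_1+n_2-4}$:

```python
import numpy as np
from scipy import stats

def pooled_slope_t(x1, y1, x2, y2):
    """t statistic for H0: equal slopes, using a pooled residual variance.

    Returns the statistic and its degrees of freedom, n1 + n2 - 4.
    """
    def fit(x, y):
        xc = x - x.mean()
        sxx = xc @ xc
        b = (xc @ (y - y.mean())) / sxx              # OLS slope
        rss = ((y - y.mean() - b * xc) ** 2).sum()   # residual sum of squares
        return b, rss, sxx
    b1, rss1, sxx1 = fit(x1, y1)
    b2, rss2, sxx2 = fit(x2, y2)
    df = len(x1) + len(x2) - 4
    s2 = (rss1 + rss2) / df      # pooled variance: sigma^2 * chi2_df / df
    return (b1 - b2) / np.sqrt(s2 * (1.0 / sxx1 + 1.0 / sxx2)), df

# Monte Carlo under H0: both groups share slope 2 and error sd 0.5.
rng = np.random.default_rng(0)
x1 = np.linspace(0.0, 1.0, 15)
x2 = np.linspace(0.0, 2.0, 12)
draws = []
for _ in range(2000):
    y1 = 1.0 + 2.0 * x1 + rng.normal(0.0, 0.5, x1.size)
    y2 = -0.5 + 2.0 * x2 + rng.normal(0.0, 0.5, x2.size)
    t, df = pooled_slope_t(x1, y1, x2, y2)
    draws.append(t)
# draws should look like samples from t(15 + 12 - 4) = t(23)
```

Comparing the empirical distribution of `draws` with `stats.t(df)` (e.g. via a Q-Q plot or a Kolmogorov-Smirnov test) confirms the degrees of freedom.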