Logistic regression with interactions


I'm supposed to build a model using logistic regression. I have $N$ observed data points, each consisting of $m$ explanatory variables $x_i = (x_{1,i}, \dots, x_{m,i})$ and an associated binary outcome $y_i$, a realization of a random variable $Y_i$ that follows a Bernoulli (alternative) distribution with parameter $p_i$. The model assumes that $$\ln\left(\frac{p_i}{1-p_i}\right) = \beta_0 + \beta_1x_{1,i} + \beta_2x_{2,i} + \dots + \beta_mx_{m,i} = (*).$$ I believe the coefficients $\beta_j$ are estimated by maximum likelihood, submodels are tested with the likelihood ratio test, and goodness of fit can be assessed with the Hosmer–Lemeshow test.
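As a sketch of how the maximum-likelihood fit works, here is a minimal Newton–Raphson (IRLS) implementation in NumPy. The data are synthetic and the function name `fit_logistic` is my own; this is an illustration, not a production fitter (it has no convergence check or safeguard against separation):

```python
import numpy as np

def fit_logistic(X, y, n_iter=25):
    """Fit logistic regression by Newton-Raphson (maximum likelihood).
    X: (N, m+1) design matrix including an intercept column; y: (N,) in {0, 1}.
    Returns the coefficient vector beta and the maximized log-likelihood."""
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-X @ beta))   # p_i = P(Y_i = 1 | x_i)
        w = p * (1.0 - p)                     # IRLS weights
        grad = X.T @ (y - p)                  # score vector
        hess = X.T @ (X * w[:, None])         # observed information matrix
        beta = beta + np.linalg.solve(hess, grad)
    p = 1.0 / (1.0 + np.exp(-X @ beta))
    loglik = np.sum(y * np.log(p) + (1 - y) * np.log(1 - p))
    return beta, loglik

# Synthetic example: true coefficients (-0.5, 1.0, 0.8)
rng = np.random.default_rng(0)
N = 500
x1, x2 = rng.normal(size=N), rng.normal(size=N)
eta = -0.5 + 1.0 * x1 + 0.8 * x2
y = (rng.random(N) < 1 / (1 + np.exp(-eta))).astype(float)
X = np.column_stack([np.ones(N), x1, x2])
beta_hat, ll = fit_logistic(X, y)
```

With a few hundred observations, `beta_hat` should land close to the true coefficients.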

So now my question is: do these tests still work the same way if I add interactions to the model? That is, if $(*)$ becomes, for example, $$\beta_0 + \beta_1x_{1,i} + \beta_2x_{2,i} + \dots + \beta_mx_{m,i} + \beta_{12}x_{1,i}x_{2,i}?$$

Best answer:

You can use the likelihood ratio test if the models are nested, i.e., if the smaller model can be obtained from the larger one by fixing some of its parameters (for instance, setting them to zero). In particular, the model without interactions is nested in the model with interactions, all other covariates equal, because you can set the coefficients of the interaction terms equal to zero. So yes, the likelihood ratio test is applicable.
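A self-contained sketch of that comparison, assuming synthetic data: both models are fitted by maximizing the Bernoulli log-likelihood (here via `scipy.optimize.minimize`), and the likelihood ratio statistic is referred to a chi-square distribution with degrees of freedom equal to the number of constrained parameters (one, for the single interaction):

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import chi2

def max_loglik(X, y):
    """Maximized log-likelihood of a logistic regression model."""
    def nll(beta):
        z = X @ beta
        # log-likelihood is sum(y*z - log(1 + e^z)); logaddexp is the stable form
        return np.sum(np.logaddexp(0.0, z) - y * z)
    res = minimize(nll, np.zeros(X.shape[1]), method="BFGS")
    return -res.fun

# Synthetic data whose true model contains an interaction
rng = np.random.default_rng(1)
N = 800
x1, x2 = rng.normal(size=N), rng.normal(size=N)
eta = 0.3 + 0.9 * x1 - 0.6 * x2 + 1.2 * x1 * x2
y = (rng.random(N) < 1 / (1 + np.exp(-eta))).astype(float)

X0 = np.column_stack([np.ones(N), x1, x2])  # reduced model: no interaction
X1 = np.column_stack([X0, x1 * x2])         # full model: with interaction
lr_stat = 2 * (max_loglik(X1, y) - max_loglik(X0, y))
p_value = chi2.sf(lr_stat, df=1)            # one parameter set to zero
```

Since the data were generated with a strong interaction, the test should reject the reduced model decisively.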

I'm not very familiar with the Hosmer–Lemeshow test, but I see no reason why it should not be applicable: after all, adding interactions is just like adding new covariates.

(Watch out for interactions when computing marginal effects though.)