Why don't we include an intercept in an ordered probit model?


I am a little unsure on this question.

We have an ordered probit model with latent variable

$y_{i}^{*} = B_{1}x_{i1} + e_{i}$ where $e_{i}\sim N\big(0,\ \exp(\alpha_{0}+\alpha_{1}x_{i1})^{2}\big)$

Furthermore, we have observed dependent variable $y_{i}$

$y_{i} = \left\{ \begin{array}{ll} 0 & -\infty < y^{*}_{i}\leq \pi_{1} \\ 1 & \pi_{1} < y_{i}^{*}\leq \pi_{2} \\ 2 & \pi_{2} < y_{i}^{*} < \infty \\ \end{array} \right. $
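For concreteness, this model implies closed-form category probabilities, e.g. $P(y_i=0\mid x_{i1}) = \Phi\big((\pi_1 - B_1x_{i1})/\exp(\alpha_0+\alpha_1x_{i1})\big)$. A minimal sketch in Python (NumPy/SciPy; all parameter values below are made up purely for illustration):

```python
import numpy as np
from scipy.stats import norm

def ordered_probit_probs(x, b1, pi1, pi2, a0, a1):
    """Category probabilities P(y = k | x) implied by the latent-variable
    model above; the sd of e_i depends on x through exp(a0 + a1 * x)."""
    sigma = np.exp(a0 + a1 * x)           # conditional sd of e_i
    z1 = (pi1 - b1 * x) / sigma           # standardized lower threshold
    z2 = (pi2 - b1 * x) / sigma           # standardized upper threshold
    p0 = norm.cdf(z1)                     # P(y* <= pi1)
    p1 = norm.cdf(z2) - norm.cdf(z1)      # P(pi1 < y* <= pi2)
    p2 = 1.0 - norm.cdf(z2)               # P(y* > pi2)
    return p0, p1, p2

# Made-up parameter values, purely for illustration:
p0, p1, p2 = ordered_probit_probs(x=0.5, b1=1.0, pi1=-0.5, pi2=0.8, a0=0.0, a1=0.3)
print(p0 + p1 + p2)  # the three probabilities sum to 1
```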

Question: Are the parameters $B_{1}, \pi_{1}, \pi_{2}, \alpha_{0}, \alpha_{1}$ identified? If no, which parameters are not identified, and what restriction would you impose to make the remaining parameters identified?

My thinking was that because we don't have an intercept $B_{0}$ in this model, we don't run into the situation where only the differences $(\pi_{1}-B_{0})$ and $(\pi_{2}-B_{0})$ are identified, so in this case we can identify $\pi_{1}$ and $\pi_{2}$ because $P(y_{i} = 0\mid x_{i1})$, $P(y_{i} = 1\mid x_{i1})$, and $P(y_{i} = 2\mid x_{i1})$ are all different. Furthermore, $B_{1}$ can be identified from ML estimation. Here, the thresholds $\pi_{1}$ and $\pi_{2}$ play the role of intercepts. We could also normalize the variance by setting $\alpha_{0} = 0$ and $\alpha_{1} = 0$, so that $e_{i}\sim N(0,1)$.

I am not too sure about my answer, hopefully you guys can help!


Because the latent variable is unobserved, we have to impose a restriction on either the intercept of the regression line or one of the thresholds.

  • In simple probit, usually the threshold is set to $0$ and the intercept is estimated.

  • In ordered probit, the intercept is usually fixed at $0$ and the thresholds are estimated.

Usually the error term is taken to be standard normal, i.e. $y_i^*=B_1x_{i1}+\epsilon_i$ with $\epsilon_i\sim N(0,1)$. In the notation above that means $\exp(\alpha_0+\alpha_1x_{i1})=1$ for all $x_{i1}$, i.e. $\alpha_0=0$ and $\alpha_1=0$ (since $\exp(0)=1$). Strictly, only the overall scale is unidentified: multiplying $B_1,\pi_1,\pi_2$ by a constant $c>0$ and adding $\log c$ to $\alpha_0$ yields exactly the same choice probabilities. So the single restriction $\alpha_0=0$ already suffices, and $B_1,\pi_1,\pi_2$ (and $\alpha_1$) are then identifiable.
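The scale problem described in the bullet points can be checked numerically: rescaling $B_1,\pi_1,\pi_2$ by any $c>0$ while adding $\log c$ to $\alpha_0$ leaves every category probability unchanged. A quick sketch (Python with NumPy/SciPy; parameter values made up for illustration):

```python
import numpy as np
from scipy.stats import norm

def probs(x, b1, pi1, pi2, a0, a1):
    """P(y = 0), P(y = 1), P(y = 2) given x, stacked as rows."""
    sigma = np.exp(a0 + a1 * x)
    z1 = (pi1 - b1 * x) / sigma
    z2 = (pi2 - b1 * x) / sigma
    return np.array([norm.cdf(z1), norm.cdf(z2) - norm.cdf(z1), 1 - norm.cdf(z2)])

x = np.linspace(-2.0, 2.0, 9)
c = 2.5  # arbitrary positive rescaling
base = probs(x, b1=1.0, pi1=-0.5, pi2=0.8, a0=0.0, a1=0.3)
# Rescale B1, pi1, pi2 by c and absorb the factor into alpha_0:
rescaled = probs(x, b1=c * 1.0, pi1=c * -0.5, pi2=c * 0.8, a0=np.log(c), a1=0.3)
print(np.allclose(base, rescaled))  # True: observationally equivalent parameters
```

Note that $\alpha_1$ is left untouched here: because the spread of $e_i$ varies with $x_{i1}$, $\alpha_1$ is pinned down by the data, and only the overall scale ($\alpha_0$) needs to be fixed.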