In Johansen's book 'Likelihood-based inference in cointegrated vector autoregressive models', in order to derive the expression in Granger's representation theorem, he claims that:
$$\beta_\bot(\alpha'_\bot \beta_\bot )^{-1} \alpha'_\bot + \alpha (\beta' \alpha)^{-1} \beta' = I \tag{1}$$
where:
$\alpha$ is $N\times R$, $\text{rank}(\alpha) =R$
$\beta$ is $N\times R$, $\text{rank}(\beta) =R$
$\beta_\bot $ is $N\times (N-R)$, $\text{rank}(\beta_\bot) =N-R$
$\alpha_\bot $ is $N\times (N-R)$, $\text{rank}(\alpha_\bot) =N-R$
$\alpha' \alpha_\bot =0$
$\beta' \beta_\bot =0$
I am not able to prove (1). Can you help me, please?
The stated identity need not be valid under the assumptions provided. Consider, for example, $\alpha=\beta_{\bot}=\binom{1}{0}$ and $\beta=\alpha_{\bot}=\binom{0}{1}$, corresponding to $N=2$, $R=1$. These vectors all have rank $R=N-R=1$, and they satisfy $\alpha'\alpha_\bot = \beta'\beta_\bot = 0$. But $\alpha'_\bot \beta_\bot=\beta'\alpha=0$, so the inverses $(\alpha'_\bot \beta_\bot)^{-1}$ and $(\beta'\alpha)^{-1}$ do not exist, and the left-hand side of (1) is not even defined.
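For concreteness, here is a quick numerical check of this counterexample. It is a minimal sketch assuming numpy is available (the variable names are mine): both matrices that identity (1) asks us to invert come out as the zero $1\times 1$ matrix.

```python
import numpy as np

# Counterexample with N = 2, R = 1:
# alpha = beta_perp = (1, 0)', beta = alpha_perp = (0, 1)'.
alpha = np.array([[1.0], [0.0]])
beta = np.array([[0.0], [1.0]])
beta_perp = np.array([[1.0], [0.0]])
alpha_perp = np.array([[0.0], [1.0]])

print(alpha.T @ alpha_perp)      # [[0.]] orthogonality assumption holds
print(beta.T @ beta_perp)        # [[0.]] orthogonality assumption holds
print(alpha_perp.T @ beta_perp)  # [[0.]] singular, so its inverse does not exist
print(beta.T @ alpha)            # [[0.]] singular, so its inverse does not exist
```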
The identity is true, however, if we may assume that the $N\times N$ matrices $(\alpha,\beta_\bot)$ and $(\beta,\alpha_\bot)$ are nonsingular. In that case we have the following derivation:
\begin{align}
I &=\begin{pmatrix} \alpha & \beta_\bot \end{pmatrix} \begin{pmatrix} \alpha & \beta_\bot \end{pmatrix}^{-1} \begin{pmatrix} \beta' \\ \alpha_\bot' \end{pmatrix}^{-1}\begin{pmatrix} \beta' \\ \alpha_\bot' \end{pmatrix} \\
&= \begin{pmatrix} \alpha & \beta_\bot \end{pmatrix} \left[\begin{pmatrix} \beta' \\ \alpha_\bot' \end{pmatrix} \begin{pmatrix} \alpha & \beta_\bot \end{pmatrix} \right]^{-1}\begin{pmatrix} \beta' \\ \alpha_\bot' \end{pmatrix}\\
&= \begin{pmatrix} \alpha & \beta_\bot \end{pmatrix} \begin{pmatrix} \beta'\alpha & \beta'\beta_\bot \\ \alpha_\bot'\alpha & \alpha'_\bot \beta_\bot \end{pmatrix}^{-1}\begin{pmatrix} \beta' \\ \alpha_\bot' \end{pmatrix}\\
&= \begin{pmatrix} \alpha & \beta_\bot \end{pmatrix} \begin{pmatrix} \beta'\alpha & 0 \\ 0 & \alpha'_\bot \beta_\bot \end{pmatrix}^{-1}\begin{pmatrix} \beta' \\ \alpha_\bot' \end{pmatrix}\\
&= \begin{pmatrix} \alpha & \beta_\bot \end{pmatrix} \begin{pmatrix} (\beta'\alpha)^{-1} & 0 \\ 0 & (\alpha'_\bot \beta_\bot)^{-1} \end{pmatrix}\begin{pmatrix} \beta' \\ \alpha_\bot' \end{pmatrix}\\
&= \begin{pmatrix} \alpha & \beta_\bot \end{pmatrix} \begin{pmatrix} (\beta'\alpha)^{-1}\beta' \\ (\alpha'_\bot \beta_\bot)^{-1}\alpha_\bot' \end{pmatrix}\\
&=\alpha(\beta'\alpha)^{-1}\beta' + \beta_\bot(\alpha'_\bot \beta_\bot)^{-1}\alpha_\bot'.
\end{align}
The fourth line uses the orthogonality assumptions $\beta'\beta_\bot = 0$ and $\alpha'_\bot\alpha = 0$. Note also that nonsingularity of $(\alpha,\beta_\bot)$ and $(\beta,\alpha_\bot)$ is equivalent to invertibility of $\beta'\alpha$ and $\alpha'_\bot\beta_\bot$, since the matrix inverted in the second line is block diagonal with exactly those blocks. Thus the identity follows algebraically whenever $\beta'\alpha$ and $\alpha'_\bot\beta_\bot$ are invertible; this is precisely the condition that fails in the counterexample above.
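Conversely, the derivation can be sanity-checked numerically. The sketch below draws a random full-rank $\alpha$ and $\beta$ (generic draws make $\beta'\alpha$ and $\alpha'_\bot\beta_\bot$ invertible with probability one), builds the orthogonal complements with scipy's `null_space`, and verifies that the two terms in (1) sum to the identity; the choices of $N$, $R$ and the random seed are illustrative only.

```python
import numpy as np
from scipy.linalg import null_space

# Numerical sanity check of identity (1) for generic alpha, beta.
rng = np.random.default_rng(0)
N, R = 5, 2

alpha = rng.standard_normal((N, R))  # full column rank with probability 1
beta = rng.standard_normal((N, R))

# Orthogonal complements: columns spanning the null spaces of alpha' and beta'.
alpha_perp = null_space(alpha.T)     # N x (N-R), satisfies alpha' alpha_perp = 0
beta_perp = null_space(beta.T)       # N x (N-R), satisfies beta' beta_perp = 0

lhs = (beta_perp @ np.linalg.inv(alpha_perp.T @ beta_perp) @ alpha_perp.T
       + alpha @ np.linalg.inv(beta.T @ alpha) @ beta.T)

print(np.allclose(lhs, np.eye(N)))   # True: identity (1) holds for this draw
```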