How do I derive the following expression for the sum of orthogonal matrices?


In Johansen's book 'Likelihood-Based Inference in Cointegrated Vector Autoregressive Models', in order to obtain the expression in Granger's representation theorem, he claims that:

$$\beta_\bot(\alpha'_\bot \beta_\bot )^{-1} \alpha'_\bot + \alpha (\beta' \alpha)^{-1} \beta' = I \tag{1}$$

where:

$\alpha$ is $N\times R$, $\text{rank}(\alpha) =R$

$\beta$ is $N\times R$, $\text{rank}(\beta) =R$

$\beta_\bot $ is $N\times (N-R)$, $\text{rank}(\beta_\bot) =N-R$

$\alpha_\bot $ is $N\times (N-R)$, $\text{rank}(\alpha_\bot) =N-R$

$\alpha' \alpha_\bot =0$

$\beta' \beta_\bot =0$

I am not able to prove (1). Can you help me, please?
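
For what it's worth, the identity does seem to hold numerically for a generic draw. Here is a minimal numpy sketch of the check I tried, where $\alpha_\bot$ and $\beta_\bot$ are taken as orthonormal bases of the orthogonal complements obtained from the SVD (just one possible choice of complements):

```python
import numpy as np

rng = np.random.default_rng(0)
N, R = 5, 2
alpha = rng.standard_normal((N, R))
beta = rng.standard_normal((N, R))

def perp(M, r):
    """Orthonormal basis of the orthogonal complement of col(M), for M of shape (N, r) with full column rank."""
    U, _, _ = np.linalg.svd(M)   # U is N x N
    return U[:, r:]              # last N - r left singular vectors span col(M)^perp

alpha_perp = perp(alpha, R)      # satisfies alpha' alpha_perp = 0
beta_perp = perp(beta, R)        # satisfies beta' beta_perp = 0

lhs = (beta_perp @ np.linalg.inv(alpha_perp.T @ beta_perp) @ alpha_perp.T
       + alpha @ np.linalg.inv(beta.T @ alpha) @ beta.T)
print(np.allclose(lhs, np.eye(N)))   # True for a generic random draw
```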

There are 2 answers below.

BEST ANSWER

The stated identity need not be valid under the assumptions provided. Consider, for example, $\alpha=\beta_{\bot}=\binom{1}{0}$ and $\beta=\alpha_{\bot}=\binom{0}{1}$, corresponding to $N=2$, $R=1$. These vectors all have rank $R=N-R=1$ and satisfy $\alpha'\alpha_\bot = \beta'\beta_\bot = 0$. But $\alpha'_\bot \beta_\bot=\beta'\alpha=0$, so $(\alpha'_\bot \beta_\bot)^{-1}$ and $(\beta'\alpha)^{-1}$ do not exist.
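
For concreteness, a quick numpy check of this counterexample:

```python
import numpy as np

# Counterexample with N = 2, R = 1
alpha = beta_perp = np.array([[1.0], [0.0]])
beta = alpha_perp = np.array([[0.0], [1.0]])

print(alpha.T @ alpha_perp)      # [[0.]]  -> orthogonality assumptions hold
print(beta.T @ beta_perp)        # [[0.]]
print(alpha_perp.T @ beta_perp)  # [[0.]]  -> 1x1 zero matrix, not invertible
print(beta.T @ alpha)            # [[0.]]  -> 1x1 zero matrix, not invertible
```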

The identity is true, however, if we additionally assume that the $N\times N$ block matrices $(\alpha,\beta_\bot)$ and $(\beta,\alpha_\bot)$ are nonsingular. In that case we have the following derivation, where the off-diagonal blocks vanish because $\beta'\beta_\bot=0$ and $\alpha_\bot'\alpha=0$:
\begin{align}
I &=\begin{pmatrix} \alpha & \beta_\bot \end{pmatrix} \begin{pmatrix} \alpha & \beta_\bot \end{pmatrix}^{-1} \begin{pmatrix} \beta' \\ \alpha_\bot' \end{pmatrix}^{-1}\begin{pmatrix} \beta' \\ \alpha_\bot' \end{pmatrix} \\
&= \begin{pmatrix} \alpha & \beta_\bot \end{pmatrix} \left[\begin{pmatrix} \beta' \\ \alpha_\bot' \end{pmatrix} \begin{pmatrix} \alpha & \beta_\bot \end{pmatrix} \right]^{-1}\begin{pmatrix} \beta' \\ \alpha_\bot' \end{pmatrix}\\
&= \begin{pmatrix} \alpha & \beta_\bot \end{pmatrix} \begin{pmatrix} \beta'\alpha & \beta'\beta_\bot \\ \alpha_\bot'\alpha & \alpha'_\bot \beta_\bot \end{pmatrix}^{-1}\begin{pmatrix} \beta' \\ \alpha_\bot' \end{pmatrix}\\
&= \begin{pmatrix} \alpha & \beta_\bot \end{pmatrix} \begin{pmatrix} \beta'\alpha & 0 \\ 0 & \alpha'_\bot \beta_\bot \end{pmatrix}^{-1}\begin{pmatrix} \beta' \\ \alpha_\bot' \end{pmatrix}\\
&= \begin{pmatrix} \alpha & \beta_\bot \end{pmatrix} \begin{pmatrix} (\beta'\alpha)^{-1} & 0 \\ 0 & (\alpha'_\bot \beta_\bot)^{-1} \end{pmatrix}\begin{pmatrix} \beta' \\ \alpha_\bot' \end{pmatrix}\\
&= \begin{pmatrix} \alpha & \beta_\bot \end{pmatrix} \begin{pmatrix} (\beta'\alpha)^{-1}\beta' \\ (\alpha'_\bot \beta_\bot)^{-1}\alpha_\bot' \end{pmatrix}\\
&=\alpha(\beta'\alpha)^{-1}\beta' + \beta_\bot(\alpha'_\bot \beta_\bot)^{-1}\alpha_\bot'.
\end{align}
Thus the identity follows once $(\alpha,\beta_\bot)$ and $(\beta,\alpha_\bot)$ are nonsingular; note that this already guarantees that $\beta'\alpha$ and $\alpha'_\bot \beta_\bot$ are invertible, since the block-diagonal matrix above is a product of two nonsingular matrices. It may be possible to weaken the premises further.
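
The block-matrix argument can also be illustrated numerically. A small sketch, with random full-rank $\alpha$, $\beta$ and complements taken from the SVD so that the nonsingularity assumption holds generically:

```python
import numpy as np

rng = np.random.default_rng(1)
N, R = 5, 2
alpha = rng.standard_normal((N, R))
beta = rng.standard_normal((N, R))
alpha_perp = np.linalg.svd(alpha)[0][:, R:]   # orthonormal basis of col(alpha)^perp
beta_perp = np.linalg.svd(beta)[0][:, R:]     # orthonormal basis of col(beta)^perp

P = np.hstack([alpha, beta_perp])             # (alpha, beta_perp), N x N
Q = np.vstack([beta.T, alpha_perp.T])         # stacked (beta', alpha_perp'), N x N

M = Q @ P                                     # should be block diagonal
print(np.allclose(M[:R, R:], 0), np.allclose(M[R:, :R], 0))   # off-diagonal blocks vanish

# I = P (Q P)^{-1} Q, which expands to
# alpha (beta' alpha)^{-1} beta' + beta_perp (alpha_perp' beta_perp)^{-1} alpha_perp'
print(np.allclose(P @ np.linalg.inv(M) @ Q, np.eye(N)))
```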

OTHER ANSWER

Well, I think you can just plug in $\beta_{\perp}$ on both sides of the equation to see that the transformation on the left-hand side, abbreviated by $A$, gives $A\beta_{\perp}=\beta_{\perp}$ (the second term vanishes because $\beta'\beta_{\perp}=0$). Likewise, plugging in $\alpha$ on both sides leads to $A\alpha = \alpha$ (the first term vanishes because $\alpha_{\perp}'\alpha=0$). So the left-hand side acts as the identity on all the column vectors of the matrix $(\beta_{\perp}, \alpha)$. Hence, if we add the assumption that $(\beta_{\perp}, \alpha)$ is nonsingular, then the left-hand side acts as the identity on a basis and hence on all vectors, i.e. $A=I$.
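
A short numerical illustration of this argument, again only a sketch with SVD-based complements for a random draw:

```python
import numpy as np

rng = np.random.default_rng(2)
N, R = 4, 1
alpha = rng.standard_normal((N, R))
beta = rng.standard_normal((N, R))
alpha_perp = np.linalg.svd(alpha)[0][:, R:]
beta_perp = np.linalg.svd(beta)[0][:, R:]

# Left-hand side of (1)
A = (beta_perp @ np.linalg.inv(alpha_perp.T @ beta_perp) @ alpha_perp.T
     + alpha @ np.linalg.inv(beta.T @ alpha) @ beta.T)

print(np.allclose(A @ beta_perp, beta_perp))   # A fixes the columns of beta_perp
print(np.allclose(A @ alpha, alpha))           # A fixes the columns of alpha
B = np.hstack([beta_perp, alpha])              # (beta_perp, alpha)
print(np.linalg.matrix_rank(B) == N)           # nonsingular, so A acts as I on a basis
```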