Why does the following hold for matrices with a row removed


I'm currently studying OLS, in particular the leverage of individual observations. For this, one considers the observation matrix $X \in \mathbb{R}^{n\times k}$ together with the matrix $X_{(t)} \in \mathbb{R}^{(n-1)\times k}$ obtained by removing observation $t$. Now let $x_t$ denote the $t$th row of $X$. I do not understand why the following holds: $$ X_{(t)}^TX_{(t)} = X^TX - x_t^Tx_t $$



BEST ANSWER

Let $\{e_i\}$ be the canonical basis for vectors.
Likewise $\{E_{ik}\}$ is the basis for matrices, i.e. $\,\,E_{ik}=e_ie_k^T$

Let's denote the columns of a matrix $X$ with the vectors $\{c_k\}$,
and the columns of $X^T$ (aka the rows of $X$) by the vectors $\{r_k\}$.

Then we can expand $X$ either in terms of its rows or its columns
(summation convention over repeated indices) $$\eqalign{ X &= e_ir_i^T = c_ke_k^T \cr }$$ Similarly there are two ways to expand $X^TX$ $$\eqalign{ X^TX &= (e_ir_i^T)^T(e_kr_k^T) = r_ie_i^T e_kr_k^T = \delta_{ik}r_ir_k^T = r_i r_i^T \cr &= (c_ie_i^T)^T(c_ke_k^T) = e_ic_i^T c_ke_k^T = E_{ik}c_i^Tc_k \cr }$$ Let's focus on the row-expansion.
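As a quick numerical sanity check of the row expansion (a NumPy sketch, not part of the original argument; the matrix sizes are arbitrary), one can verify that $X^TX$ equals the sum of the outer products $r_ir_i^T$ of the rows of $X$:

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 5, 3
X = rng.standard_normal((n, k))

# Row expansion: X^T X equals the sum of outer products of the rows of X.
row_sum = sum(np.outer(X[i], X[i]) for i in range(n))
assert np.allclose(X.T @ X, row_sum)
```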

You can move a single term from inside the sum to the outside
(no summation convention)
$$\eqalign{ \sum_{k} r_k r_k^T &= r_t r_t^T + \sum_{k\ne t} r_k r_k^T \cr \sum_{k\ne t} r_k r_k^T &= \sum_{k} r_k r_k^T - r_t r_t^T }$$ The left-hand side is exactly $X_{(t)}^TX_{(t)}$, since the rows of $X_{(t)}$ are the rows of $X$ with $r_t$ omitted, while the right-hand side equals $X^TX - r_tr_t^T$. Finally, since $x_t$ denotes the $t$th row of $X$ as a $1\times k$ row vector, we have $r_t = x_t^T$ and hence $r_tr_t^T = x_t^Tx_t$, which is what we wanted to prove.
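The deletion identity itself can also be checked numerically (again a NumPy sketch with arbitrary sizes and an arbitrary choice of $t$; `np.delete` plays the role of removing row $t$):

```python
import numpy as np

rng = np.random.default_rng(1)
n, k, t = 6, 3, 2
X = rng.standard_normal((n, k))

X_t = np.delete(X, t, axis=0)   # X_(t): X with row t removed
x_t = X[t:t+1, :]               # row t, kept as a 1-by-k row vector

# Deletion identity: X_(t)^T X_(t) = X^T X - x_t^T x_t
assert np.allclose(X_t.T @ X_t, X.T @ X - x_t.T @ x_t)
```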