This arises in multiple linear regression.
Given $m, n \in \mathbb{N}$ with $m > n + 1$, and matrices $X \in \mathbb{R}^{m \times (n+1)}$, $H = X(X'X)^{-1}X' \in \mathbb{R}^{m\times m}$, $I = I_m$, and $J \in \mathbb{R}^{m\times m}$ the matrix of all $1$s, how does one show that
$$J(I-H) = 0\text{ ?}$$
I am actually trying to show
$$(I-H)(H-\frac{1}{n}J) = 0$$
which reduces to $J(I-H) = 0$: since $HH = H$, we have $(I-H)(H-\frac{1}{n}J) = (I-H)H - \frac{1}{n}(I-H)J = -\frac{1}{n}(I-H)J$, and $(I-H)J = \big(J(I-H)\big)'$ because $H$ and $J$ are symmetric.
Apparently, this might have something to do with $J(y - X \hat{\beta}) = 0$ for all $y \in \mathbb{R}^{m\times 1}$, where $\hat{\beta} = (X'X)^{-1}X'y$?
Is the statement true? If so, why? If not, why not, and how then does one prove $J(I-H) = 0$ or $(I-H)(H-\frac{1}{n}J) = 0$? I suspect some property of multiple linear regression (MLR) is needed.
On the other hand, the equation $J(I-H) = 0$ itself looks independent of statistics. Must one really use properties of MLR? What are some possible counterexamples?
It is not true that $J(I-H)=0$ unless the rows of $J$, each of which is a row of $1$s, lie in the row space of $X'$ (equivalently, unless the vector of $1$s lies in the column space of $X$). That will certainly happen if one of the columns of $X$ is a column of $1$s, as in typical multiple regression problems with an intercept. If you're not using the fact that one of the columns of $X$ is a column of $1$s, or failing that, that a column of $1$s is a linear combination of the columns of $X$, then you won't be able to prove this.
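As a quick numerical sanity check (a sketch using NumPy; the design matrices `X1` and `X2` below are made-up examples, not from the question): with a column of $1$s, $J(I-H)=0$ holds; without one, it generally fails.

```python
import numpy as np

m = 4
I = np.eye(m)
J = np.ones((m, m))

def hat(X):
    # H = X (X'X)^{-1} X', the orthogonal projector onto col(X)
    return X @ np.linalg.inv(X.T @ X) @ X.T

# Design matrix WITH an intercept column of 1s: J(I - H) = 0 holds.
X1 = np.array([[1., 0.], [1., 1.], [1., 2.], [1., 3.]])
print(np.allclose(J @ (I - hat(X1)), 0))  # True

# Counterexample: no column of 1s, and the 1s vector is not in col(X),
# so J(I - H) != 0.
X2 = np.array([[1., 0.], [2., 1.], [3., 2.], [5., 3.]])
print(np.allclose(J @ (I - hat(X2)), 0))  # False
```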
Note that $X'H = X'X(X'X)^{-1}X' = X'$, and hence for every column $X_j$ of $X$ we have $X_j'H=X_j'$.
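Spelling out that hint (assuming, say, the first column of $X$ is the column of $1$s, $X_0 = \mathbf{1}$):
$$\mathbf{1}'(I-H) = X_0'(I-H) = X_0' - X_0'H = X_0' - X_0' = 0',$$
and since every row of $J$ equals $\mathbf{1}'$, every row of $J(I-H)$ is $\mathbf{1}'(I-H) = 0'$, giving $J(I-H) = 0$.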