Consider the following linear regression model:
$$y=X \beta + \epsilon = X_1 \beta_1 + X_2 \beta_2 + \epsilon $$
where we have $n$ observations and $k$ variables, so $X$ is an $n \times k$ matrix, and $X_1$ and $X_2$ are $n \times k_1$ and $n \times k_2$ respectively, with $k_1 + k_2 = k$. I am studying the Frisch-Waugh theorem, which states that the OLS estimate $\hat{\beta}_1$ and the residuals are the same for $$y = X_1 \beta_1 + X_2 \beta_2 + error $$ and
$$M_{X_2}y=M_{X_2} X_1 \beta_1 + error $$
where $M_X$ denotes the following symmetric matrix, obtained from the orthogonal projector $P_X = X(X'X)^{-1}X'$:
$$M_X=I_n-P_X$$
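As a quick numerical sanity check (a sketch in numpy; the dimensions and seed are arbitrary), $M_X$ is symmetric, idempotent, and annihilates the columns of $X$:

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 50, 3
X = rng.standard_normal((n, k))

P = X @ np.linalg.inv(X.T @ X) @ X.T   # orthogonal projector onto col(X)
M = np.eye(n) - P                      # the annihilator M_X

print(np.allclose(M, M.T))    # symmetric: True
print(np.allclose(M @ M, M))  # idempotent: True
print(np.allclose(M @ X, 0))  # M_X X = 0: True
```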
I would like to prove that the estimate $\hat{\beta}_1$ is also the same when running the regression $$y=M_{X_2} X_1 \beta_1 + error $$
but so far I haven't been able to prove it successfully. What would you suggest?
You need to start from the following result: if $X_1$ and $X_2$ are orthogonal (i.e., $X_1'X_2 = 0$), then you get the same estimate $\hat{\beta}_1$ by regressing $y$ on $X_1$ alone, and the same $\hat{\beta}_2$ by regressing $y$ on $X_2$ alone, as from the joint regression. Please try to figure out why this is so (hint: the normal equations become block-diagonal).
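This orthogonal-regressors result can be checked numerically. In the sketch below (numpy; all dimensions, coefficients, and the construction $X_1 = M_{X_2}Z$ are my own illustrative choices), $X_1$ is built to be exactly orthogonal to $X_2$, and the joint and separate estimates coincide:

```python
import numpy as np

rng = np.random.default_rng(1)
n, k1, k2 = 200, 2, 3

# Build X2, then force X1 to be orthogonal to it: X1 = M_{X2} Z
X2 = rng.standard_normal((n, k2))
Z = rng.standard_normal((n, k1))
M2 = np.eye(n) - X2 @ np.linalg.inv(X2.T @ X2) @ X2.T
X1 = M2 @ Z                                   # now X1'X2 = 0

y = X1 @ np.array([1.0, -2.0]) + X2 @ np.array([0.5, 0.0, 3.0]) \
    + rng.standard_normal(n)

# Joint regression of y on [X1, X2]
beta_joint = np.linalg.lstsq(np.hstack([X1, X2]), y, rcond=None)[0]

# Separate regressions on X1 alone and X2 alone
b1 = np.linalg.lstsq(X1, y, rcond=None)[0]
b2 = np.linalg.lstsq(X2, y, rcond=None)[0]

print(np.allclose(beta_joint[:k1], b1))  # True
print(np.allclose(beta_joint[k1:], b2))  # True
```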
Now, in the general case, use $X_1 = P_{X_2}X_1 + M_{X_2}X_1$ to rewrite the regression equation as $$y = M_{X_2}X_1\beta_1 + X_2\gamma + error,$$ where $\gamma=\beta_2+(X_2'X_2)^{-1}X_2'X_1 \beta_1$. Since $X_2'M_{X_2}X_1 = 0$, the two blocks of regressors are orthogonal, so you can apply the result above.
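Putting the pieces together, here is a numerical illustration (again a numpy sketch with arbitrary dimensions and coefficients) that all three regressions yield the same $\hat{\beta}_1$: the full regression, the classical Frisch-Waugh form $M_{X_2}y$ on $M_{X_2}X_1$, and the regression of $y$ itself on $M_{X_2}X_1$ that you asked about:

```python
import numpy as np

rng = np.random.default_rng(2)
n, k1, k2 = 200, 2, 3
X1 = rng.standard_normal((n, k1))
X2 = rng.standard_normal((n, k2))
y = X1 @ np.array([1.0, -2.0]) + X2 @ np.array([0.5, 2.0, -1.0]) \
    + rng.standard_normal(n)

# Annihilator of col(X2): M2 = I - P_{X2}
M2 = np.eye(n) - X2 @ np.linalg.inv(X2.T @ X2) @ X2.T

# (a) full regression of y on [X1, X2]; keep the X1 block
beta_full = np.linalg.lstsq(np.hstack([X1, X2]), y, rcond=None)[0][:k1]

# (b) regression of M2 y on M2 X1 (the classical FWL statement)
beta_fwl = np.linalg.lstsq(M2 @ X1, M2 @ y, rcond=None)[0]

# (c) regression of y itself on M2 X1 (the claim in question)
beta_c = np.linalg.lstsq(M2 @ X1, y, rcond=None)[0]

print(np.allclose(beta_full, beta_fwl))  # True
print(np.allclose(beta_fwl, beta_c))     # True
```

That (b) and (c) agree follows algebraically as well: since $M_{X_2}$ is symmetric and idempotent, $(X_1'M_{X_2}X_1)^{-1}X_1'M_{X_2}M_{X_2}y = (X_1'M_{X_2}X_1)^{-1}X_1'M_{X_2}y$.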