Assume a linear regression model $y=X\beta + \epsilon$ with $\epsilon \sim N(0,\sigma^2I)$ and $\hat y=Xb$, where $b=(X'X)^{-1}X'y$. Moreover, $H=X(X'X)^{-1}X'$ is the projection matrix (hat matrix) that maps the response vector onto the column space of $X$, i.e., $\hat y=Hy$.
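As a quick numerical sanity check of this setup (a sketch using NumPy; the data below are arbitrary illustrative values, not from any particular dataset):

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 50, 3
X = rng.normal(size=(n, p))
beta = np.array([1.0, -2.0, 0.5])
y = X @ beta + rng.normal(size=n)

H = X @ np.linalg.inv(X.T @ X) @ X.T   # hat matrix H = X (X'X)^{-1} X'
b = np.linalg.inv(X.T @ X) @ X.T @ y   # OLS coefficients b = (X'X)^{-1} X'y

# H is a projection: symmetric, idempotent, and H y equals the fitted values X b.
assert np.allclose(H, H.T)
assert np.allclose(H @ H, H)
assert np.allclose(H @ y, X @ b)
```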
Now I have a really weird question.
If $\hat y=Hy$ then \begin{align} \hat y =& HX\beta + \epsilon \\ \hat y =& X\underbrace{(X'X)^{-1}X'X}_{\text{I}}\beta + \epsilon \\ \hat y =& X\beta + \epsilon \\ \hat y =& y \end{align}
What's wrong here?
The problem is that you did not multiply the error term by $\mathbf{H}$. It should be \begin{equation*} \begin{split} \widehat{\mathbf{y}} & = \mathbf{H}\mathbf{y} = \mathbf{H}(\mathbf{X}\boldsymbol{\beta}+\boldsymbol{\varepsilon}) = \mathbf{H}\mathbf{X}\boldsymbol{\beta}+ \mathbf{H} \boldsymbol{\varepsilon} = \mathbf{X}\boldsymbol{\beta}+ \mathbf{H}\boldsymbol{\varepsilon}.\end{split} \end{equation*} Since $\mathbf{H}$ projects onto the column space of $\mathbf{X}$, a $p$-dimensional subspace of $\mathbb{R}^n$, in general $\mathbf{H}\boldsymbol{\varepsilon} \neq \boldsymbol{\varepsilon}$, and therefore $\widehat{\mathbf{y}} \neq \mathbf{y}$.
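A small numerical illustration of why this matters (a sketch with NumPy; the simulated data are arbitrary): $HX\beta = X\beta$, but $H\varepsilon$ is only the projection of the noise onto the column space of $X$, so the fitted values do not reproduce $y$.

```python
import numpy as np

rng = np.random.default_rng(1)
n, p = 50, 3
X = rng.normal(size=(n, p))
beta = np.array([1.0, -2.0, 0.5])
eps = rng.normal(size=n)
y = X @ beta + eps

H = X @ np.linalg.inv(X.T @ X) @ X.T   # hat matrix

# H X beta = X beta: the systematic part lies in the column space of X ...
assert np.allclose(H @ X @ beta, X @ beta)

# ... but H eps != eps: the noise is only partially captured by the projection,
# so y_hat = H y differs from y.
assert not np.allclose(H @ eps, eps)
assert not np.allclose(H @ y, y)
```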