Let $Y=X\beta+\epsilon$, where $Y$ is an $n \times 1$ vector, $X$ is an $n \times p$ matrix of full rank, and $\epsilon$ is an $n \times 1$ vector of random errors that are independently and normally distributed with mean vector $0$ and variance-covariance matrix $\Sigma=\sigma^2 I$, where $0$ is an $n \times 1$ vector of zeros and $I$ is the $n \times n$ identity matrix. Prove that the least squares estimator $\hat{\beta}$ and the residual vector $Y-X\hat{\beta}$ are independent.
I have already derived $\hat{\beta}=(X^TX)^{-1}X^{T}Y$, but I am stuck on showing the independence.
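As a sanity check on the claim, here is a small simulation sketch (the design matrix, true coefficients, and seed below are all made up for illustration). It repeatedly draws $\epsilon \sim N(0,\sigma^2 I)$, forms $\hat{\beta}=(X^TX)^{-1}X^TY$ and the residuals $Y-X\hat{\beta}=(I-H)Y$ with $H=X(X^TX)^{-1}X^T$, and checks that the empirical cross-covariance between the two is near zero, as the claimed independence would require:

```python
import numpy as np

rng = np.random.default_rng(0)         # hypothetical seed for reproducibility
n, p, sigma, reps = 20, 3, 1.5, 20000  # hypothetical dimensions and error scale
X = rng.normal(size=(n, p))            # fixed full-rank design matrix (made up)
beta = np.array([1.0, -2.0, 0.5])      # hypothetical true coefficient vector
H = X @ np.linalg.inv(X.T @ X) @ X.T   # hat matrix H = X (X^T X)^{-1} X^T

bhat = np.empty((reps, p))
resid = np.empty((reps, n))
for r in range(reps):
    eps = sigma * rng.normal(size=n)               # epsilon ~ N(0, sigma^2 I)
    Y = X @ beta + eps
    bhat[r] = np.linalg.solve(X.T @ X, X.T @ Y)    # least squares estimate
    resid[r] = Y - H @ Y                           # residual vector (I - H) Y

# Empirical cross-covariance between components of beta_hat and the residuals;
# under independence every entry should be close to zero.
cross = (bhat - bhat.mean(0)).T @ (resid - resid.mean(0)) / (reps - 1)
print(np.abs(cross).max())
```

The printed maximum absolute cross-covariance shrinks toward zero as `reps` grows, which is consistent with (but of course does not prove) the claimed independence.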