Assume that the $X_i$'s are i.i.d. $N(\mu,\sigma^2)$.
There is a famous theorem that the sample mean $\overline{X}=\frac{1}{n}\sum_{i=1}^n X_i$ and the sample variance $S^2=\frac{1}{n-1}\sum_{i=1}^n (X_i-\overline{X})^2$ are independent.
I know there are already numerous answers about this theorem. Here I want to ask about a specific method of proof.
Step 1: Let $X=(X_1,X_2,\dots,X_n)'$, $\iota=(1,1,\dots,1)'$, and $P=\iota(\iota'\iota)^{-1} \iota'$.
Then $\overline{X}= {1\over n} \iota' X$.
Step 2: $S^2= {1\over n-1} X'(I-P)X$
Step 3: $S^2$ is a function of $(I-P)X$
Step 4: $\iota' X$ and $(I-P)X$ are independent
Step 5: Therefore, $\overline{X}$ and $S^2$ are independent.
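As a quick numerical sanity check of Steps 1 and 2 (just an illustration, not part of the proof), one can build $P=\iota(\iota'\iota)^{-1}\iota'$ explicitly and compare the matrix formulas against NumPy's built-in mean and unbiased variance:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10
X = rng.normal(loc=2.0, scale=3.0, size=n)  # one sample of size n

iota = np.ones(n)
P = np.outer(iota, iota) / (iota @ iota)  # P = iota (iota' iota)^{-1} iota'
I = np.eye(n)

xbar_matrix = (iota @ X) / n              # Step 1: (1/n) iota' X
s2_matrix = (X @ (I - P) @ X) / (n - 1)   # Step 2: X'(I-P)X / (n-1)

assert np.isclose(xbar_matrix, X.mean())
assert np.isclose(s2_matrix, X.var(ddof=1))
```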
In this method, I can understand Steps 1 through 4. However, I cannot understand why Step 5 holds; specifically, I don't see the logical bridge between Steps 3–4 and Step 5. Although $S^2$ is a function of $(I-P)X$, it is not simply a constant multiple of $(I-P)X$, so can we really conclude that $\overline{X}$ and $S^2$ are independent just because $\iota'X$ and $(I-P)X$ are independent?
For $P\equiv \iota(\iota^{\top}\iota)^{-1}\iota^{\top}$, $P$ is symmetric, and $P^{2}=\iota(\iota^{\top}\iota)^{-1}\iota^{\top}\iota(\iota^{\top}\iota)^{-1}\iota^{\top}=\iota(\iota^{\top}\iota)^{-1}\iota^{\top}=P$, so $P$ is idempotent. Using this result, we deduce $(I-P)(I-P)=I-P-P+P^{2}=I-P$, so $I-P$ is also symmetric and idempotent.
Therefore, $$S^{2}=(n-1)^{-1}X^{\top}(I-P)X=(n-1)^{-1}X^{\top}(I-P)(I-P)X=(n-1)^{-1}\bigl((I-P)X\bigr)^{\top}\bigl((I-P)X\bigr),$$ hence $S^{2}$ is a function of $(I-P)X$.
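The identity above rests on $I-P$ being symmetric and idempotent, which is easy to verify numerically (a sketch, again not part of the proof):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 8
X = rng.normal(size=n)

iota = np.ones(n)
P = np.outer(iota, iota) / n  # iota' iota = n here
I = np.eye(n)
M = I - P

assert np.allclose(P, P.T)    # P symmetric
assert np.allclose(M @ M, M)  # I - P idempotent

# X'(I-P)X equals the squared norm of (I-P)X
quad = X @ M @ X
norm_sq = (M @ X) @ (M @ X)
assert np.isclose(quad, norm_sq)
```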
Update for your question: Note that $(I-P)X$ and $\iota^{\top} X$ are both linear transformations of the same Gaussian vector $X$, hence jointly normal, and for jointly normal variables uncorrelated $\iff$ independent. Since $\mathrm{Cov}(X,X)=\sigma^{2}I$ and $P\iota=\iota$, $$ \mathrm{Cov}\bigl((I-P)X,\iota^{\top}X\bigr)=(I-P)\,\mathrm{Cov}(X,X)\,\iota=\sigma^{2}(I-P)\iota=\sigma^{2}(\iota-P\iota)=0.$$ The bridge from Step 4 to Step 5 is the standard fact that measurable functions of independent random vectors are themselves independent: $\overline{X}=\frac{1}{n}\iota^{\top}X$ is a function of $\iota^{\top}X$ and $S^{2}$ is a function of $(I-P)X$, so $\overline{X}$ and $S^{2}$ are independent.