When proving that $\bar x$ and $S^2$ are independent, my notes say that "functions of independent quantities are independent". Can someone explain why functions of independent quantities are themselves independent?
Also, let $X_1,X_2,X_3,\dots,X_n$ be a random sample, and suppose we want to estimate a parameter $\theta$ using $T(X)$ as an estimator.
In this setting I have the expression $E\{ [ T(X)-E(T(X))][E(T(X))-\theta]\}$.
I want to know whether I can say that, in $[T(X)-E(T(X))]$, the quantity $E(T(X))$ is one particular fixed value (a constant, not random), so $E(T(X))$ and $T(X)$ are independent of one another.
Also, the expression $[E(T(X))-\theta]$ is a constant that does not depend on $X$.
Hence the two expressions $[T(X)-E(T(X))]$ and $[E(T(X))-\theta]$ are independent of one another, so that I can use the fact that if $a$ and $b$ are independent then $E(ab)=E(a)E(b)$.
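To spell out the step I want to justify: writing $c = E(T(X))-\theta$, which is a fixed number, the expectation factors out and the cross term vanishes:

$$ E\{[T(X)-E(T(X))]\,c\} = c\,E[T(X)-E(T(X))] = c\,\big(E[T(X)]-E[T(X)]\big) = 0. $$

This is the step that makes the mean squared error decompose into variance plus squared bias.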
I'm not sure I understand your questions, but let's see if this can answer part of it.
$$ \begin{bmatrix} X_1 \\ \vdots \\ X_n \end{bmatrix} = \begin{bmatrix} \bar X \\ \vdots \\ \bar X \end{bmatrix} + \begin{bmatrix} X_1-\bar X \\ \vdots \\ X_n-\bar X \end{bmatrix} $$
Suppose $X_1,\ldots,X_n \sim \mathrm{i.i.d.}\ N(\mu,\sigma^2)$. Then $\bar X$ is uncorrelated with each $X_i-\bar X$:
$$ \operatorname{cov}(\bar X, X_i-\bar X) = \operatorname{cov}(\bar X, X_i) - \operatorname{var}(\bar X) = \frac{\sigma^2}{n} - \frac{\sigma^2}{n} = 0. $$
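As a quick numerical sanity check (a sketch using NumPy; the sample size, mean, and standard deviation here are arbitrary choices), the covariance above can be estimated by simulation:

```python
import numpy as np

rng = np.random.default_rng(0)
n, reps = 5, 200_000

# reps independent samples of size n from N(mu=2, sigma=3)
x = rng.normal(2.0, 3.0, size=(reps, n))
xbar = x.mean(axis=1)       # sample mean of each row
dev1 = x[:, 0] - xbar       # X_1 - Xbar for each row

# empirical covariance between Xbar and X_1 - Xbar
cov = np.cov(xbar, dev1)[0, 1]
print(cov)  # close to 0, matching the algebra above
```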
If two random vectors are jointly normal and uncorrelated, then they are independent. Here $\bar X$ and the vector of deviations $(X_1-\bar X,\ldots,X_n-\bar X)$ are both linear functions of $(X_1,\ldots,X_n)$, hence jointly normal, so $\bar X$ and the last vector on the right side of the equality above are independent. And $S^2$ is a function of that vector, so $\bar X$ and $S^2$ are independent. (This is the fact your notes use: if $X$ and $Y$ are independent, so are $f(X)$ and $g(Y)$, because $\{f(X)\in A\} = \{X\in f^{-1}(A)\}$, so the factorization of probabilities for events involving $X$ and $Y$ carries over to events involving $f(X)$ and $g(Y)$.)
Without the assumption that $X_1,\ldots,X_n \sim \mathrm{i.i.d.}\ N(\mu,\sigma^2)$, you won't get all of this.
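To illustrate that last point, here is a small simulation (a sketch; the exponential distribution is just one convenient skewed, non-normal example) comparing the empirical correlation of $\bar X$ and $S^2$ under normal and non-normal sampling:

```python
import numpy as np

rng = np.random.default_rng(1)
n, reps = 5, 200_000

def corr_xbar_s2(samples):
    """Empirical correlation between the sample mean and sample variance."""
    xbar = samples.mean(axis=1)
    s2 = samples.var(axis=1, ddof=1)
    return np.corrcoef(xbar, s2)[0, 1]

normal = rng.normal(0.0, 1.0, size=(reps, n))
expo = rng.exponential(1.0, size=(reps, n))

print(corr_xbar_s2(normal))  # near 0: Xbar and S^2 independent under normality
print(corr_xbar_s2(expo))    # clearly positive: independence fails for skewed data
```

For skewed distributions $\operatorname{cov}(\bar X, S^2)$ equals $\mu_3/n$ (the third central moment over $n$), which is why the exponential case shows a visibly positive correlation.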