Assume $n$ variables $\{X_i\}_{i=1}^n$ are independently drawn from the same Gaussian distribution. Then, we define the empirical mean by $\bar{X}=\frac{1}{n}\sum_{i=1}^nX_i$ and the sample variance by $S^2=\frac{1}{n-1}\sum_{i=1}^n(X_i-\bar{X})^2$.
My question is: are $\bar{X}$ and $S^2$ independent of each other? Under which conditions?
The link in my Comment proves that $\bar X$ and $S^2$ are independent for normal data. This is a unique property of the normal family of distributions.
For intuition (only), here are plots of $S$ against $\bar X$ for many samples of size $n = 5$ from the (a) standard normal, (b) standard exponential, and (c) $\mathsf{Beta}(.1, .1)$ distributions, respectively. Each point in each graph represents one sample of size five. Examples (b) and (c) were chosen because they display obvious patterns of dependence. [In (c), $\rho(\bar X, S)= 0,$ but nonlinear association is clear.]
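For anyone who wants to reproduce the gist of these plots numerically, here is a small simulation sketch (not part of the original answer; it uses NumPy and the distribution choices above). It draws many samples of size $n=5$ from the standard normal and standard exponential, then computes the correlation between $\bar X$ and $S$ across samples. Correlation near zero is of course only a symptom of independence, not a proof, but it cleanly separates case (a) from case (b):

```python
import numpy as np

rng = np.random.default_rng(0)
n, reps = 5, 20000  # sample size and number of simulated samples

def mean_sd_corr(samples):
    """Correlation between the sample mean and sample SD across rows of `samples`."""
    xbar = samples.mean(axis=1)
    s = samples.std(axis=1, ddof=1)  # ddof=1 matches the n-1 divisor in S^2
    return np.corrcoef(xbar, s)[0, 1]

normal_corr = mean_sd_corr(rng.normal(size=(reps, n)))       # case (a)
expon_corr = mean_sd_corr(rng.exponential(size=(reps, n)))   # case (b)

print(normal_corr, expon_corr)
```

For the normal data the correlation hovers near zero, while for the exponential it is strongly positive (larger values drag both $\bar X$ and $S$ up together). Note that for case (c), $\mathsf{Beta}(.1,.1)$, this diagnostic would be misleading, since there $\rho(\bar X, S)=0$ despite clear nonlinear dependence.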