Can the empirical mean and empirical variance be independent of each other?


Assume $n$ variables $\{X_i\}_{i=1}^n$ are independently drawn from the same Gaussian distribution. Then, we define the empirical mean by $\bar{X}=\frac{1}{n}\sum_{i=1}^nX_i$ and the empirical variance by $S^2=\frac{1}{n-1}\sum_{i=1}^n(X_i-\bar{X})^2$.

My question is: are $\bar{X}$ and $S^2$ independent of each other, and under which conditions?



The link in my Comment proves that $\bar X$ and $S^2$ are independent for normal data. This is a unique property of the normal family of distributions.

For intuition (only), here are plots of $S$ against $\bar X$ for many samples of size $n = 5$ from the (a) standard normal, (b) standard exponential, and (c) $\mathsf{Beta}(.1, .1)$ distributions, respectively. Each point in each graph represents one sample of size five. Examples (b) and (c) were chosen because they display obvious patterns of dependence. [In (c), $\rho(\bar X, S)= 0,$ but nonlinear association is clear.]

[Figure: scatterplots of $S$ against $\bar X$ for many samples of size $n=5$ from the standard normal, standard exponential, and $\mathsf{Beta}(.1,.1)$ distributions]
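For readers without access to the plots, the qualitative effect can be reproduced with a small simulation. This is a sketch using NumPy; the sample size $n=5$ matches the plots above, but the replication count and random seed are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(0)
n, reps = 5, 20_000

def mean_sd_pairs(draw):
    """Return (sample mean, sample sd with ddof=1) for `reps` samples of size n."""
    x = draw((reps, n))
    return x.mean(axis=1), x.std(axis=1, ddof=1)

# (a) standard normal: mean and sd should look unrelated
m_norm, s_norm = mean_sd_pairs(lambda shape: rng.standard_normal(shape))
# (b) standard exponential: larger means tend to come with larger sds
m_exp, s_exp = mean_sd_pairs(lambda shape: rng.exponential(1.0, shape))

# Pearson correlation as a crude (linear) summary of dependence
r_norm = np.corrcoef(m_norm, s_norm)[0, 1]
r_exp = np.corrcoef(m_exp, s_exp)[0, 1]
print(f"corr(mean, sd), normal:      {r_norm:+.3f}")
print(f"corr(mean, sd), exponential: {r_exp:+.3f}")
```

The normal case gives a correlation near zero, while the exponential case shows a clearly positive correlation. Note that zero correlation alone does not establish independence, as the $\mathsf{Beta}(.1,.1)$ example in (c) shows.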


Under the assumptions you've stated they are independent. Maybe the quickest way to see that is to show that $\overline X$ and the vector $(X_1-\overline{X}, \ldots, X_n-\overline{X})$ are independent.

First find $\operatorname{cov}\left( \overline{X}, X_i - \overline{X} \right).$ Writing it out, $\operatorname{cov}(\overline{X}, X_i - \overline{X}) = \operatorname{cov}(\overline{X}, X_i) - \operatorname{var}(\overline{X}) = \frac{\sigma^2}{n} - \frac{\sigma^2}{n} = 0.$ Jointly Gaussian random variables whose covariance is $0$ are independent.
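The zero-covariance claim is easy to check numerically. A minimal sketch, assuming i.i.d. standard normal draws ($\sigma^2 = 1$), estimating $\operatorname{cov}(\overline{X}, X_1 - \overline{X})$ across many replications:

```python
import numpy as np

# cov(Xbar, X_i - Xbar) = cov(Xbar, X_i) - var(Xbar) = sigma^2/n - sigma^2/n = 0
rng = np.random.default_rng(1)
n, reps = 5, 100_000
x = rng.standard_normal((reps, n))   # sigma^2 = 1
xbar = x.mean(axis=1)
resid = x[:, 0] - xbar               # X_1 - Xbar (any i works, by symmetry)

# Sample covariance across replications; should be close to 0
cov = np.cov(xbar, resid)[0, 1]
print(f"sample cov(Xbar, X_1 - Xbar) = {cov:+.5f}")
```

The estimate hovers near zero, consistent with the algebraic computation; independence (not just zero covariance) then follows from joint Gaussianity.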