Prove independence of random variable and random vector.


Let $X_{1}, \dots, X_{n}$ be independent normally distributed random variables with distribution $N(a,\sigma^{2})$. Let $\bar{X}= \frac{1}{n} \sum_{i=1}^{n} X_{i}$.

We want to prove that $\bar{X}$ and $\bar{Y} = (X_{1} - \bar{X},\dots,X_{n}-\bar{X})$ are independent.

My idea: we may try to prove it using characteristic functions. Consider $(\bar{X}, \bar{Y})$. We may try to show that it is a Gaussian vector. If that holds, we can consider its covariance matrix: if the cross-covariance between $\bar{X}$ and each component of $\bar{Y}$ is zero, then independence follows.

Alternatively: we may show that $\bar{Y}$ is a Gaussian vector. Then, if we show that $\phi_{(\bar{X},\bar{Y})}=\phi_{\bar{X}}\,\phi_{\bar{Y}}$, their independence follows by Lévy's theorem (uniqueness of characteristic functions).

By the first idea, $(\bar{X},\bar{Y})$ is a Gaussian vector iff for all $a_{0}, a_{1}, \dots, a_{n} \in \mathbb{R}$ the linear combination $a_{0} \bar{X} + \sum_{i=1}^{n} a_{i} (X_{i}-\bar{X})$ is a normal random variable. That is true because any such combination is itself a linear combination of the independent normals $X_{1}, \dots, X_{n}$ (the same argument shows that $\bar{Y}$ is a Gaussian vector). Now we need to check that $\operatorname{cov}(\bar{X}, X_{i}-\bar{X}) = 0$ for every $i$. That holds because $\operatorname{cov}(X_{i},X_{j}) = 0$ for $i \neq j$, so the variance terms cancel exactly.
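Spelling out the covariance computation for one component (using bilinearity and $\operatorname{cov}(X_{i},X_{j}) = \sigma^{2}\,\mathbf{1}_{\{i=j\}}$):

```latex
\[
\begin{aligned}
\operatorname{cov}\bigl(\bar{X},\, X_{i} - \bar{X}\bigr)
  &= \operatorname{cov}(\bar{X}, X_{i}) - \operatorname{var}(\bar{X}) \\
  &= \frac{1}{n}\sum_{j=1}^{n} \operatorname{cov}(X_{j}, X_{i})
     - \frac{1}{n^{2}}\sum_{j=1}^{n} \operatorname{var}(X_{j}) \\
  &= \frac{\sigma^{2}}{n} - \frac{n\sigma^{2}}{n^{2}}
   = \frac{\sigma^{2}}{n} - \frac{\sigma^{2}}{n} = 0 .
\end{aligned}
\]
```

Since this holds for every $i$, the whole cross-covariance block between $\bar{X}$ and $\bar{Y}$ vanishes, which together with joint Gaussianity gives independence.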

Am I right?
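As a numerical sanity check (not a proof), one can estimate the covariance between $\bar{X}$ and $X_{1}-\bar{X}$ over many simulated samples; the parameter values below are arbitrary choices for illustration. It should be statistically indistinguishable from zero, consistent with the claim.

```python
import numpy as np

# Monte Carlo check: for i.i.d. N(a, sigma^2) samples, the sample mean Xbar
# and each deviation X_i - Xbar should be uncorrelated (and, by joint
# Gaussianity, independent). Parameters here are arbitrary illustrations.
rng = np.random.default_rng(0)
a, sigma, n, reps = 1.0, 2.0, 5, 200_000

X = rng.normal(a, sigma, size=(reps, n))   # each row is one sample X_1..X_n
Xbar = X.mean(axis=1)                      # sample mean per replicate
dev1 = X[:, 0] - Xbar                      # first component of Ybar

# Empirical covariance between Xbar and X_1 - Xbar; should be near 0.
cov = np.cov(Xbar, dev1)[0, 1]
print(f"empirical cov: {cov:.4f}")
```

Zero correlation alone would not imply independence in general, but here it does precisely because $(\bar{X},\bar{Y})$ is jointly Gaussian, which is the point of the proof.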