Why $\sum_{i=1}^n X_i \sim N(0,n)$ when $X \sim N(0,1)$


I'm trying to understand part of a solution to a hypothesis testing problem. We have $X_1,\ldots,X_n$ independent observations of a random variable $X$, and two hypotheses:

  • $H_0: X \sim N(0,1)$
  • $H_1: X \sim N(1,1)$

In the solution there's a statement that:

when $H_0$ is true, $\sum_{i=1}^n X_i \sim N(0,n)$,

and when $H_1$ is true, $\sum_{i=1}^n X_i \sim N(n,n)$.

My question is: is this true, and if so, how and why?


BEST ANSWER

It is true.

Suppose that $X_1,\ldots,X_n$ are independent and identically distributed $N(a,b)$ random variables. Then
$$ \operatorname E[X_1+\cdots+X_n]=\operatorname E X_1+\cdots+\operatorname E X_n=na $$
by linearity of the expected value, and
$$ \operatorname{Var}[X_1+\cdots+X_n]=\operatorname{Var}X_1+\cdots+\operatorname{Var}X_n=nb $$
by independence. So now we have the expected value and the variance of the sum. However, we still need to show that the sum is itself normally distributed; this can be done in several ways, for example via moment generating functions (as in the other answer) or characteristic functions.
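As a quick numerical sanity check (not a proof), a small Monte Carlo simulation with illustrative values $n=30$ and $20{,}000$ trials shows the empirical mean and variance of the sum landing near $na=0$ and $nb=30$ under $H_0$:

```python
import random
import statistics

random.seed(0)
n = 30          # number of observations per sum (illustrative choice)
trials = 20000  # number of simulated sums

# Each trial: draw n iid N(0,1) values and take their sum.
sums = [sum(random.gauss(0.0, 1.0) for _ in range(n)) for _ in range(trials)]

# Under H0 the sum should be N(0, n): mean ~ 0, variance ~ n = 30.
print("empirical mean:    ", statistics.mean(sums))
print("empirical variance:", statistics.variance(sums))
```

The empirical mean comes out close to $0$ and the empirical variance close to $30$, consistent with $\sum_{i=1}^n X_i \sim N(0,n)$.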

ANSWER

If $X\sim N(\mu,\sigma^2)$, then the MGF of $X$ is given by
\begin{equation*} M_{X}(t)=e^{t\mu + \frac{1}{2}t^2\sigma^2}. \end{equation*}
If $X_{1},\ldots,X_{n}$ are independent random variables with MGFs $M_{X_{i}}(t)$, $i=1,2,\ldots,n$, then the MGF of the sum $\sum_{i=1}^{n}X_{i}$ is
\begin{equation*} M_{\sum X_{i}}(t)=M_{X_{1}}(t)\times M_{X_{2}}(t)\times\cdots\times M_{X_{n}}(t). \end{equation*}
Under $H_{0}$, each $X_{i}\sim N(0,1)$, so
\begin{equation*} M_{\sum X_{i}}(t)=\underbrace{ e^{ \frac{1}{2}t^2}\times e^{ \frac{1}{2}t^2}\times \cdots\times e^{ \frac{1}{2}t^2}}_{n\ \text{factors}}= e^{ \frac{1}{2}nt^2}, \end{equation*}
which is the MGF of $N(0,n)$. In other words, the distribution of $\sum_{i=1}^{n}X_{i}$ is $N(0,n)$. Similarly, it can be shown that under $H_{1}$, $\sum_{i=1}^{n}X_{i}\sim N(n,n)$.
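For completeness, the $H_1$ case by the same argument:

```latex
Under $H_{1}$, each $X_{i}\sim N(1,1)$, so $M_{X_{i}}(t)=e^{t+\frac{1}{2}t^2}$ and
\begin{equation*}
M_{\sum X_{i}}(t)
  =\underbrace{e^{t+\frac{1}{2}t^2}\times\cdots\times e^{t+\frac{1}{2}t^2}}_{n\ \text{factors}}
  =e^{nt+\frac{1}{2}nt^2},
\end{equation*}
which is the MGF of $N(n,n)$, i.e. a normal distribution with mean $n$ and variance $n$.
```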