What is the variance of a variable that is a linear combination of normally distributed random variables?


I have a random variable $x$ which is normally distributed with expected value $\bar{x}$ and variance $\sigma^2$:

$$x\sim N(\bar{x},\sigma^2)$$

Given a sample $x_1,\dots,x_N$ of $N$ independent draws of $x$, the sample mean $\hat{x}=\frac{1}{N}\sum_{i=1}^N x_i$ is itself a random variable and, being a linear combination of normally distributed variables, is normally distributed. Now, many sources on the internet start explaining confidence intervals by saying that the standard deviation of this new random variable is $\frac{\sigma}{\sqrt{N}}$, where $N$ is the sample size of the initial dataset. Can you derive the math to conclude that the variance of the sample mean is the original variance divided by the sample size, $\operatorname{Var}(\hat{x})=\frac{\sigma^2}{N}$ (so that its standard deviation is $\frac{\sigma}{\sqrt{N}}$)?
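For what it's worth, the claim is easy to check numerically before deriving it. Here is a minimal Monte Carlo sketch in Python/NumPy, with made-up example values $\bar{x}=5$, $\sigma=2$, $N=100$ (none of these come from the question itself):

```python
import numpy as np

rng = np.random.default_rng(0)
mu, sigma, N = 5.0, 2.0, 100  # assumed example values, not from the question
trials = 200_000              # number of independent samples of size N

# Draw `trials` samples of size N and compute each sample mean
samples = rng.normal(mu, sigma, size=(trials, N))
means = samples.mean(axis=1)

# The variance of the sample mean should be close to sigma**2 / N,
# and its standard deviation close to sigma / sqrt(N)
print(means.var())  # ≈ sigma**2 / N = 0.04
print(means.std())  # ≈ sigma / N**0.5 = 0.2
```

With 200,000 simulated samples, both estimates agree with $\sigma^2/N$ and $\sigma/\sqrt{N}$ to within a fraction of a percent.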