Sample standard deviation relationship to true standard deviation


If I have a set of observations $x_1, ..., x_n$ which come from the distribution

$X_i \sim N(\mu, \sigma^2)$ independently

then why is the standard deviation of the sample

$\sigma/\sqrt{n}$?

This is a result I have been told by a lecturer, but I'm not sure why it's true. Just to clarify, I am NOT asking about the standard deviation of the sample mean here.



Let $x_1, x_2, x_3, \dots$ be i.i.d. random variables with $x_i \sim N(\mu, \sigma^2)$. Then $(x_1+x_2)/2$ has variance

$$\operatorname{Var}\!\left(\frac{x_1+x_2}{2}\right) = \operatorname{Cov}\!\left(\frac{x_1+x_2}{2},\, \frac{x_1+x_2}{2}\right) = \frac{1}{4}\operatorname{Var}(x_1) + \frac{1}{4}\operatorname{Var}(x_2) + 2\operatorname{Cov}\!\left(\frac{x_1}{2}, \frac{x_2}{2}\right),$$

where $2\operatorname{Cov}(x_1/2,\, x_2/2) = 0$ by independence. Hence

$$\operatorname{Var}\!\left(\frac{x_1+x_2}{2}\right) = \frac{1}{4}\operatorname{Var}(x_1) + \frac{1}{4}\operatorname{Var}(x_2) = \frac{1}{4}(\sigma^2 + \sigma^2) = \frac{\sigma^2}{2}.$$

Taking the square root, the standard deviation of $(x_1+x_2)/2$ is $\sigma/\sqrt{2}$. The same computation applied to $(x_1+x_2+\dots+x_n)/n$ (omitted) shows that its standard deviation is $\sigma/\sqrt{n}$. This argument also uses the fact that $x_1+x_2$ is itself normally distributed, $x_1+x_2 \sim N(2\mu, 2\sigma^2)$; sorry for not proving that here.
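As an empirical sanity check (not part of the original answer), a short Python simulation can confirm the result: repeatedly average $n$ independent $N(\mu, \sigma^2)$ draws and compare the standard deviation of those averages with $\sigma/\sqrt{n}$. The parameter values below are arbitrary choices for illustration.

```python
import math
import random

# Illustrative parameters (arbitrary choices, not from the answer above).
mu, sigma, n, trials = 5.0, 2.0, 25, 100_000
random.seed(0)

# Compute the mean of n i.i.d. N(mu, sigma^2) draws, many times over.
means = []
for _ in range(trials):
    sample = [random.gauss(mu, sigma) for _ in range(n)]
    means.append(sum(sample) / n)

# Empirical standard deviation of the sample means.
grand_mean = sum(means) / trials
sd_of_means = math.sqrt(sum((m - grand_mean) ** 2 for m in means) / trials)

print(sd_of_means)            # empirical value, close to...
print(sigma / math.sqrt(n))   # ...the theoretical sigma / sqrt(n) = 0.4
```

With enough trials the two printed values agree to a couple of decimal places, matching the $\sigma/\sqrt{n}$ derivation above.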