Convergence rate of mean and standard deviation.


I have a random-variable simulator that draws from a Normal $(\mu,\sigma^2)$ distribution. I repeatedly run the simulation.

Each time, the simulation gives $N$ numbers $x_1,x_2,\ldots,x_N$. I use the $N$ numbers to compute sample mean $\bar x$ and sample standard deviation $s$ and want to see whether sample mean and sample sd can predict true $\mu$ and $\sigma$.

It seems that the error of $\bar x$, measured by $$ \left| \frac{\mu-\bar{x}}{\mu}\right|, $$ is larger than that of $s$, measured by $$ \left| \frac{\sigma-s}{\sigma}\right|. $$ Is this true theoretically? If so, how can it be proved?

Or is there any related material I can refer to?

Thank you very much!


BEST ANSWER

It seems to me the result is not true in general. I suspect you have tried simulations in which $|\mu|$ is small and $\sigma$ is relatively large. You are dealing with relative errors, which can be large for estimated quantities near $0.$

If I am following what you say correctly, here is a simulation in R based on $B = 100{,}000$ samples with $n = 10,\; \mu = 1000,\; \sigma = 1.$ I got $MRE(\bar X) < MRE(S).$

 B = 10^5;  n = 10;  mu = 1000;  sg = 1
 DTA = matrix(rnorm(B*n, mu, sg), nrow=B)  # each row a sample of size n
 a = rowMeans(DTA)      # vector of B sample means
 s = apply(DTA, 1, sd)  # vector of B sample SDs
 rel.er.a = abs((a - mu)/mu);  mean(rel.er.a)
 ## 0.0002509101        # estimated MRE(sample mean)
 rel.er.s = abs(s - sg)/sg;   mean(rel.er.s)
 ## 0.1880363           # estimated MRE(sample SD)

Please let me know if I am misinterpreting your question, and I will think about it again in the morning. If not, please try analogous simulations on your own. An enormous advantage of simulation is that you can sometimes save time by not trying to prove something that isn't true.

Note: Perhaps your conjecture is true for normal data when $\mu = \sigma > 0.$ The plot is for $n = 10,\, \mu = \sigma = 1.$ Red lines at expectations.

[Plot: simulated relative errors of $\bar X$ and $S$ for $n = 10,\, \mu = \sigma = 1$; red lines at expectations.]
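The $\mu = \sigma$ case in the note can be checked with an analogous simulation. Here is a sketch in Python using only the standard library (the seed and sample counts are arbitrary choices, not from the original answer), estimating both mean relative errors for $n = 10,\, \mu = \sigma = 1$:

```python
import random
import statistics

random.seed(1)
B, n, mu, sg = 50_000, 10, 1.0, 1.0   # assumed run parameters

mre_mean = 0.0
mre_sd = 0.0
for _ in range(B):
    x = [random.gauss(mu, sg) for _ in range(n)]
    mre_mean += abs(statistics.mean(x) - mu) / abs(mu)   # relative error of sample mean
    mre_sd += abs(statistics.stdev(x) - sg) / sg         # relative error of sample SD
mre_mean /= B
mre_sd /= B

print(mre_mean, mre_sd)   # roughly 0.25 vs 0.19: here MRE(mean) > MRE(sd)
```

So with $\mu = \sigma$ the inequality runs in the direction the questioner conjectured, consistent with the note above.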

ANSWER

Let $$ S^2 = \frac 1 {N-1} \sum_{i=1}^N (X_i - \bar X)^2 $$ where $$ \bar X = \frac {X_1+\cdots+X_N} N. $$ Then $$ (N-1) \frac{S^2}{\sigma^2} \sim \chi^2_{N-1}. $$ (How this conclusion is reached has been posted elsewhere on Stack Exchange, as have proofs that $\bar X$ and $S^2$ are independent.)
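As a quick sanity check of this distributional fact, here is a sketch in Python (standard library only; the values of $B$, $N$, $\mu$, $\sigma$ are arbitrary illustrative choices) that simulates $(N-1)S^2/\sigma^2$ and compares its sample mean and variance with those of $\chi^2_{N-1}$, namely $N-1$ and $2(N-1)$:

```python
import random
import statistics

random.seed(0)
B, N, mu, sigma = 50_000, 10, 5.0, 2.0   # arbitrary illustrative choices

vals = []
for _ in range(B):
    x = [random.gauss(mu, sigma) for _ in range(N)]
    s2 = statistics.variance(x)            # sample variance with divisor N - 1
    vals.append((N - 1) * s2 / sigma**2)

# chi-square with N-1 degrees of freedom has mean N-1 and variance 2(N-1)
print(statistics.mean(vals), statistics.variance(vals))   # close to 9 and 18
```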

Thus the probability distribution of $(N-1) S^2/\sigma^2$ is $$ \frac 1 {\Gamma((N-1)/2)} \left(\frac x 2 \right)^{(N-1)/2-1}e^{-x/2} \,\left(\frac{dx} 2\right), $$ and so the mean absolute deviation of $S/\sigma$ from $1$ is $$ \int_0^\infty \left| \sqrt{\frac x {N-1}} - 1 \right| \frac 1 {\Gamma((N-1)/2)} \left(\frac x 2 \right)^{(N-1)/2-1}e^{-x/2} \,\left(\frac{dx} 2\right). $$
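This integral has no simple closed form, but it is easy to evaluate numerically. Here is a sketch using trapezoidal quadrature over the $\chi^2_{N-1}$ density (pure standard library; the function name, integration cutoff, and grid size are my own choices). For $N = 10$ the result agrees with the simulated value of about $0.188$ in the first answer:

```python
import math

def mad_s_over_sigma(N, upper=80.0, steps=200_000):
    """E|S/sigma - 1| via the chi-square(N-1) density of X = (N-1) S^2 / sigma^2."""
    k = N - 1
    def integrand(x):
        if x <= 0.0:
            return 0.0
        # chi-square density with k degrees of freedom, in the (x/2)^{k/2-1} form
        density = (x / 2) ** (k / 2 - 1) * math.exp(-x / 2) / (2 * math.gamma(k / 2))
        return abs(math.sqrt(x / k) - 1) * density
    h = upper / steps
    total = 0.5 * (integrand(0.0) + integrand(upper))   # trapezoidal rule on [0, upper]
    for i in range(1, steps):
        total += integrand(i * h)
    return total * h

print(mad_s_over_sigma(10))   # about 0.188, matching the simulation above
```

The cutoff at $x = 80$ is safe here because the $\chi^2_9$ tail beyond it is negligible.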