I have a random variable simulator with a Normal$(\mu,\sigma^2)$ distribution, and I repeatedly run the simulation.
Each time, the simulation gives $N$ numbers $x_1,x_2,\ldots,x_N$. I use the $N$ numbers to compute the sample mean $\bar x$ and the sample standard deviation $s$, and I want to see whether the sample mean and sample SD can predict the true $\mu$ and $\sigma$.
It seems that the error of $\bar x$, measured by $$ \left| \frac{\mu-\bar{x}}{\mu}\right|, $$ is larger than that of $s$, measured by $$ \left| \frac{\sigma-s}{\sigma}\right|. $$ Is this true theoretically? If so, how can it be proved?
Or is there any related material I can refer to?
Thank you very much!
It seems to me the result is not true. I think you may have tried simulations in which $|\mu|$ is small and $\sigma$ is larger. You are dealing with relative errors, which can be large when the estimated quantity is near $0.$
If I follow what you say correctly, then here is a simulation of it in R based on $B = 100{,}000$ samples with $n = 10,\; \mu = 1000,\; \sigma = 1.$ I got $\operatorname{MRE}(\bar X) < \operatorname{MRE}(S).$
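A minimal sketch of such a simulation (the original code is not shown here; the seed and matrix layout are my own choices, and I am assuming $\operatorname{MRE}$ means the average of the relative errors over the $B$ samples):

```r
set.seed(2023)                    # hypothetical seed, chosen only for reproducibility
B <- 100000; n <- 10; mu <- 1000; sigma <- 1
x    <- matrix(rnorm(B * n, mu, sigma), nrow = B)  # each row is one sample of size n
xbar <- rowMeans(x)               # B sample means
s    <- apply(x, 1, sd)           # B sample standard deviations
mre.xbar <- mean(abs((mu - xbar) / mu))      # mean relative error of the sample mean
mre.s    <- mean(abs((sigma - s) / sigma))   # mean relative error of the sample SD
c(MRE.xbar = mre.xbar, MRE.s = mre.s)
```

With $\mu = 1000,$ the denominator in the relative error of $\bar X$ is large, which is why $\operatorname{MRE}(\bar X)$ comes out much smaller than $\operatorname{MRE}(S)$ in this setting.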
Please let me know if I am misinterpreting your question, and I will think about it again in the morning. If not, please try analogous simulations on your own. An enormous advantage of simulation is that you can sometimes save time by not trying to prove something that isn't true.
Note: Perhaps your conjecture is true for normal data when $\mu = \sigma > 0.$ The plot is for $n = 10,\, \mu = \sigma = 1,$ with red lines at the expectations.
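The figure itself is not reproduced here; below is a sketch of a simulation that could produce such a plot, assuming the red lines mark $E(\bar X) = \mu$ and, for normal data, $E(S) = c_4\,\sigma$ with $c_4 = \sqrt{2/(n-1)}\,\Gamma(n/2)\big/\Gamma\!\left(\tfrac{n-1}{2}\right)$:

```r
set.seed(2023)                    # hypothetical seed
B <- 100000; n <- 10; mu <- 1; sigma <- 1
x    <- matrix(rnorm(B * n, mu, sigma), nrow = B)
xbar <- rowMeans(x); s <- apply(x, 1, sd)
# mean relative errors in the mu = sigma = 1 case
c(MRE.xbar = mean(abs((mu - xbar) / mu)), MRE.s = mean(abs((sigma - s) / sigma)))
c4 <- sqrt(2 / (n - 1)) * gamma(n / 2) / gamma((n - 1) / 2)   # E(S) = c4 * sigma
par(mfrow = c(1, 2))
hist(xbar, prob = TRUE, col = "skyblue2", main = "Sample Means")
abline(v = mu, col = "red", lwd = 2)          # red line at E(Xbar) = mu
hist(s, prob = TRUE, col = "skyblue2", main = "Sample SDs")
abline(v = c4 * sigma, col = "red", lwd = 2)  # red line at E(S) = c4 * sigma
par(mfrow = c(1, 1))
```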