Find UMVU estimator for $\frac{\mu }{\sigma}$


Let $X_{1},\cdots,X_{n}$ be a random sample from a normal distribution $N(\mu,\sigma^{2})$.

When $\mu$ and $\sigma^{2}$ are unknown, I want to find UMVU estimator for $\frac{\mu}{\sigma}$.

I know that $(\sum_{i=1}^{n}X_{i},\sum_{i=1}^{n}X_{i}^{2})$ is a complete sufficient statistic for $\left(\mu,\sigma^{2}\right)$.

Let $\bar{X}=\frac{1}{n}\sum_{i=1}^{n}X_{i}$, $S^{2}=\frac{1}{n}\sum_{i=1}^{n}X_{i}^{2}$. Then $E(\bar{X})=\mu$ and $E(S^{2})=\sigma^{2}+\mu^{2}$.

So for $\frac{\mu}{\sigma}$, I guess as an estimator $$ Y^{2}=\frac{\bar{X}^{2}}{\frac{1}{n}\sum_{i=1}^{n}X_{i}^{2}-\bar{X}}. $$ But how can I compute $E(Y^{2})$?

Comment. It seems to me you may not be on a useful track. Maybe this will help.

Let $\bar X$ be the sample mean, $S^2$ be the sample variance, and $\tau = \mu/\sigma.$ Then by the method of moments it might be reasonable to look at $\hat \tau = \bar X/S$ as an estimate of $\tau$ based on sufficient statistics. One would not necessarily expect it to be an unbiased estimator (perhaps asymptotically unbiased), but might check to see how biased it is. Because $S$ is known to be slightly negatively biased for $\sigma$ it is not surprising that $\hat \tau$ is slightly positively biased for $\tau.$

I do not want to claim this is the path you should take, but I think it makes more sense than what you have suggested. (Your denominator does not seem feasible: your $S^2$ and $\bar X$ have different dimensions, so the difference $S^2 - \bar X$ is not meaningful.)

With a quick simulation of a million samples of size $n = 10$ from $Norm(\mu = 100, \sigma=20)$ (so that $\tau = 5$), in R statistical software, we can investigate this idea. At the end of my simulation I have shown a modification of your estimator that may have promise.

m = 10^6;  mu = 100; sg = 20;  n = 10
x = rnorm(m*n, mu, sg);  DTA = matrix(x, nrow=m)              # each row a sample
a = rowMeans(DTA);  s = apply(DTA, 1, sd);  tau.hat = a/s     # m-vectors
mean(a);  sd(a);  mean(s);  sd(s)
## 99.99136  # exact is 100
## 6.313811  
## 19.45392  # as known, S slightly negatively biased for sg
## 4.640816
mean(tau.hat)
## 5.469827  # not surprisingly, somewhat positively biased for tau

# investigating your estimator
SS = rowMeans(DTA^2)            # your 'S^2'
your.est = a^2/(SS - a)
mean(your.est)
## 0.9745633                    # nowhere near tau
alt.est = sqrt(a^2/(SS - a^2))  # possible candidate
mean(alt.est)
## 5.765703                     # biased, but in the 'ballpark'

Qwerty's answer is plain wrong, and BruceET's answer is incomplete. I will leave the computations to you, but a reasonable guess for the UMVUE of $\mu/\sigma$ would be $T=C\dfrac{\overline{X}}{S}$, where $C$ is a constant to be computed by you.

First note that UMVUEs are based on complete sufficient statistics (when these exist), and since $(\overline{X},S^{2})$ is complete and sufficient, if you can find a $C$ that makes $E(T)=\mu/\sigma$, then $T$ is your UMVUE by the Lehmann–Scheffé theorem.

So how to compute $C$? Note, $E(T)=CE(\dfrac{\overline{X}}{S})=CE(\overline{X})E(\dfrac{1}{S})$ since for Normal distribution, $\overline{X}$ and $S$ are independent.

You know $E(\overline{X})=\mu$. Now here's a trick to compute $E(\dfrac{1}{S})$.

If $R^2\sim \chi^2_k$, then you can explicitly calculate the pdf of $\dfrac{1}{R}$ (the closely related distribution of $\dfrac{1}{R^{2}}$ is called inverted Gamma), and once you have that pdf you can compute $E(\dfrac{1}{R})$ by working out an integral.
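In fact, $E\left(\frac{1}{R}\right)=E\left((R^{2})^{-1/2}\right)$ can also be read off directly from the $\chi^{2}_{k}$ density without writing down the inverted pdf; a sketch of that integral (my own computation, valid for $k\geq 2$):

$$E\left(\frac{1}{R}\right)=\int_{0}^{\infty}x^{-1/2}\cdot\frac{x^{k/2-1}e^{-x/2}}{2^{k/2}\Gamma\left(\frac{k}{2}\right)}\,dx=\frac{2^{(k-1)/2}\Gamma\left(\frac{k-1}{2}\right)}{2^{k/2}\Gamma\left(\frac{k}{2}\right)}=\frac{\Gamma\left(\frac{k-1}{2}\right)}{\sqrt{2}\,\Gamma\left(\frac{k}{2}\right)}.$$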

Note here that with $S^{2}=\frac{1}{n}\sum_{i=1}^{n}(X_{i}-\overline{X})^{2}$ we have $nS^{2}/\sigma^{2}\sim \chi^{2}_{n-1}$ (with the usual divisor-$(n-1)$ sample variance it is instead $(n-1)S^{2}/\sigma^{2}\sim\chi^{2}_{n-1}$). Hence you can find $E(1/S)$, and setting $E(T)=\mu/\sigma$ then gives $C$.
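Carrying the computation through under the divisor-$n$ convention $S^{2}=\frac{1}{n}\sum_{i=1}^{n}(X_{i}-\overline{X})^{2}$ yields $E(1/S)=\frac{\sqrt{n}}{\sigma}\cdot\frac{\Gamma\left(\frac{n-2}{2}\right)}{\sqrt{2}\,\Gamma\left(\frac{n-1}{2}\right)}$, hence $C=\sqrt{\frac{2}{n}}\cdot\frac{\Gamma\left(\frac{n-1}{2}\right)}{\Gamma\left(\frac{n-2}{2}\right)}$. This is my own computation, not part of the answer, so treat it as a candidate to verify; a quick Monte Carlo check in R:

```r
# Monte Carlo check of the proposed unbiasing constant (my sketch, not from the
# answer): assuming S^2 = (1/n)*sum((X_i - Xbar)^2), so n*S^2/sigma^2 ~ chisq(n-1),
# the constant works out to C = sqrt(2/n) * gamma((n-1)/2) / gamma((n-2)/2).
set.seed(2024)
m = 10^5;  mu = 100;  sg = 20;  n = 10              # true tau = mu/sg = 5
DTA = matrix(rnorm(m*n, mu, sg), nrow = m)          # each row a sample of size n
a = rowMeans(DTA)                                   # sample means
s = sqrt(rowMeans(DTA^2) - a^2)                     # divisor-n standard deviations
C = sqrt(2/n) * gamma((n-1)/2) / gamma((n-2)/2)
mean(C * a / s)                                     # should be close to tau = 5
```

For $n=10$ this gives $C\approx 0.867$, consistent with BruceET's simulation above, where the mean of the divisor-$n$ version of $\overline{X}/S$ came out near $5.77 \approx 5/0.867$.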