Variance of $T_n = \min_i \{ X_i \} + \max_i \{ X_i \}$


Let $X$ be a random variable with distribution $\operatorname{Unif}(0, \theta)$. Draw an i.i.d. sample $X_1, X_2, \ldots, X_n$ and compute the method-of-moments estimator of $\theta$. Compare it with $T_n = \min_i \{ X_i \} + \max_i \{ X_i \}$.

Attempt: Since $\operatorname E X = \theta/2,$ we easily find $$\theta' = \frac {2(X_1 + \cdots + X_n)}{n}. $$ The broad question, as I received it, is to "compare the variances of the two estimators," at finite $n$ or at least asymptotically, but I can't compute $V(T_n)$. Is it possible? I can say $V(T_n) > V(\max_i \{ X_i \})$ because $\min_i \{ X_i \}$ and $\max_i \{ X_i \}$ should be positively correlated, and I am able to find the distribution of $\max_i \{ X_i \}$ and compute its variance.
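For what it's worth, the variance of $\theta'$ itself is routine (using $\operatorname{Var}(X_i) = \theta^2/12$):
$$\operatorname{Var}(\theta') = \operatorname{Var}\!\left(2\bar X\right) = \frac{4}{n}\cdot\frac{\theta^2}{12} = \frac{\theta^2}{3n}.$$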

Thanks!

3 Answers

Accepted answer:

The distribution of either $\max$ or $\min$ is easy; the distribution of $\max+\min$ is more involved.

$$ \max\{X_1,\ldots,X_n\} \le x \text{ if and only if } (X_1\le x\ \&\ \cdots\ \&\ X_n\le x) $$ and by independence, the probability of that is the $n$th power of the probability of $X_1\le x.$ Thus it is $ (x/\theta)^n.$ The density is the derivative of that with respect to $x.$ The density of $\min$ is found similarly. But $\min$ and $\max$ are positively correlated, so the variance of their sum is not simply the sum of their variances.

We have $f_{\min}(x) = \dfrac n {\theta^n} (\theta-x)^{n-1}$ for $0\le x\le\theta.$

Let $I= \text{the index $i$ in } \{1,\ldots,n\} \text{ for which } X_i= \min.$ Given $\min = u$ and $I,$ the remaining $n-1$ observations are i.i.d. $\operatorname{Unif}(u,\theta),$ so their maximum has conditional CDF $\left(\frac{t-u}{\theta-u}\right)^{n-1}.$ Then, for $0\le x\le\theta,$ \begin{align} \Pr(\max+\min\le x) & = \operatorname E(\Pr(\max+\min\le x \mid \min, I)) \\[10pt] & = \operatorname E(\Pr(\max\le x-\min\mid \min,I)) \\[10pt] & = \operatorname E\left( \left( \frac{(x - \min)-\min}{\theta - \min}\right)^{n-1} \right) \\[10pt] & = \int_0^{x/2} \left( \frac{x-2u}{\theta-u} \right)^{n-1} \frac n {\theta^n} (\theta-u)^{n-1} \, du \\[10pt] & = \frac n {\theta^n} \int_0^{x/2} (x-2u)^{n-1} \, du = \frac{x^n}{2\theta^n}. \end{align} (The upper limit of integration is $x/2$ rather than $\theta$ because the conditional probability is $0$ when $\min > x/2.$) For $\theta\le x\le 2\theta,$ the symmetry $X_i \mapsto \theta - X_i$ gives $\Pr(\max+\min\le x) = 1 - \frac{(2\theta-x)^n}{2\theta^n}.$ Differentiating with respect to $x$ gives the density of $\max+\min.$
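A quick numerical sanity check of this CDF (a sketch in Python; $n$, $\theta$, and the seed are illustrative choices, not from the post):

```python
import numpy as np

# Monte Carlo check of the derived CDF: P(min + max <= x) = x^n / (2 theta^n)
# for 0 <= x <= theta.
rng = np.random.default_rng(0)
n, theta, reps = 5, 10.0, 200_000

samples = rng.uniform(0, theta, size=(reps, n))
t = samples.min(axis=1) + samples.max(axis=1)

for x in (2.0, 5.0, 8.0):                 # evaluation points in (0, theta)
    empirical = np.mean(t <= x)
    analytic = x**n / (2 * theta**n)
    print(f"x = {x}: empirical {empirical:.5f} vs analytic {analytic:.5f}")
```

The empirical and analytic values should agree to within Monte Carlo error.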

The best unbiased estimator of $\theta$ in these circumstances is $\dfrac{n+1} n \max\{X_1,\ldots,X_n\}.$ Since the conditional distribution of $\min$ given $\max$ does not depend on $\theta,$ bringing $\min$ into the estimation process after $\max$ is already there just adds noise and makes the variance bigger.
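To quantify this, a routine computation (using $\operatorname{Var}(X_{(n)}) = \frac{n\theta^2}{(n+1)^2(n+2)},$ which follows from the density $n x^{n-1}/\theta^n$ on $(0,\theta)$) gives
$$\operatorname{Var}\!\left(\frac{n+1}{n}X_{(n)}\right) = \left(\frac{n+1}{n}\right)^{\!2}\frac{n\theta^2}{(n+1)^2(n+2)} = \frac{\theta^2}{n(n+2)},$$
which is $O(1/n^2),$ versus the moment estimator's $O(1/n)$ variance $\theta^2/(3n).$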

Another answer:

The joint distribution of the $i^\text{th}$ and $j^\text{th}$ $(i<j)$ order statistics of $U(0,1)$ has density: $$f(u,v) = n!\frac{u^{i-1}}{(i-1)!}\frac{(v-u)^{j-i-1}}{(j-i-1)!}\frac{(1-v)^{n-j}}{(n-j)!}~.$$ In your case $i=1$ and $j=n$, so $f(u,v) = n(n-1)(v-u)^{n-2}$ for $0<u<v<1$; and you need to scale by a factor of $\theta$ to pass from $U(0,1)$ to $U(0,\theta)$, so variances pick up a factor of $\theta^2$. Now, computing the variance of $U_{(1)}+U_{(n)}$ should be straightforward.
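Carrying this out (a sketch, using the standard $U(0,1)$ order-statistic moments $\operatorname{Var}(U_{(k)}) = \frac{k(n+1-k)}{(n+1)^2(n+2)}$ and $\operatorname{Cov}(U_{(i)}, U_{(j)}) = \frac{i(n+1-j)}{(n+1)^2(n+2)}$ for $i\le j$):
$$\operatorname{Var}(U_{(1)}+U_{(n)}) = \frac{n}{(n+1)^2(n+2)} + \frac{n}{(n+1)^2(n+2)} + \frac{2}{(n+1)^2(n+2)} = \frac{2}{(n+1)(n+2)},$$
so $\operatorname{Var}(T_n) = \dfrac{2\theta^2}{(n+1)(n+2)}$ after scaling. This sits between the UMVUE's $\theta^2/(n(n+2))$ and the moment estimator's $\theta^2/(3n)$ for $n\ge 2$ (with equality to the latter at $n=2$, where $T_n = 2\bar X$).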

Another answer:

Comment: A brief simulation of a particular case makes it easy to show graphically the relatively large variability of the method of moments estimator $T_1 = 2\bar X$ and the smaller variability of the UMVUE $T_3=\frac{n+1}{n}X_{(n)},$ while the estimator $T_2 = X_{(1)} + X_{(n)}$ has an intermediate variance (illustrating the findings and comments of @MichaelHardy).

All three estimators are unbiased. The distributions of $T_1$ and $T_2$ are symmetric about $\theta$ (so each has mode $\theta$); the mode of $T_3$ is substantially above $\theta.$

The figure below is based on $100,000$ samples of size $n=5$ from $\mathsf{Unif}(0, \theta=10).$ Standard deviations are roughly $SD(T_1) \approx 2.6,\; SD(T_2) \approx 2.2,$ and $SD(T_3) \approx 1.7.$

[Figure: simulated sampling distributions of $T_1$, $T_2$, and $T_3$ for $n=5$, $\theta=10$.]
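A minimal Python sketch of such a simulation (not the original code; the seed is arbitrary and output values will vary slightly):

```python
import numpy as np

# 100,000 samples of size n = 5 from Unif(0, theta = 10).
rng = np.random.default_rng(1)
n, theta, reps = 5, 10.0, 100_000

x = rng.uniform(0, theta, size=(reps, n))

t1 = 2 * x.mean(axis=1)                    # method of moments: 2 * sample mean
t2 = x.min(axis=1) + x.max(axis=1)         # min + max
t3 = (n + 1) / n * x.max(axis=1)           # UMVUE: (n+1)/n * max

for name, t in (("T1", t1), ("T2", t2), ("T3", t3)):
    print(f"{name}: mean {t.mean():.3f}, sd {t.std():.3f}")
```

With this many replications, all three means should land near $\theta = 10$ and the printed standard deviations near the quoted values ($\approx 2.6,\ 2.2,\ 1.7$).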