Conceptual question about estimators


A random number generator produces uniformly distributed random numbers on the interval $[0,a]$, where $a>0$ is unknown. For the estimation we can draw $n$ independent $\mathcal U_{[0,a]}$ random numbers $X_1,\ldots,X_n$. Find a mathematical formulation for the following estimators and check whether they are biased and consistent.

  1. "Twice the arithmetic mean of the observations"
  2. "The maximum of the observations"

We just started estimators and I am having difficulties finding the mathematical formulation and showing whether the estimators are biased and consistent.

My attempt:

The definitions:

  • An estimator is unbiased if $E[\hat\theta_n]=\theta$
  • An estimator is consistent if $\forall\epsilon>0$: $\lim_{n \to \infty}\Pr(\vert\hat \theta_n-\theta \vert>\epsilon)=0$

Calculations:

  1. Based on the hint and answer from Milten, I want to estimate $a$. Let $\hat\theta=2\bar{X}_n$; then:

$$\begin{split} E[\hat \theta]-a&= E[2\bar{X}_n]-a\\&=E\left[\frac{2}{n}\sum_{i=1}^n X_i \right]-a\\ &=\frac{2}{n} \sum_{i=1}^n E[X_i]-a \\ &= \frac{2}{n}\cdot n\cdot\frac{a}{2}-a \\ &=a-a=0\ \checkmark \end{split}$$
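As a quick sanity check of this calculation, here is a small simulation sketch (the true value $a=3$, the sample size, and the number of repetitions are arbitrary choices for the demo, not part of the problem):

```python
import random

random.seed(0)
a = 3.0      # true parameter, assumed known only for the simulation
n = 1000     # sample size per experiment
reps = 2000  # number of repeated experiments

# Repeatedly draw a sample of n uniforms on [0, a] and apply the
# estimator "twice the arithmetic mean".
estimates = []
for _ in range(reps):
    sample_sum = sum(random.uniform(0, a) for _ in range(n))
    estimates.append(2 * sample_sum / n)

# Unbiasedness suggests the average of the estimates is close to a.
avg = sum(estimates) / reps
print(avg)  # should be close to a = 3.0
```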

My question:

I don't understand what $\theta$ is supposed to be here. Am I trying to estimate twice the mean with the sample I have drawn? Or is $\theta$ supposed to be the mean? More generally, I don't quite get the relationship between $\hat \theta$ and $\theta$.

Best answer:

Hint:

$\theta$ is the parameter we wish to estimate (i.e. the true, unknown value), and $\hat \theta$ is the estimator, i.e. the approximation of the parameter. In your case $\hat \theta$ will be the two estimators given.

So which parameter in the set-up do you think the two estimators approximate?

Example: Say $X_1,\ldots, X_n$ were i.i.d. Bernoulli distributed with $P(X_i=1)=p$. Then the observed mean would be an estimator for $p$, and it would be unbiased since $E(\bar X)=p$. So here $\theta=p$ and $\hat\theta=\bar X$.
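This Bernoulli example can also be checked numerically (a sketch; the value $p=0.3$ and the sample size are arbitrary choices for the demo):

```python
import random

random.seed(1)
p = 0.3        # assumed true success probability for the demo
n = 100_000    # sample size

# Draw n Bernoulli(p) values and take the observed mean, i.e. theta_hat.
xbar = sum(random.random() < p for _ in range(n)) / n
print(xbar)  # should be close to p = 0.3
```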

EDIT: I'll add a bit about estimators in general. There are often several possible choices of estimators for the same parameter, as in this problem. Another common example is the uncorrected/biased sample variance (which is an MLE) versus the corrected/unbiased sample variance. In theory you could declare any function $(\mathbb R_{\ge0})^n \to \mathbb R_{\ge0}$ an estimator for $a$ (e.g. you could always guess $a=0$), but of course we want estimators that approximate the parameter in some sense. Hence the special names for unbiased/consistent/etc. estimators.

For consistency, Chebyshev or the Law of Large Numbers (if you have it available) will work for the first estimator. For the second one, I'd calculate $$ P(|a - \max(X_i)| > \varepsilon) = P(\max(X_i) < a - \varepsilon) $$ directly.
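Carrying that hint through (a sketch, using the independence of the $X_i$ and the uniform CDF): for any $0<\varepsilon<a$,

$$P(\max(X_i) < a-\varepsilon)=\prod_{i=1}^n P(X_i < a-\varepsilon)=\left(\frac{a-\varepsilon}{a}\right)^n \longrightarrow 0 \quad (n\to\infty),$$

so the maximum is a consistent estimator of $a$, even though $\max(X_i)<a$ almost surely, so it is biased.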