Question about Estimators and Notation


While studying the notes for my statistics course, I came across the following two results:

bias($\hat\theta$) = $\operatorname{E}({\hat\theta(X_{1},...,X_{n})})-\theta$

and

MSE($\hat\theta$) = Var($\hat\theta$) + (bias($\hat\theta$))$^2$

where MSE is the mean squared error. I am a little confused about what the distinction is (if any) between $\operatorname{E}({\hat\theta(X_{1},...,X_{n})})$ and $\operatorname{E}(\hat\theta)$, and likewise between $\operatorname{Var}(\hat\theta(X_{1},...,X_{n}))$ and $\operatorname{Var}(\hat\theta)$.

1 Answer

Both notations mean the same thing. $\hat{\theta}(X_{1},...,X_{n})$ just stresses that the estimator is some function of the $n$ random observations $X_{1},...,X_{n}$. For example, if we want to estimate the mean $\theta$ of some iid sample $X_{1},...,X_{n}$ we could use the sample mean as an estimator and write $$\hat{\theta}(X_{1},...,X_{n}):=\hat{\theta}:=\frac{1}{n}\sum_{i=1}^nX_i.$$ Since it is usually clear what the underlying sample is, most of the time people just write $\hat{\theta}$ because it is shorter.
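To make the point concrete, here is a small simulation sketch of the sample-mean estimator and the identity $\text{MSE} = \operatorname{Var} + \text{bias}^2$. The estimator name `theta_hat`, the Gaussian sampling distribution, and the parameter values are all illustrative assumptions, not part of the original question:

```python
import random

random.seed(0)

def theta_hat(xs):
    """Sample-mean estimator: a function of the observations X_1, ..., X_n."""
    return sum(xs) / len(xs)

theta = 5.0    # true mean (assumed known here so we can measure the error)
n = 10         # sample size
trials = 100_000

# Draw many iid samples and record the estimator's value on each one.
estimates = [theta_hat([random.gauss(theta, 2.0) for _ in range(n)])
             for _ in range(trials)]

mean_est = sum(estimates) / trials
bias = mean_est - theta
var = sum((e - mean_est) ** 2 for e in estimates) / trials
mse = sum((e - theta) ** 2 for e in estimates) / trials

print(f"bias        ≈ {bias:.4f}")            # near 0: the sample mean is unbiased
print(f"Var + bias² ≈ {var + bias ** 2:.4f}")
print(f"MSE         ≈ {mse:.4f}")             # agrees with Var + bias²
```

Note that whether we write `theta_hat(xs)` or just think of "$\hat\theta$", it is the same random quantity: a function of the sample, so its expectation and variance are taken over repeated draws of $X_1,\dots,X_n$.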