Question about unbiased estimators


We define an unbiased estimator as follows:

Let $X_1,X_2,\ldots,X_n$ be a random sample from a population with pdf $P_{\theta}$. An estimator $T(X_1,X_2,\ldots,X_n)$ is said to be unbiased for estimating the parameter $\theta$ if $$E(T(X_1,X_2,\ldots,X_n))=\theta.$$

What I don't understand is the following example:

If $x_1,x_2,\ldots,x_n$ is a random sample from a normal population $N(\mu,\sigma^2)$, show that $T=\frac{1}{n}\sum_{i=1}^{n}x_{i}^{2}$ is an unbiased estimator of $\mu^2+1$.
$\textbf{Solution:}$ We have to show that $E(T)=\mu^2+1$,
that is, $E\left(\frac{x_{1}^{2}+\cdots+x_{n}^{2}}{n}\right)=\mu^2+1$.

In the book I am referring to, they solve this question by saying that $E(x_1)=\mu$, i.e., that each $x_i$ follows a normal distribution. My question is: how can $x_i$ follow a normal distribution? I mean, the $x_i$ are not random variables, they are just observed values. And why are we taking the expectation of those values to be the population mean? Please explain what mistake I am making here.
{One thing we know is that $X_1,X_2,\ldots,X_n$ are observations of the random variable $X$ for the population, so if we go into the field and collect data we get $X_1=x_1,\ldots,X_n=x_n$, where $x_1,x_2,\ldots,x_n$ are the sample data.}

1 Answer


The term "population" should not be taken too literally here. This is a question in Probability theory, which is then used in Statistics. The random variables $\{X_i\}_{i=1}^n$ (using capital letters is a bit better here) form an i.i.d. sample from a $N(\mu,\sigma^2)$ distribution. The word "population" and the lower case $x_i$ are used just so that people will feel this is a question in statistics. But at this stage there is no population, just a distribution. In advanced statistics there are procedures, such as the bootstrap and the jackknife, where one actually resamples from a population.

One small correction: The average $T=\frac{1}{n}\sum_{i=1}^{n}X_{i}^{2}$ is an unbiased estimator of $\mu^2+\sigma^2$, rather than $\mu^2+1$.
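
The correction follows from the standard identity $E(X_i^2)=\mathrm{Var}(X_i)+\big(E(X_i)\big)^2=\sigma^2+\mu^2$, so by linearity of expectation $E(T)=\frac{1}{n}\sum_{i=1}^{n}E(X_i^2)=\mu^2+\sigma^2$. A quick Monte Carlo sketch can make this concrete; the parameter values and use of NumPy below are illustrative choices, not part of the original question:

```python
import numpy as np

# Check numerically that E[T] = mu^2 + sigma^2, where
# T = (1/n) * sum(X_i^2) and X_i are i.i.d. N(mu, sigma^2).
rng = np.random.default_rng(0)
mu, sigma, n, reps = 2.0, 3.0, 10, 200_000  # illustrative values

# reps independent samples, each of size n
samples = rng.normal(mu, sigma, size=(reps, n))

# Compute T for each sample, then average over all replications
T = (samples ** 2).mean(axis=1)

print(T.mean())            # should be close to mu^2 + sigma^2
print(mu**2 + sigma**2)    # 13.0 here
```

The empirical mean of $T$ over many replications approximates $E(T)$, and it concentrates near $\mu^2+\sigma^2=13$ for these parameters, not near $\mu^2+1=5$ (the book's stated target only matches when $\sigma^2=1$).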