Mean of the estimator equals the quantity being estimated


I've just started reading up on Monte Carlo methods (sparse background in probability and stats back in undergraduate).

Let $\phi(X)$ be a function of $X \in \mathbb{R}^{N}$, where $X$ follows a complex distribution $P(X)$.

The expectation, $\Phi$, of this function $\phi(X)$ is:

$\Phi = E[\phi(X)] = \int d^{N} X P(X) \phi(X)$

Since $P(X)$ is complex, computing $\Phi$ exactly is difficult, so one approach is to draw samples from $P(X)$ instead.

If the sampling is done $R$ times, giving samples $X^{1}, \dots, X^{R}$, an estimate of $\Phi$ is given by

$\hat{\Phi} = \frac{1}{R} \sum_{r=1}^{R} \phi(X^{r})$.

So, **the expectation of $\hat{\Phi}$ is $\Phi$**.

How should I convince myself of the claim in bold? I can only imagine that it holds when $R$ is extremely large, so that the variance of $\hat{\Phi}$ decreases. There appears to be more to this claim.

On BEST ANSWER

This is a well-studied and well-known concept in statistical inference.

Suppose you want to estimate a quantity $\phi$, say a population parameter (such as the mean) of a statistical model with distribution $\mathcal{P}$, and suppose you come up with an estimator $\hat\phi$ based on samples drawn from $\mathcal{P}$.

Then, $\hat\phi$ is an unbiased estimator of $\phi \hspace{1pt}$ if $ \hspace{1pt} \mathbb{E}_{{\mathcal{P}}}\big[\hat\phi\big] = \phi$.
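In your setting, the claim needs nothing more than linearity of expectation together with the fact that each sample $X^{r}$ is drawn from $P(X)$, so $\mathbb{E}[\phi(X^{r})] = \Phi$ for every $r$:

$\mathbb{E}\big[\hat{\Phi}\big] = \mathbb{E}\left[\frac{1}{R} \sum_{r=1}^{R} \phi(X^{r})\right] = \frac{1}{R} \sum_{r=1}^{R} \mathbb{E}\big[\phi(X^{r})\big] = \frac{1}{R} \sum_{r=1}^{R} \Phi = \Phi.$

Note this holds for any $R \geq 1$, even $R = 1$; a large $R$ only reduces the variance of $\hat{\Phi}$ (which is what makes the estimator consistent), but it is not needed for unbiasedness.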

Relevant links: Estimator, Bias of an estimator, Unbiased Estimator

The sample mean is an unbiased estimator of the population mean: Proof. Under mild conditions (i.i.d. samples with finite variance) it is also the best *linear* unbiased estimator, i.e. the one with minimum variance among linear unbiased estimators, but that doesn't pertain to this question.
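You can also convince yourself numerically. The sketch below (a hypothetical illustration, not from the original question) takes $\phi(x) = x$ and $X \sim \text{Exponential}(1)$, so the true value is $\Phi = \mathbb{E}[X] = 1$; the helper name `phi_hat` is my own. Averaging many independent realizations of $\hat{\Phi}$ stays near $\Phi$ for every $R$, even $R = 1$; only the spread of the realizations shrinks as $R$ grows:

```python
import random

random.seed(0)

# Monte Carlo estimator Phi_hat = (1/R) * sum_r phi(X^r), here with
# phi(x) = x and X ~ Exponential(1), so the true value is Phi = E[X] = 1.
def phi_hat(R):
    return sum(random.expovariate(1.0) for _ in range(R)) / R

# Approximate E[Phi_hat] by averaging many independent realizations:
# the average sits near Phi = 1 regardless of R (unbiasedness), while
# the realizations themselves are far more spread out for small R.
for R in (1, 10, 100):
    realizations = [phi_hat(R) for _ in range(20000)]
    print(R, sum(realizations) / len(realizations))
```

All three printed averages come out close to 1, which is exactly the statement $\mathbb{E}[\hat{\Phi}] = \Phi$ for each choice of $R$.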