Why is this estimator with ill-defined moments useful? And why is the Cauchy PV of its expectation integral a reasonable measure of center?


This question pertains to the paper (available online through JSTOR):

M. H. Quenouille, "Notes on bias in estimation", Biometrika, Vol. 43, Issue 3–4, December 1956, pp. 353–360, https://doi.org/10.1093/biomet/43.3-4.353

In particular I am interested in the following example given in section 6:

[Image: excerpt of the example from Section 6 of the paper.]

In this example $t_n$ is the maximum likelihood estimator for $1/\mu$, $t_n^\prime$ is a reduced-bias estimator for $1/\mu$ obtained by the proposed jackknife method, and the data are distributed as $x_i\sim\mathcal N(2,1)$. The point of the example is to show that the estimator $t_n^\prime$ estimates $1/\mu$ with less "bias" than $t_n$.

Observation:

The estimator $t_n$ can be written as $t_n=1/\bar X_n$ with $\bar X_n\sim\mathcal N(\mu,\sigma^2/n)$. It is easy to show that $E(t_n)$ is undefined: the integral $E(t_n)=\int_{-\infty}^\infty x^{-1}\phi(x;\mu,\sigma^2/n)\,\mathrm dx$ diverges because of the non-integrable singularity of $x^{-1}$ at $x=0$. Therefore neither $t_n$ nor $t_n^\prime$ has an expected value, and one cannot really talk about their biases in the usual sense.
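To make the divergence concrete, here is a short numerical sketch (my own illustration, not from the paper), taking $n=1$ for simplicity so that the integrand is $x^{-1}\phi(x;2,1)$. The one-sided integral over $[\varepsilon,1]$ grows like $\phi(0)\ln(1/\varepsilon)$ as $\varepsilon\downarrow 0$, so the positive part of $E(t_n)$ is infinite:

```python
import math

MU, S2 = 2.0, 1.0  # illustration with n = 1, so Xbar ~ N(2, 1)

def phi(x, mu=MU, s2=S2):
    """Normal density with mean mu and variance s2."""
    return math.exp(-(x - mu) ** 2 / (2 * s2)) / math.sqrt(2 * math.pi * s2)

def tail_integral(eps, npts=200_001):
    """Trapezoid approximation of  integral_eps^1 phi(x)/x dx,  using the
    substitution x = e^u, which turns the integrand into phi(e^u)."""
    lo, hi = math.log(eps), 0.0
    h = (hi - lo) / (npts - 1)
    total = 0.5 * (phi(math.exp(lo)) + phi(math.exp(hi)))
    for k in range(1, npts - 1):
        total += phi(math.exp(lo + k * h))
    return total * h

for eps in (1e-2, 1e-4, 1e-6):
    print(f"eps = {eps:.0e}:  integral over [eps, 1] = {tail_integral(eps):.4f}")

# Each time eps shrinks by a factor of 100, the integral grows by roughly
# phi(0) * ln(100) ~ 0.2486: the integral diverges logarithmically.
```

The same logarithmic blow-up occurs for every $n$, only with the much smaller constant $\phi(0;\mu,\sigma^2/n)$, which is why the divergence is invisible in ordinary simulations.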

Questions:

  1. Given that $E(t_n)$ is undefined, why is $t_n$ a useful estimator of $1/\mu$? Without a well-defined expected value, does $t_n$ actually estimate anything?
  2. The author does assign a value to $E(t_n)$ in terms of the Cauchy principal value (see bottom of page). Why is the Cauchy principal value a useful measure of center for $t_n$?
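As a numerical companion to question 2 (my own sketch, not from the paper): folding the integral about the singularity turns the principal value into an ordinary, absolutely convergent integral, $\mathrm{PV}\,E(t_n)=\int_0^\infty \frac{\phi(x)-\phi(-x)}{x}\,\mathrm dx$, because the odd part of the density vanishes linearly at $x=0$. With $\mu=2$, $\sigma^2=1$, $n=10$, the resulting value agrees closely with the asymptotic expansion $1/\mu+\sigma^2/(n\mu^3)+\cdots$, which is presumably the expansion behind the paper's bias comparison:

```python
import math

MU, SIGMA2, N = 2.0, 1.0, 10
S2 = SIGMA2 / N  # variance of the sample mean

def phi(x, mu=MU, s2=S2):
    return math.exp(-(x - mu) ** 2 / (2 * s2)) / math.sqrt(2 * math.pi * s2)

def pv_expectation(upper=6.0, npts=60_001):
    """Cauchy principal value of  integral phi(x)/x dx,  computed by folding
    the integral about 0:  PV = integral_0^inf (phi(x) - phi(-x))/x dx,
    an ordinary absolutely convergent integral (trapezoid rule)."""
    h = upper / (npts - 1)
    def g(x):
        if x == 0.0:
            # limit of (phi(x) - phi(-x))/x as x -> 0 is 2*phi'(0) = 2*(mu/s2)*phi(0)
            return 2 * (MU / S2) * phi(0.0)
        return (phi(x) - phi(-x)) / x
    total = 0.5 * (g(0.0) + g(upper))
    for k in range(1, npts - 1):
        total += g(k * h)
    return total * h

pv = pv_expectation()
# formal series: 1/mu + sigma^2/(n mu^3) + 3 sigma^4/(n^2 mu^5) + ...
series = (1 / MU) * (1 + S2 / MU**2 + 3 * S2**2 / MU**4)
print(f"PV of E(t_n)      = {pv:.5f}")
print(f"asymptotic series = {series:.5f}   (target 1/mu = {1 / MU})")
```

The principal value sits above $1/\mu=0.5$ by roughly $\sigma^2/(n\mu^3)=0.0125$, the leading $O(1/n)$ bias term that the jackknife estimator $t_n^\prime$ is designed to remove.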