sufficient statistics to estimate the unknown parameters


I am a beginner in statistical inference and am learning about sufficient statistics. As far as I know, the distribution conditional on a sufficient statistic doesn't depend on the unknown parameters. I am wondering: are there any methods that use sufficient statistics to estimate the parameters? Can anyone give a simple example or recommend some books/papers?

This is a very broad question, and a thorough answer would require going through the whole material of a course in statistical inference. However, I'll try to answer in a nutshell and then give references to textbooks.

Basically, a "good" estimator of a parameter should depend on a sufficient statistic for that parameter. Intuitively, such an estimator utilizes all the available information (in the sense of Fisher information) in the sample. However, this property alone is not enough to construct a "good" estimator. Other desirable properties may be of interest, e.g., consistency (convergence of the estimator to the parameter), efficiency (minimizing the mean squared error), and more. There are some families of estimators that use the notion of sufficient statistics. For instance, the Uniformly Minimum Variance Unbiased (UMVU) estimators are a family of estimators that depend on sufficient statistics. Another such family is the Maximum Likelihood (ML) estimators. However, there are other acceptable families of estimators that do not necessarily depend on sufficient statistics, e.g., the method of (sample) moments.

Let's take a simple example. Let $X_1,\ldots,X_n\sim \text{Poiss}(\lambda)$. The unknown parameter is $\lambda$, which can be interpreted as an average rate of events. So, given that $EX = \lambda$, an intuitive estimator is the sample mean, i.e., $\hat{\lambda} = \bar{X}_n$. In fact, it is easy to show that in this case $\bar{X}_n$ is the ML, the UMVU (e.g., by the Lehmann–Scheffé theorem), and the method of moments estimator. It is also a function of a (minimal) sufficient statistic (by the Fisher–Neyman factorization).
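A quick simulation illustrates this. Here is a minimal sketch (the true rate `lam_true = 3.0` and the sample size are arbitrary choices for the demo; the Poisson sampler uses Knuth's classic algorithm so only the standard library is needed). The sufficient statistic is $\sum_i X_i$, and the sample mean, a function of it, recovers $\lambda$:

```python
import math
import random

def sample_poisson(lam, rng):
    """Draw one Poisson(lam) variate via Knuth's algorithm:
    multiply uniforms until the product drops below e^{-lam}."""
    L = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

rng = random.Random(0)
lam_true = 3.0  # hypothetical true rate, chosen for the demo
xs = [sample_poisson(lam_true, rng) for _ in range(10_000)]

# sum(xs) is the sufficient statistic for lambda; the sample mean
# (a function of it) is simultaneously the ML, UMVU, and MoM estimator.
lam_hat = sum(xs) / len(xs)
print(lam_hat)
```

With $n = 10{,}000$ observations, the estimate should land within a few hundredths of the true rate, since the standard error of $\bar{X}_n$ is $\sqrt{\lambda/n} \approx 0.017$.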

However, in general this won't be the case. ML estimators can be computationally intractable, a UMVU estimator may not exist at all, and moments estimators may have other problems (e.g., estimates inconsistent with the support of the data).
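The "inconsistent support" issue can be seen with the standard $\text{Uniform}(0,\theta)$ example (a textbook case, not from the answer above): the method-of-moments estimate $2\bar{X}_n$ can fall below the sample maximum, i.e., it assigns zero density to a value that was actually observed, whereas the ML estimate $\max_i X_i$ (a function of the sufficient statistic) cannot. A small simulation counting how often this happens in repeated small samples:

```python
import random

rng = random.Random(1)
theta_true = 1.0  # hypothetical true upper bound for the demo
trials, n = 1000, 5

# Count samples where the MoM estimate 2*mean falls below max(xs),
# contradicting the observed data; the MLE max(xs) never does.
count = 0
for _ in range(trials):
    xs = [rng.uniform(0, theta_true) for _ in range(n)]
    mom = 2 * sum(xs) / n   # method-of-moments estimate of theta
    mle = max(xs)           # ML estimate, a function of the sufficient statistic
    if mom < mle:
        count += 1
print(count, "of", trials, "samples had a MoM estimate below the sample maximum")
```

The failure occurs in a substantial fraction of small samples, which is why one often prefers estimators built from the sufficient statistic here.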

For further reading, you can start with the classics: Statistical Inference by George Casella and Roger Berger, or Asymptotic Statistics by A. W. van der Vaart. But basically any undergraduate- or graduate-level textbook in statistical inference will suffice.