In parametric statistical inference, a statistic $T$ of some variable $X$ (thought of as experimental or observational data) is sufficient for the parameter $\theta$ if it captures all the information in the data $X$ about the parameter $\theta$ (there are several equivalent definitions; one is that the conditional distribution of $X$ given $T$ does not depend on $\theta$). It is implicitly understood that the random variable $X$ has a probability distribution $f(x;\theta)$, where the parameter $\theta$ is of course unknown.
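To make the "conditional distribution free of $\theta$" formulation concrete, here is a small numerical sketch of my own (not part of the question proper), assuming i.i.d. Bernoulli$(\theta)$ data and the candidate statistic $T(X)=\sum_i X_i$:

```python
from itertools import product
from math import comb

# Exact check of sufficiency for i.i.d. Bernoulli(theta) data:
# the conditional law of X given T(X) = sum(X) should not depend on theta.
def conditional_dist(n, theta):
    """Return P(X = x | sum(X) = t) for every binary vector x, grouped by t."""
    dists = {}
    for x in product([0, 1], repeat=n):
        t = sum(x)
        # joint probability of the sample under theta
        p = theta**t * (1 - theta)**(n - t)
        dists.setdefault(t, {})[x] = p
    # normalise within each level set of T
    for t, d in dists.items():
        total = sum(d.values())
        for x in d:
            d[x] /= total
    return dists

for theta in (0.2, 0.5, 0.9):
    d = conditional_dist(3, theta)
    # each conditional distribution is uniform, 1 / C(3, t), whatever theta is
    assert all(abs(p - 1 / comb(3, t)) < 1e-12
               for t, dist in d.items() for p in dist.values())
```

The check confirms that conditionally on $T = t$ the sample is uniform over the $\binom{n}{t}$ binary vectors with that sum, with no trace of $\theta$ left, which is exactly what sufficiency demands in this example.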
Related is the concept of a minimal sufficient statistic, which captures the essential information and nothing more: it is a sufficient statistic that can be written as a function of every other sufficient statistic.
- Is it true that any random variable $X$ as above has a sufficient statistic?
- Suppose $X$ has a sufficient statistic $T$. Must it also have a minimal sufficient statistic?
I'd be glad to have either a concise proof or a simple counter-example for each of these two.