I only know four examples of families of distributions with conjugate priors:
- Poisson/gamma
- binomial/beta
- exponential/inverse gamma
- normal with known variance/normal
The Bayesian credibility estimate is
$E[X_{n+1} \mid X_1 = x_1, X_2 = x_2, \ldots, X_n = x_n]$
where $X_1, \ldots, X_n$ are, conditional on an unknown parameter with a given prior distribution, an iid sample from a distribution indexed by that parameter, and $X_{n+1}$ is a further independent draw from the same distribution with the same parameter value. This is not simply $E[X_{n+1}]$: because all the observations share the same (unknown) parameter value, they carry information about $X_{n+1}$. The variables are independent only conditionally on the parameter.
It can be shown that this expected value is a linear function of $x_1, \ldots, x_n$ when the distribution and prior are in the four conjugate prior families listed above. It can also be shown that, outside such families, the estimate is not in general a linear function of the data.
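As a quick sanity check of the linearity claim in the Poisson/gamma case (a sketch with made-up hyperparameters and data, not taken from the question): the posterior is $\text{Gamma}(\alpha + \sum x_i,\ \beta + n)$, so the credibility estimate is $(\alpha + \sum x_i)/(\beta + n)$, visibly linear in the $x_i$. The snippet compares that closed form against direct numerical integration of the posterior mean.

```python
from scipy import integrate, stats

# Assumed setup: Poisson likelihood with a Gamma(a, rate=b) prior on the mean theta.
a, b = 2.0, 3.0
data = [1, 4, 0, 2]
n, s = len(data), sum(data)

# Closed-form credibility estimate: posterior is Gamma(a + s, rate = b + n),
# so E[X_{n+1} | data] = E[theta | data] = (a + s) / (b + n), linear in the x_i.
closed_form = (a + s) / (b + n)

# Numerical check: integrate theta against the unnormalised posterior
# (likelihood * prior), then normalise.
def unnorm_post(theta):
    lik = stats.poisson.pmf(data, theta).prod()
    return lik * stats.gamma.pdf(theta, a, scale=1.0 / b)

num, _ = integrate.quad(lambda t: t * unnorm_post(t), 0, 50)
den, _ = integrate.quad(unnorm_post, 0, 50)
print(closed_form, num / den)  # the two agree
```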
My question is, is there a theorem here? Must the existence of conjugate priors lead to the linearity property? If not, can you give an example where this doesn't happen (and in the process, expand my list of conjugate priors!)?
Try $X$ with a uniform distribution on $[0,\theta]$ where $\theta$ has a Pareto distribution with density $\dfrac{\alpha m^\alpha}{\theta^{\alpha+1}}$ when $\theta \gt m$ (and both $m \gt 0$ and $\alpha \gt 0$).
The posterior then depends on the data only through the maximum of the observations, so the conditional expectation is a function of $\max_i x_i$ rather than a linear combination of all of them.
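To make this explicit, here is a short sketch of the posterior and predictive calculation (the shorthand $M$ is introduced here, not in the original):

```latex
p(\theta \mid x_{1:n})
  \;\propto\; \underbrace{\theta^{-n}\,\mathbf{1}\{\theta \ge \max_i x_i\}}_{\text{likelihood}}
  \cdot \underbrace{\theta^{-(\alpha+1)}\,\mathbf{1}\{\theta > m\}}_{\text{prior}}
  \;=\; \theta^{-(\alpha+n+1)}\,\mathbf{1}\{\theta > M\},
  \qquad M := \max(m, x_1, \ldots, x_n),
```

so the posterior is again Pareto, with parameters $\alpha + n$ and $M$: the Pareto family is conjugate for the uniform's upper bound. Since $E[X_{n+1} \mid \theta] = \theta/2$,

```latex
E[X_{n+1} \mid x_{1:n}]
  = \tfrac{1}{2}\, E[\theta \mid x_{1:n}]
  = \frac{(\alpha + n)\, M}{2(\alpha + n - 1)},
```

valid for $\alpha + n > 1$, since a $\text{Pareto}(\alpha+n, M)$ distribution has a finite mean only then. The estimate depends on the data only through $M$, so despite the conjugacy it is not linear in $x_1, \ldots, x_n$.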