Why does the use of Jeffreys distributions not satisfy the likelihood principle?


A commonly used example is a sequence of Bernoulli trials viewed either as a binomial or as a negative binomial sample, with posterior distributions $$\Pi_J^1(\theta) \propto \theta^{-\frac{1}{2}}(1-\theta)^{-\frac{1}{2}} \not\propto \theta^{-1}(1-\theta)^{-\frac{1}{2}} \propto \Pi_J^2(\theta),$$ where the prior is the Jeffreys distribution. But why do Jeffreys distributions not satisfy the likelihood principle?

If you could clarify this to me, I would be very thankful!


(Your $\Pi_J^1$ and $\Pi_J^2$ are the usual Jeffreys prior distributions in the binomial and negative binomial experiments, respectively, but you've mistakenly called them posterior distributions.)

Let ("binomial") Experiment 1 observe the number of successes in $12$ trials, and let ("negative binomial") Experiment 2 observe the number of trials required to get $3$ successes. Now suppose that the outcome of Experiment 1 is $3$ successes, and that the outcome of Experiment 2 is $12$ trials. Then, for both experiments, the likelihood function is proportional to $$\theta^3(1-\theta)^9. $$ Therefore, according to the Likelihood Principle, these outcomes of the two experiments must lead to exactly the same inferences about $\theta$. But the use of Jeffreys priors violates this, because the Jeffreys prior for Experiment 1 is proportional to $\theta^{-1/2}(1-\theta)^{-1/2}$, whereas the Jeffreys prior for Experiment 2 is proportional to $\theta^{-1}(1-\theta)^{-1/2}$, producing two different posterior distributions (hence different inferences about $\theta$) in the two cases.
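As a concrete illustration of the point above (my own sketch, not part of the original answer), the two Jeffreys priors are the Beta(1/2, 1/2) distribution and the improper Beta(0, 1/2) distribution, so combining each with the common likelihood $\theta^3(1-\theta)^9$ gives two different Beta posteriors, with different posterior means:

```python
# Sketch: combine the likelihood theta^3 (1 - theta)^9 with each Jeffreys
# prior and compare the resulting Beta posteriors.
# Recall Beta(a, b) has density proportional to theta^(a-1) (1-theta)^(b-1).

successes, failures = 3, 9  # the same data summary in both experiments

# Experiment 1 (binomial): Jeffreys prior is Beta(1/2, 1/2)
a1, b1 = 0.5 + successes, 0.5 + failures  # posterior Beta(3.5, 9.5)

# Experiment 2 (negative binomial): Jeffreys prior proportional to
# theta^-1 (1 - theta)^(-1/2), i.e. an improper Beta(0, 1/2)
a2, b2 = 0.0 + successes, 0.5 + failures  # posterior Beta(3, 9.5)

# Posterior mean of Beta(a, b) is a / (a + b)
mean1 = a1 / (a1 + b1)  # 3.5 / 13  ≈ 0.269
mean2 = a2 / (a2 + b2)  # 3 / 12.5  = 0.24

# Same likelihood, different inferences about theta:
print(mean1, mean2)
```

The two posterior means (roughly 0.269 versus 0.24) differ even though the likelihood function is identical, which is exactly the violation of the Likelihood Principle described above.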