Can any posterior follow given an appropriate prior?


Let's assume we are given some observation $x$ and two distributions, $p(x \mid \theta)$ and $p(\theta \mid x)$. What are the necessary conditions for a prior $p(\theta)$ to exist such that Bayes' theorem holds, i.e. such that $p(\theta \mid x)$ is exactly the posterior obtained from $p(\theta \mid x) = p(\theta)\, p(x \mid \theta) / p(x)$?

I have only ever seen Bayes' rule stated in terms of probability densities, but I suppose one could think of it in terms of general probability measures $\mathcal{M}(\theta)$ on the parameter space. For an experiment with a fixed observation $x_0$ and likelihood $p(x \mid \theta)$, we could then view Bayesian updating as an operator $\mathcal{B}_{x_0, p(x_0 \mid \theta)}: \mathcal{M}(\theta) \rightarrow \mathcal{M}(\theta)$ that maps a prior to the corresponding posterior. Are there any results on what properties this operator has?
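To make the operator concrete, here is a minimal sketch on a finite parameter space (my own illustration, not part of the question): the update $\mathcal{B}_{x_0}$ is just pointwise multiplication of the prior by the likelihood, followed by renormalization. The names `bayes_update` and the particular numbers are arbitrary.

```python
import numpy as np

def bayes_update(prior, likelihood):
    """Map a prior p(theta) to the posterior p(theta | x0).

    prior      : array of prior probabilities over a finite theta grid
    likelihood : array of p(x0 | theta) on the same grid
    """
    unnormalized = prior * likelihood
    evidence = unnormalized.sum()  # p(x0) = sum_theta p(x0 | theta) p(theta)
    return unnormalized / evidence

# Three parameter values; under theta_3 the observation x0 is impossible:
likelihood = np.array([0.8, 0.1, 0.0])

# Two different priors give different posteriors...
post_a = bayes_update(np.array([1/3, 1/3, 1/3]), likelihood)
post_b = bayes_update(np.array([0.1, 0.8, 0.1]), likelihood)
print(post_a)
print(post_b)

# ...but both put zero posterior mass on theta_3: the posterior is
# always absolutely continuous with respect to prior * likelihood.
assert post_a[2] == 0.0 and post_b[2] == 0.0
```

This already exhibits one property of the operator: states with zero likelihood are annihilated no matter which prior you feed in.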

Philosophically, I guess this question translates into: can two people who agree about the state of the world (i.e. $x$) and about how the world works (i.e. $p(x \mid \theta)$) reach arbitrarily different conclusions (i.e. any posterior $p(\theta \mid x)$) simply because they start with different opinions (i.e. different priors $p(\theta)$), or are there conclusions that can never be reached regardless of one's prior?
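One way to probe this numerically (again my own sketch, under the finite-space assumption above): wherever the likelihood is positive, the update can be inverted via $p(\theta) \propto p(\theta \mid x_0) / p(x_0 \mid \theta)$, so any target posterior supported inside $\{\theta : p(x_0 \mid \theta) > 0\}$ is reachable from some prior, while posteriors placing mass on zero-likelihood states are not.

```python
import numpy as np

likelihood = np.array([0.7, 0.2, 0.0])
# Target posterior supported only where the likelihood is positive:
target_posterior = np.array([0.25, 0.75, 0.0])

# Invert the update: prior proportional to posterior / likelihood
# (set to zero where the likelihood vanishes), then renormalize.
prior = target_posterior / np.where(likelihood > 0, likelihood, 1.0)
prior[likelihood == 0] = 0.0
prior /= prior.sum()

# The forward Bayes update with this prior reproduces the target.
unnorm = prior * likelihood
recovered = unnorm / unnorm.sum()
print(recovered)  # → [0.25 0.75 0.  ]
assert np.allclose(recovered, target_posterior)
```

So on a finite space the answer seems to be: almost any posterior can follow, except those contradicting the likelihood's support.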