Difficulties understanding parametric families of distributions and conjugate


I'm currently taking a class in stochastic processes and Bayesian inference, and I'm having a bit of trouble understanding conjugate priors and parametric families of distributions.

I've got the following definition:

Let $ \Phi$ be a parametric family. A prior $p(\theta)$ belonging to $\Phi$ is said to be a conjugate for the likelihood $p(x|\theta)$ if and only if the posterior $p(\theta | x)$ belongs to $\Phi$.

My interpretation of this, in as simple terms as possible is:

$\Phi$ is a group of different distributions, such as the Gamma or Beta distributions. This means that if $p(\theta)\sim \Gamma(\alpha,\beta)$ and $p(\theta | x)\sim \mathrm{Beta}(\alpha,\beta)$, the definition says that $\Gamma(\alpha,\beta)$ is the conjugate for the likelihood.

Is my interpretation correct? What does it actually mean for the prior to be conjugate for the likelihood? My understanding of the likelihood is that it is just a value?

I do know that these distributions might not form a correct conjugate pair, but I just want to get the general idea before I dive into the mathematics behind it.
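To make my question concrete, here is a small numerical check I put together, assuming the standard Beta–Bernoulli pair (which I believe is conjugate): a Beta prior, a Bernoulli/binomial likelihood, and a posterior that should again be a Beta distribution with updated parameters.

```python
from math import gamma

def beta_pdf(theta, a, b):
    """Density of a Beta(a, b) distribution at theta."""
    return gamma(a + b) / (gamma(a) * gamma(b)) * theta**(a - 1) * (1 - theta)**(b - 1)

# Prior: theta ~ Beta(a, b); data: k successes in n Bernoulli(theta) trials.
a, b = 2.0, 3.0
n, k = 10, 7

# Unnormalised posterior = prior * likelihood (the binomial coefficient
# cancels once we normalise).
step = 0.001
grid = [i * step for i in range(1, 1000)]
unnorm = [beta_pdf(t, a, b) * t**k * (1 - t)**(n - k) for t in grid]
norm_const = sum(unnorm) * step  # crude Riemann-sum normalisation

# Conjugacy claim: the posterior should be Beta(a + k, b + n - k).
post_numeric = [u / norm_const for u in unnorm]
post_closed = [beta_pdf(t, a + k, b + n - k) for t in grid]

max_err = max(abs(x - y) for x, y in zip(post_numeric, post_closed))
print(max_err)  # tiny, so the posterior stays inside the Beta family
```

If I run this, the numerically normalised posterior matches the closed-form Beta(a + k, b + n - k) density everywhere on the grid, which is what I understand "the posterior belongs to the same family $\Phi$" to mean.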

If not, could anyone explain this in as simple terms as possible?

Thanks!