Bayesian updating - likelihood


I have a question about Bayesian updating.

I have a problem involving an event $A$ with three possible outcomes: $(A(1), A(2), A(3))$. I need to estimate the probability of each outcome.

I am able to define the prior probability of each outcome from the literature, say $$P(A(1))=0.8,\quad P(A(2))=0.1,\quad P(A(3))=0.1.$$

Then, by performing some experiments, I can find a pdf for each outcome: the data under each outcome $(A(1), A(2), A(3))$ follow a Gaussian distribution with means $(m_1, m_2, m_3)$ and standard deviations $(s_1, s_2, s_3)$, respectively.

Now, can I update the probability of each outcome by knowing their prior probability and their pdf?

Can I use the pdf as a likelihood and omit the normalizing constant of the traditional Bayes formula?

Thanks.

1 Answer

I don't think you specified a prior on the probabilities of the outcomes; you specified them as constants.

I think what you're describing is a mixture distribution with 3 components. You sample $A \sim [.8,.1,.1]$, and the distribution of the data, conditional on the sampled $A$, would be:

\begin{equation} \begin{split} X | m,s,A & \sim \sum_{i=1}^3\textrm{I}_{(A=A_i)}\textrm{N}(m_i,s_i)\\ A | \pi & \sim \textrm{Multinoulli}(\pi) \end{split} \end{equation}
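To make the two-stage model concrete, here is a minimal forward simulation in Python; the means and standard deviations are placeholder values, not anything given in the question:

```python
import numpy as np

rng = np.random.default_rng(0)

# Prior weights from the question; m_i and s_i are made-up placeholders.
pi = np.array([0.8, 0.1, 0.1])
m = np.array([0.0, 2.0, 5.0])
s = np.array([1.0, 1.0, 1.0])

# A ~ Multinoulli(pi), then X | A ~ N(m_A, s_A).
A = rng.choice(3, size=10, p=pi)
X = rng.normal(loc=m[A], scale=s[A])
print(A)
print(X)
```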

where $\pi$ is the vector of probabilities. Note that if you integrate out $A$, you get:

\begin{equation} X | m,s,\pi \sim \sum_{i=1}^3\pi_i\textrm{N}(m_i,s_i) \end{equation}

This is just a fully specified mixture distribution. In this setting, $\pi$ is a fixed parameter, not something included in the posterior; the only quantity with a posterior distribution here is $A$.
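This already answers the updating question: given an observation $x$, the posterior over $A$ is $P(A=A_i \mid x) \propto \pi_i\,\textrm{N}(x; m_i, s_i)$, and the normalizing constant you'd like to omit is just the sum of the three unnormalized terms, so you can always recover it by dividing. A minimal sketch, again with made-up values for $x$, $m_i$, $s_i$:

```python
import numpy as np
from scipy.stats import norm

pi = np.array([0.8, 0.1, 0.1])  # prior weights from the question
m = np.array([0.0, 2.0, 5.0])   # placeholder means
s = np.array([1.0, 1.0, 1.0])   # placeholder standard deviations
x = 1.3                         # a single hypothetical observation

# Unnormalized posterior: prior times Gaussian likelihood for each outcome.
unnorm = pi * norm.pdf(x, loc=m, scale=s)

# Dividing by the sum gives P(A = A_i | x); the three values sum to 1.
posterior = unnorm / unnorm.sum()
print(posterior)
```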

What you could do instead is place a hyper-prior $\pi\sim\textrm{Dirichlet}(\alpha)$ on the weights. This gives us the model:

\begin{equation} \begin{split} X | m,s,A & \sim \sum_{i=1}^3\textrm{I}_{(A=A_i)}\textrm{N}(m_i,s_i)\\ A | \pi & \sim \textrm{Multinoulli}(\pi)\\ \pi | \alpha &\sim \textrm{Dirichlet}(\alpha) \end{split} \end{equation}

This makes $\alpha$ the parameter at the deepest level of the hierarchy and $\pi$ part of the posterior distribution. Implementing this requires Gibbs sampling: at each iteration, sample the labels $A$ given $\pi$ and the data, then update $\pi$ using the conjugacy of the Dirichlet to the Multinoulli/Multinomial.
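A rough sketch of that Gibbs sampler in Python; the data, means, standard deviations, and $\alpha$ below are all placeholder values for illustration:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

# Placeholder data and fixed component parameters (all assumptions).
x = np.array([0.1, 2.3, -0.5, 5.1, 1.9, 0.4])
m = np.array([0.0, 2.0, 5.0])
s = np.array([1.0, 1.0, 1.0])
alpha = np.array([1.0, 1.0, 1.0])  # Dirichlet hyper-parameter

n_iter = 2000
pi = np.full(3, 1.0 / 3.0)         # initial mixture weights
pi_samples = np.empty((n_iter, 3))

for t in range(n_iter):
    # Sample each label A_n | pi, x_n from a categorical with weights
    # proportional to pi_i * N(x_n; m_i, s_i).
    w = pi * norm.pdf(x[:, None], loc=m, scale=s)  # shape (N, 3)
    w /= w.sum(axis=1, keepdims=True)
    A = np.array([rng.choice(3, p=row) for row in w])

    # Sample pi | A, alpha via Dirichlet-Multinoulli conjugacy:
    # pi ~ Dirichlet(alpha + counts of each component).
    counts = np.bincount(A, minlength=3)
    pi = rng.dirichlet(alpha + counts)
    pi_samples[t] = pi

# Posterior mean of pi, discarding the first half as burn-in.
print(pi_samples[n_iter // 2:].mean(axis=0))
```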