How do you take the product of Bernoulli distribution?


I have a prior distribution,

$$p(\boldsymbol\theta|\pi)=\prod\limits_{i=1}^K p(\theta_i|\pi).$$

$\theta_i$ can equal $0$ or $1$, so I am using a Bernoulli distribution so that

$$p(\boldsymbol\theta|\pi)=\prod\limits_{i=1}^K \pi^{\theta_i}(1-\pi)^{1-\theta_i}.$$

I then want to combine this prior with my marginal likelihood to form my posterior. Should I simplify it as $$p(\boldsymbol\theta|\pi)=\pi^{K\theta_i}(1-\pi)^{K(1-\theta_i)} \, \, ?$$

But then, is the product of Bernoulli distributions the binomial distribution?

Then should my answer be

$$p(\boldsymbol\theta|\pi)=\binom{K}{t}\pi^{t}(1-\pi)^{K-t} $$

where $K$ is the maximum number of $\theta_i$'s allowed and $t \in \{0, 1\}$ (i.e. $t=0$ or $t=1$)?

In what form do I add this prior to my likelihood?


The equation you have (writing the observations as $x_i$ and the parameter as $\theta$) can be simplified as follows: $$p(\boldsymbol x|\theta)=\prod\limits_{i=1}^K \theta^{x_i}(1-\theta)^{1-x_i}=\theta^{\sum_i x_i}(1-\theta)^{K-\sum_i x_i}$$
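A quick numeric sanity check of the identity above; the product of $K$ Bernoulli pmf terms collapses to $\theta^{\sum_i x_i}(1-\theta)^{K-\sum_i x_i}$. The particular values of `theta` and `x` below are arbitrary illustrative choices, not from the post:

```python
import math

# Illustrative choices (assumptions, not from the question)
theta = 0.3
x = [1, 0, 0, 1, 1]  # K = 5 binary observations
K = len(x)

# Left-hand side: the product of K individual Bernoulli pmf terms
product_form = math.prod(theta**xi * (1 - theta)**(1 - xi) for xi in x)

# Right-hand side: the collapsed form using only the count sum(x)
collapsed_form = theta**sum(x) * (1 - theta)**(K - sum(x))

print(product_form, collapsed_form)  # the two values agree
```

Note there is no binomial coefficient here: the product gives the probability of one specific ordered vector $\boldsymbol x$, whereas the binomial distribution describes the count $\sum_i x_i$ over all orderings.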

We have Bayes' rule

$$p(\theta|x)=\frac{p(x|\theta)p(\theta)}{p(x)}$$

Once the prior $p(\theta)$ is specified, the joint density $p(x,\theta)=p(x|\theta)\,p(\theta)$ contains all the information we need: the posterior is simply proportional to it, $p(\theta|x)\propto p(x|\theta)\,p(\theta)$, with $p(x)$ acting as the normalizing constant.
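As a minimal numerical sketch of Bayes' rule with the collapsed Bernoulli likelihood above, one can evaluate likelihood $\times$ prior on a grid of $\theta$ values and normalize. The uniform prior and the data vector here are placeholder assumptions, not from the post:

```python
import numpy as np

# Illustrative binary data (assumption, not from the post)
x = np.array([1, 0, 0, 1, 1])
K, s = len(x), int(x.sum())

# Grid over the parameter theta in (0, 1)
theta_grid = np.linspace(0.001, 0.999, 999)

# Collapsed Bernoulli likelihood: theta^s * (1 - theta)^(K - s)
likelihood = theta_grid**s * (1 - theta_grid)**(K - s)

# Placeholder uniform prior p(theta); swap in any prior of interest
prior = np.ones_like(theta_grid)

# Unnormalized posterior = joint density p(x, theta)
joint = likelihood * prior

# Normalize by p(x), approximated with the trapezoidal rule
posterior = joint / np.trapz(joint, theta_grid)

# With a uniform prior the posterior mode sits at the MLE s/K = 0.6
print(theta_grid[np.argmax(posterior)])
```

With a flat prior the posterior is just the normalized likelihood, which is why the mode lands at $s/K$; a non-uniform prior would shift it.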