Combining two distributions to get a single one?


Consider the following experiment:

Let $X$ be a random variable which has a beta distribution with some fixed parameters.

Let $Y$ be a Bernoulli random variable with parameter $X$.

In other words, we have a two step process: 1) get a number $x \in [0,1]$ from a beta distribution 2) get a number $y \in \{0,1\}$ from a coin flip with heads probability $x$.

Is it possible to say how $y$ will be distributed (in terms of the beta distribution parameters, I suppose)?


Yes. This is a very common application of Bayesian inference. In fact the Wikipedia article for prior probability has your exact example in the lede. The beta distribution is described as a prior distribution, and the Bernoulli distribution is described as a sampling distribution.

Here is the key fact: with an $X \sim \operatorname{Beta}(a,b)$ prior and a $\operatorname{Bernoulli}(X)$ likelihood, the posterior distribution is $\operatorname{Beta}(a', b')$ where $a' = a + s$ and $b' = b + f$, with $s$ the number of successes observed and $f$ the number of failures observed.
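As a small sketch (not part of the original answer), the conjugate update above can be written in a few lines; the function name here is made up for illustration:

```python
def beta_bernoulli_posterior(a, b, observations):
    """Posterior Beta parameters after observing a list of 0/1 outcomes.

    Each success (1) adds one to the first parameter, each failure (0)
    adds one to the second -- the Beta-Bernoulli conjugate update.
    """
    successes = sum(observations)
    failures = len(observations) - successes
    return a + successes, b + failures

# Example: Beta(2, 3) prior, then observe three successes and one failure.
a_post, b_post = beta_bernoulli_posterior(2, 3, [1, 1, 0, 1])
# a_post, b_post == (5, 4), i.e. the posterior is Beta(5, 4)
```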

Before any data has been collected, $a' = a$ and $b' = b$, so we just have $\operatorname{Beta}(a,b)$. As for the marginal distribution of $Y$ itself: $Y$ is Bernoulli with success probability $P(Y=1) = E[X] = \frac{a}{a+b}$, by the law of total probability (condition on $X$, and use the fact that the expected value of a Bernoulli distribution is just its parameter $p$).
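A quick Monte Carlo check (my own sketch, not from the answer) of the two-step process confirms that $P(Y=1)$ matches $\frac{a}{a+b}$:

```python
import random

random.seed(0)
a, b = 2.0, 5.0
n = 200_000

# Two-step sampling: draw x ~ Beta(a, b), then y ~ Bernoulli(x).
hits = 0
for _ in range(n):
    x = random.betavariate(a, b)
    if random.random() < x:
        hits += 1

empirical = hits / n
theoretical = a / (a + b)  # marginal P(Y = 1) = E[X]
# empirical should be close to theoretical (about 0.2857 here)
```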