Expected value of a Bernoulli parameter given a sample


I am a bit stuck with this simple question.

Consider a Bernoulli distribution with an unknown parameter $p$. All we know is that $p = p_L$ with probability $q$ or $p = p_H$ (with $p_H > p_L$) with probability $1-q$, so the expected value of the parameter is $qp_L + (1-q)p_H$.

Question: We are given a single sample from the distribution, denoted $\theta\in\{0,1\}$. What is the expected value of the parameter given this sample?

Context: Since some context was asked for: a common way to quantify one's uncertainty about a Bernoulli parameter is via the beta distribution. I am trying to better understand the nature of the beta distribution by looking at the case where the parameter can take only finitely many values (here, two).


Attempt:

Given the sample $\theta$, I think we need to compute the posterior probabilities $P(p=p_L\mid\theta)$ and $P(p=p_H\mid\theta) = 1- P(p=p_L\mid\theta)$ via Bayes' rule, $P(p=p_L\mid\theta) = P(\theta\mid p=p_L)\,P(p=p_L)/P(\theta)$. For $\theta=0$ this yields: $$P(p=p_L\mid\theta=0) = \frac{(1-p_L)q}{(1-p_L)q+(1-p_H)(1-q)}$$ $$P(p=p_H\mid\theta=0) = \frac{(1-p_H)(1-q)}{(1-p_L)q+(1-p_H)(1-q)}$$ So I think the updated expectation given the sample $\theta=0$ is the posterior-weighted average: $$\frac{(1-p_L)q}{(1-p_L)q+(1-p_H)(1-q)}p_L + \frac{(1-p_H)(1-q)}{(1-p_L)q+(1-p_H)(1-q)}p_H$$ This seems a bit strange to me, since squared terms like $p_L^2$ and $p_H^2$ arise when the products are expanded.
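As a numerical sanity check of this update, here is a short Python sketch using exact fractions. The values chosen for $p_L$, $p_H$, and $q$ are hypothetical, purely for illustration:

```python
from fractions import Fraction

# Hypothetical prior: p = p_L with probability q, p = p_H with probability 1 - q.
p_L, p_H, q = Fraction(1, 4), Fraction(3, 4), Fraction(1, 3)

def posterior_mean(theta):
    """Posterior mean of p after observing one Bernoulli sample theta."""
    # Likelihoods P(theta | p = p_L) and P(theta | p = p_H).
    like_L = p_L if theta == 1 else 1 - p_L
    like_H = p_H if theta == 1 else 1 - p_H
    # Bayes' rule: normalizer is the marginal P(theta).
    z = like_L * q + like_H * (1 - q)
    w_L, w_H = like_L * q / z, like_H * (1 - q) / z
    return w_L * p_L + w_H * p_H

print(posterior_mean(0))  # pulled toward p_L relative to the prior mean
print(posterior_mean(1))  # pulled toward p_H relative to the prior mean
```

With these numbers the prior mean is $7/12 \approx 0.583$, while the posterior mean drops to $9/20 = 0.45$ after seeing $\theta=0$ and rises to $19/28 \approx 0.679$ after seeing $\theta=1$, which matches the intuition that the sample shifts belief toward the corresponding parameter value.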

Answer:

Let $X$ have Bernoulli distribution with parameter $q$, and set $Y=p_LX + p_H(1-X)$. Then \begin{align} \mathbb E[Y] &= \mathbb E[p_LX + p_H(1-X)]\\ &= p_L\mathbb E[X] + p_H\mathbb E[1-X]\\ &=p_L q + p_H(1-q). \end{align}
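As a sanity check tying this back to the attempt above: averaging the posterior means over the marginal distribution of the sample should recover this prior mean (the law of total expectation, $\mathbb E[\mathbb E[p\mid\theta]] = \mathbb E[p]$). A short Python sketch with hypothetical values:

```python
from fractions import Fraction

# Hypothetical prior: p = p_L with probability q, p = p_H with probability 1 - q.
p_L, p_H, q = Fraction(1, 4), Fraction(3, 4), Fraction(1, 3)

prior_mean = q * p_L + (1 - q) * p_H  # E[Y] = p_L q + p_H (1 - q) from the answer

def posterior_mean(theta):
    """Posterior mean of p after one Bernoulli sample theta, via Bayes' rule."""
    like_L = p_L if theta == 1 else 1 - p_L
    like_H = p_H if theta == 1 else 1 - p_H
    z = like_L * q + like_H * (1 - q)  # marginal P(theta)
    return (like_L * q * p_L + like_H * (1 - q) * p_H) / z

# The marginal P(theta = 1) is itself the prior mean of p.
p1 = q * p_L + (1 - q) * p_H
total = (1 - p1) * posterior_mean(0) + p1 * posterior_mean(1)
print(total == prior_mean)  # True: law of total expectation
```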