Bernoulli Uniform Bayes Estimator


My answer comes out as $(p \mid X) \sim \text{Beta}(x+1,\, -x+2)$, indicating that $p_{\text{Bayes}}=\frac{x+1}{3}$, but apparently the correct answer is $p_{\text{Bayes}}=\frac{\sum x_i+1}{n+2}$. I don't understand where this answer comes from; can somebody explain it? Thanks.


You are calculating the posterior distribution incorrectly. I'll use $\theta$ instead of $p$ to avoid confusing the notation, so that $\theta \sim U(0,1)$ and $p(\theta) = 1$ on $(0,1)$:

\begin{align*} p(\theta | X) &\propto p(X | \theta)\, p(\theta)\\ &= p(\theta) \prod_{i=1}^n p(X_i|\theta)\\ &= p(\theta) \prod_{i=1}^n \theta^{X_i} (1- \theta)^{1-X_i}\\ &= 1 \times \theta^{n \bar{X}}(1- \theta)^{n-n\bar{X}}\\ &= \theta^{n \bar{X}}(1- \theta)^{n-n\bar{X}} \end{align*}

which is the kernel of a beta distribution with parameters:

$$ B(\alpha = n \bar{X} + 1, \beta = n-n \bar{X} +1 ) $$

where $n \bar{X} = \sum_{i} X_i$.

Then, the Bayes estimator of $\theta$ under squared-error loss is simply the posterior mean (the mean of a beta distribution with the above parameters):

\begin{align*} \theta_{\text{bayes}} &= E(\theta|X) \\ &= \frac{\alpha}{ \alpha + \beta}\\ &= \frac{ n \bar{X} + 1}{ n \bar{X} + 1 + n-n \bar{X} +1}\\ &= \frac{n \bar{X} +1}{n+2}\\ & = \frac{\sum_i X_i +1}{n+2}\\ \end{align*}
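The closed form can be sanity-checked numerically. A minimal sketch, using hypothetical data (the sample below is illustrative, not from the original problem): the posterior is $\text{Beta}(\sum x_i + 1,\ n - \sum x_i + 1)$, so a Monte Carlo estimate of its mean should match $(\sum x_i + 1)/(n+2)$.

```python
import random

# Hypothetical Bernoulli sample (illustrative only).
x = [1, 0, 1, 1, 0, 1, 0, 1]          # n = 8, sum(x) = 5
n, s = len(x), sum(x)

# Closed-form Bayes estimator under squared-error loss.
theta_bayes = (s + 1) / (n + 2)       # (5 + 1) / (8 + 2) = 0.6

# Monte Carlo check: draw from the posterior Beta(s + 1, n - s + 1)
# and compare the sample mean against the closed form.
random.seed(0)
draws = [random.betavariate(s + 1, n - s + 1) for _ in range(200_000)]
mc_mean = sum(draws) / len(draws)

print(theta_bayes)                    # 0.6
print(abs(mc_mean - theta_bayes) < 0.01)
```

The agreement is just a consequence of beta–Bernoulli conjugacy: the uniform prior is $\text{Beta}(1,1)$, and each observation adds its successes and failures to the shape parameters.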

To find the Bayes estimator of $\theta(1-\theta)$, apply the same idea: the Bayes estimator under squared-error loss is simply the posterior expectation of the quantity we are trying to estimate.

$$ E\big(\theta(1-\theta) \,\big|\, X\big) = E(\theta|X) - E(\theta^2|X) = E(\theta|X) - \big(V(\theta|X) + E(\theta|X)^2\big) $$

We already know $E(\theta|X)$ from above, and the posterior variance $V(\theta|X)$ is just that of a beta distribution:

$$ \frac{\alpha \beta}{ (\alpha + \beta)^2 ( \alpha + \beta + 1)} $$
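Putting the pieces together, a short numerical check (again with hypothetical numbers $n=8$, $\sum x_i = 5$; beta-moment algebra gives the equivalent closed form $\frac{\alpha\beta}{(\alpha+\beta)(\alpha+\beta+1)}$, which the code verifies):

```python
# Hypothetical data summary (illustrative only): n = 8, sum of x = 5.
n, s = 8, 5
alpha, beta_ = s + 1, n - s + 1       # posterior Beta parameters

# Posterior mean and variance of theta.
post_mean = alpha / (alpha + beta_)
post_var = alpha * beta_ / ((alpha + beta_) ** 2 * (alpha + beta_ + 1))

# Bayes estimator of theta*(1 - theta): E(theta|X) - (V(theta|X) + E(theta|X)^2).
est = post_mean - (post_var + post_mean ** 2)

# Equivalent closed form from beta moments.
closed = alpha * beta_ / ((alpha + beta_) * (alpha + beta_ + 1))

print(abs(est - closed) < 1e-12)      # True
```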