Bayes Estimator for Bernoulli Variance

I have the following question, which I have also posted here; however, nobody has answered it, so I thought I would post it here as well.

Let $X_1,\dots,X_n$ be independent, identically distributed random variables with $$ P(X_i=1)=\theta = 1-P(X_i=0) $$

where $\theta$ is an unknown parameter, $0<\theta<1$, and $n\geq 2$. It is desired to estimate the quantity $\phi = \theta(1-\theta) = n\operatorname{Var}\big((X_1+\dots+X_n)/n\big)$.

Suppose that a Bayesian approach is adopted and that the prior distribution for $\theta$, $\pi(\theta)$, is taken to be the uniform distribution on $(0,1)$. Compute the Bayes point estimate of $\phi$ when the loss function is $L(\phi,a)=(\phi-a)^2$.

Now, my solution so far:

It can easily be proven that, under squared-error loss, the Bayes estimate $a$ is the mean of the posterior. Also, as $\theta$ ranges over $(0,1)$, $\phi$ ranges over $(0,\frac{1}{4}]$. Hence, we have that $$ a = \int_0^{\frac{1}{4}}\phi\cdot f(\phi|x_1,\dots,x_n)\,d\phi. $$
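One simplification worth noting (a standard change-of-variables identity, not stated in the original post): since $\phi=\theta(1-\theta)$ is a deterministic function of $\theta$, the same posterior mean can be computed directly in terms of $\theta$, avoiding the transformation to $\phi$ altogether:

$$ a = E[\phi\,|\,x_1,\dots,x_n] = \int_0^1 \theta(1-\theta)\, f(\theta\,|\,x_1,\dots,x_n)\,d\theta. $$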

Now, we have that $$ f(\phi|x_1,\dots,x_n)\propto f(x_1,\dots,x_n|\phi)\cdot \pi(\phi). $$

Given that $\theta$ follows $U(0,1)$, the distribution function of $\phi$ is

$$ P(\Phi\leq t) = 1-\sqrt{1-4t}, \qquad 0<t<\tfrac{1}{4}, $$

since both roots of $\theta(1-\theta)\leq t$, namely $\theta\leq\tfrac{1-\sqrt{1-4t}}{2}$ and $\theta\geq\tfrac{1+\sqrt{1-4t}}{2}$, contribute.

Hence we can derive $\pi(\phi)$. However, I am not sure how to derive $f(x_i|\phi)$.
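For reference, differentiating the CDF (recalling that both roots of $\theta(1-\theta)=t$ contribute, so $P(\Phi\leq t)=1-\sqrt{1-4t}$) gives the prior density of $\phi$:

$$ \pi(\phi) = \frac{d}{d\phi}\left(1-\sqrt{1-4\phi}\right) = \frac{2}{\sqrt{1-4\phi}}, \qquad 0<\phi<\tfrac{1}{4}. $$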

Help proceeding forward and letting me know if I have made any mistakes so far would be very appreciated.

There is 1 answer below.


Since each $X_i$ is a Bernoulli random variable, we can write $f(x_1,\dots,x_n|\theta)= \theta^{\sum_{i=1}^{n} x_i}(1-\theta)^{n-\sum_{i=1}^{n} x_i}$. But $\phi$ is given instead, so invert $\phi = \theta(1-\theta)$, i.e. $\theta = \frac{1\pm\sqrt{1-4\phi}}{2}$, and substitute into the likelihood above.
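As a numerical sanity check (a sketch I am adding, not part of the original answer; the function names and the sample values $n=10$, $s=4$ are hypothetical), note that under the uniform prior the posterior of $\theta$ is Beta$(s+1,\,n-s+1)$ with $s=\sum_i x_i$, and for $\theta\sim\text{Beta}(a,b)$ one has $E[\theta(1-\theta)] = \frac{ab}{(a+b)(a+b+1)}$. The closed form can be compared against a Monte Carlo average:

```python
import random

def bayes_phi_closed_form(s, n):
    # Posterior under a uniform prior is Beta(s+1, n-s+1);
    # for theta ~ Beta(a, b), E[theta*(1-theta)] = a*b / ((a+b)*(a+b+1)).
    a, b = s + 1, n - s + 1
    return a * b / ((a + b) * (a + b + 1))

def bayes_phi_monte_carlo(s, n, draws=200_000, seed=0):
    # Draw theta from the Beta(s+1, n-s+1) posterior and average theta*(1-theta).
    rng = random.Random(seed)
    a, b = s + 1, n - s + 1
    total = 0.0
    for _ in range(draws):
        t = rng.betavariate(a, b)
        total += t * (1 - t)
    return total / draws

n, s = 10, 4  # hypothetical sample: n trials, s successes
exact = bayes_phi_closed_form(s, n)    # (s+1)(n-s+1) / ((n+2)(n+3)) = 35/156
approx = bayes_phi_monte_carlo(s, n)
print(exact, approx)  # the two should agree to roughly 3 decimal places
```

The Monte Carlo route is worth having because it does not require the change of variables to $\phi$ at all: it averages $\theta(1-\theta)$ directly over posterior draws of $\theta$.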