Find the posterior distribution of $\varepsilon$ with a uniform prior on $[0, \frac{1}{6}]$


An unfair die rolls "6" with probability $\frac{1}{6} + \varepsilon$, where $\varepsilon$ has an a priori uniform distribution on $[0, \frac{1}{6}]$. Find the posterior distribution of $\varepsilon$ given that the first toss was "6" and the second toss was "2".

The prior density function is $f_0(x) = 6\, \chi_{[0, \frac{1}{6}]}(x)$ (a uniform density on an interval of length $\frac{1}{6}$), and the probabilities of tossing a six and a "non-six" are $\frac{1}{6} + \varepsilon$ and $\frac{5}{6} - \varepsilon$ respectively, so the probability of obtaining such a sample is $$P_\varepsilon(\underline{X}) = P(X_1 = \text{"6"}, X_2 = \text{not "6"}) = (\tfrac{1}{6} + \varepsilon)(\tfrac{5}{6} - \varepsilon)$$

Therefore the posterior density function would be (please correct me if I'm wrong):

$$f_1(\varepsilon) = \frac{P_\varepsilon(\underline{X}) \cdot f_0(\varepsilon)}{\int \limits_{E} P_t(\underline{X})\, f_0(t)\, dt}$$

$$= \frac{(\frac{1}{6} + \varepsilon)(\frac{5}{6} - \varepsilon)\cdot 6\, \chi_{[0, \frac{1}{6}]}(\varepsilon)}{\int \limits_0^{\frac{1}{6}} (\frac{1}{6} + t)(\frac{5}{6} - t)\cdot 6\, dt}$$

$$ = \frac{162}{5}(\tfrac{1}{6} + \varepsilon)(\tfrac{5}{6} - \varepsilon)\, \chi_{[0, \frac{1}{6}]}(\varepsilon)$$

Is my reasoning valid? Or should there be no characteristic function in the final formula, since we're considering a specific $\varepsilon$ at this point? I'm still confused by continuous variables in the Bayesian approach.
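A quick numerical sanity check of the normalizing constant, in plain Python (the midpoint-rule approach and the variable names are my own, not part of the problem; note that the prior density constant cancels in the posterior anyway):

```python
# Sanity check of the normalizing constant 162/5 via a midpoint-rule integral.
N = 200_000
h = (1 / 6) / N

def unnormalized(e):
    # likelihood of (one "6", one non-"6") times the Uniform[0, 1/6]
    # prior density, which is 6 (any constant here would cancel)
    return (1 / 6 + e) * (5 / 6 - e) * 6

# Z = integral of unnormalized(e) over [0, 1/6]
Z = sum(unnormalized((i + 0.5) * h) for i in range(N)) * h
print(Z)      # ≈ 5/27 ≈ 0.185185
print(6 / Z)  # normalizing factor ≈ 162/5 = 32.4
```

Dividing `unnormalized(e)` by `Z` indeed reproduces $\frac{162}{5}(\frac{1}{6} + \varepsilon)(\frac{5}{6} - \varepsilon)$ on the support.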

Best answer

I thought this was only correct up to a constant factor, but after doing the work I got the same answer.

I was thinking you would have to account for the number of arrangements of the event $\{\textrm{one success}\land\textrm{one failure}\}$, but if you reformulate the problem with exchangeable "successes" and view their number as a Binomial count, you get the same answer.

I'll just put the calculations up in case they help anyone.

Let $\gamma_i = I(X_i = 6)$, so that $1-\gamma_i = I(X_i \neq 6)$, and set $p = P[\gamma_i = 1] = \epsilon + \frac{1}{6}$. Under this specification $p$ has an induced prior of the form $p\sim\textrm{Unif}(\frac{1}{6},\frac{2}{6})$. It also follows that:

\begin{equation} \begin{split} \sum_{i\in\{1,2\}}\gamma_i \sim \textrm{Binom}(2,p) \end{split} \end{equation}

From the exchangeability of $X_1$ and $X_2$, the binomial distribution is appropriate. We observed $\sum_{i\in\{1,2\}}\gamma_i = 1$, which has probability $P[\sum_{i\in\{1,2\}}\gamma_i = 1] = \binom{2}{1}p^1(1-p)^1 = 2p(1-p)$.

Doing the integration, we get the following posterior distribution on $p$:

\begin{equation} \begin{split} P[p|\underline{X}] & = \frac{2p(1-p)\cdot 6\, I(p\in[\frac{1}{6},\frac{2}{6}])}{\int_{\frac{1}{6}}^{\frac{2}{6}} 2p(1-p)\cdot 6\, dp}\\ & = \frac{162}{5}p(1-p)I(p\in[\tfrac{1}{6},\tfrac{2}{6}]) \end{split} \end{equation}

This induces a posterior on $\epsilon$ of the form:

\begin{equation} \begin{split} P[\epsilon|\underline{X}] & = \frac{162}{5}(\epsilon + \frac{1}{6})(\frac{5}{6}-\epsilon)I(\epsilon\in[0,\tfrac{1}{6}]) \end{split} \end{equation}
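As a further sanity check, one can sample from this posterior by rejection: draw $\epsilon$ from the prior, simulate two tosses, and keep $\epsilon$ only when exactly one "6" occurs. A minimal sketch in plain Python (the comparison value $\frac{43}{480}$ is the posterior mean, obtained by integrating $\epsilon \cdot \frac{162}{5}(\epsilon + \frac{1}{6})(\frac{5}{6}-\epsilon)$ over $[0, \frac{1}{6}]$):

```python
import random

random.seed(0)

# Rejection sampling from the posterior: draw epsilon from its Uniform[0, 1/6]
# prior, simulate two tosses, keep epsilon iff exactly one toss shows "6".
kept = []
for _ in range(1_000_000):
    eps = random.uniform(0, 1 / 6)
    p6 = 1 / 6 + eps                   # probability of rolling "6"
    sixes = (random.random() < p6) + (random.random() < p6)
    if sixes == 1:
        kept.append(eps)

mc_mean = sum(kept) / len(kept)
print(mc_mean)  # ≈ 43/480 ≈ 0.0896, slightly above the prior mean 1/12
```

The posterior mean sits above the prior mean $\frac{1}{12}$, as expected: observing a "6" on one of two tosses favors larger values of $\epsilon$.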