Bayesian Problem... I think


Let $X$ be the number of coin tosses until heads is obtained. Without knowing that the coin is fair, I assume that the probability of heads, call it $Y$, is uniformly distributed on $(0,1)$.

How would I find the distribution of $X$? From what I know, I would first need the conditional distribution $X \mid Y = y$, correct? Should I be able to get this from the given information?

I also need to show that the expected value of $X$, $E[X]$, does not exist.

Best answer:

So $Y$ is a random variable uniformly distributed in $(0,1)$, hence $f_Y(y) = 1$ on $(0,1)$, and given $Y = y$ the variable $X$ is geometric: $\Pr(X=x\mid Y=y) = (1-y)^{x-1} y$. Then (substituting $u = 1-y$) \begin{align} \Pr(X=x) & = \mathbb E(\Pr(X=x\mid Y)) = \mathbb E\left( (1-Y)^{x-1} Y\right) = \int_0^1 (1-y)^{x-1} y f_Y(y)\,dy \\[8pt] & = \int_0^1 (1-y)^{x-1} y \,dy = \int_0^1 u^{x-1}(1-u)\,du = \int_0^1 u^{x-1} - u^x \, du \\[8pt] & =\frac1x - \frac{1}{x+1} = \frac{1}{x(x+1)}. \end{align} This also settles the second question: $E[X] = \sum_{x=1}^\infty x\left(\frac1x - \frac1{x+1}\right) = \sum_{x=1}^\infty \frac{1}{x+1}$, a tail of the harmonic series, which diverges, so $E[X]$ does not exist.
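As a quick sanity check on $\Pr(X=x) = \frac{1}{x(x+1)}$, here is a small Monte Carlo sketch (the sample size `N` and the seed are arbitrary choices, not from the original): draw $Y$ uniformly, toss a coin with heads-probability $Y$ until the first head, and compare the empirical frequencies with the exact values.

```python
import random

random.seed(0)

def sample_X():
    """Draw Y ~ Uniform(0,1), then toss a coin with heads-probability Y
    until the first head; return the number of tosses."""
    y = random.random()
    x = 1
    while random.random() >= y:  # tails: keep tossing
        x += 1
    return x

N = 200_000
counts = {}
for _ in range(N):
    x = sample_X()
    counts[x] = counts.get(x, 0) + 1

for x in (1, 2, 3, 4):
    est = counts.get(x, 0) / N
    exact = 1 / (x * (x + 1))
    print(f"P(X={x}): simulated {est:.4f}, exact {exact:.4f}")
```

The simulated frequencies should land close to $\frac12, \frac16, \frac1{12}, \frac1{20}$ for $x = 1, 2, 3, 4$.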

Postscript some hours later: Suppose one wants to know the conditional probability distribution of $Y$ given $X$. The fact that $X=x$ means that when the coin was repeatedly tossed, the first time a "head" appeared was on the $x$th trial. Suppose instead one had observed the random variable $W$, defined as the number of "heads" in the first $x$ trials. If it had turned out that $W=1$, then that would not imply that $X=x$, but it would at least be consistent with $X=x$. Here's an exercise: Show that the conditional distribution of $Y$ given that $X=x$ is the same as the conditional distribution of $Y$ given that $W=1$, despite the fact that $\Pr(X=x) \ll \Pr(W=1)$.
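For readers who want a numerical check on the exercise before proving it, the sketch below (with $x = 3$ and the sample size chosen arbitrarily for illustration) draws posterior samples of $Y$ two ways: conditioning on the first head arriving at trial $x$, and conditioning on exactly one head in the first $x$ trials. If the exercise's claim holds, the two empirical posteriors should agree.

```python
import random

random.seed(1)

x = 3          # illustrative choice: condition on the first head at trial 3
N = 300_000

# Posterior samples of Y given X = x: keep Y only when the simulated
# first-head time equals x.
post_X = []
for _ in range(N):
    y = random.random()
    t = 1
    while random.random() >= y:
        t += 1
    if t == x:
        post_X.append(y)

# Posterior samples of Y given W = 1: keep Y only when exactly one head
# occurs in the first x tosses.
post_W = []
for _ in range(N):
    y = random.random()
    heads = sum(random.random() < y for _ in range(x))
    if heads == 1:
        post_W.append(y)

mean_X = sum(post_X) / len(post_X)
mean_W = sum(post_W) / len(post_W)
print(f"E[Y | X={x}] ~ {mean_X:.3f}")
print(f"E[Y | W=1]  ~ {mean_W:.3f}")
```

The two sample means should be nearly identical, consistent with the two conditional distributions being the same.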