Biased coin toss


Let $p$, $q$ be values in $[0,1]$ and $\alpha \in [0,1]$. Assume $\alpha$ and $q$ are known, and that $p$ is an unknown parameter we would like to estimate. A coin is tossed $N$ times, resulting in the sequence of $\{0,1\}$-valued random variables $X_1, X_2, \dots, X_N$. At each toss, independently of all other tosses, the coin's success probability is $p$ with probability $\alpha$ and $q$ with probability $1-\alpha$. What is the probability function, and what is the MLE of $p$?

I think the success probability should be something like $p\alpha + q(1-\alpha)$, but I don't know for sure.


On BEST ANSWER

As you said, $P(X_i=1) = p\alpha+q(1-\alpha)$.
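A quick simulation supports this (the parameter values below are hypothetical, chosen only for illustration):

```python
import numpy as np

# Hypothetical parameter values, chosen for illustration only.
alpha, q, p_true = 0.7, 0.2, 0.6
rng = np.random.default_rng(0)

N = 100_000
# Each toss: with probability alpha the success probability is p_true, else q.
use_p = rng.random(N) < alpha
probs = np.where(use_p, p_true, q)
x = (rng.random(N) < probs).astype(int)

# The empirical frequency of successes should approach p*alpha + q*(1-alpha).
print(x.mean(), p_true * alpha + q * (1 - alpha))
```

With $N = 100{,}000$ tosses the empirical frequency should agree with $p\alpha + q(1-\alpha) = 0.48$ to about two decimal places.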

The probability mass is $f(x_i;p)=p^{x_i}(1-p)^{1-x_i}\alpha+q^{x_i}(1-q)^{1-x_i}(1-\alpha)$. Since $x_i \in \{0,1\}$, this is the same as $(p\alpha+q(1-\alpha))^{x_i}((1-p)\alpha+(1-q)(1-\alpha))^{1-x_i}$, i.e. $X_i$ is Bernoulli with success probability $p\alpha+q(1-\alpha)$.

And finally the likelihood function is $L(p)=\prod_{i=1}^N f(x_i;p)$
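The likelihood (on the log scale, for numerical stability) can be sketched like this; the data and parameter values are hypothetical:

```python
import numpy as np

def log_likelihood(p, x, alpha, q):
    """Log of L(p) = prod_i [p^x (1-p)^(1-x) * alpha + q^x (1-q)^(1-x) * (1-alpha)].

    For x in {0,1} each factor is p*alpha + q*(1-alpha) when x == 1,
    and (1-p)*alpha + (1-q)*(1-alpha) when x == 0.
    """
    x = np.asarray(x)
    term = np.where(x == 1,
                    p * alpha + q * (1 - alpha),
                    (1 - p) * alpha + (1 - q) * (1 - alpha))
    return np.log(term).sum()

# Toy data (hypothetical): 7 successes, 3 failures.
x = [1, 1, 1, 1, 1, 1, 1, 0, 0, 0]
print(log_likelihood(0.5, x, alpha=0.7, q=0.2))
```

Because each factor only depends on whether $x_i$ is 0 or 1, the product collapses to the two-term form used below.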

Edit (thanks to @angryavian)

You can use $N_1$ for the number of successes among your tosses and $N_0$ for the number of failures. Then $L(p)=(p\alpha+q(1-\alpha))^{N_1}((1-p)\alpha+(1-q)(1-\alpha))^{N_0}$

To simplify, look for the maximum of $\log L(p)$ rather than $L(p)$; the logarithm is increasing, so both have the same maximizer.

$\log L(p)=N_1 \log(p\alpha+q(1-\alpha))+N_0 \log((1-p)\alpha+(1-q)(1-\alpha))$

${\partial \log L(p)\over \partial p }= N_1{\alpha \over p\alpha+q(1-\alpha)} -N_0{ \alpha\over (1-p)\alpha+(1-q)(1-\alpha)} =0 $

Let's multiply by $(p\alpha+q(1-\alpha))((1-p)\alpha+(1-q)(1-\alpha))$ and divide by $\alpha$:

$0=N_1((1-p)\alpha+(1-q)(1-\alpha))-N_0(p\alpha+q(1-\alpha))$

Solving for $p$, with $N=N_0+N_1$:

$p={ \frac{N_1}{N}-q(1-\alpha) \over \alpha}$

As a sanity check: the MLE of the overall success probability $p\alpha+q(1-\alpha)$ is the sample frequency $N_1/N$, and solving $p\alpha+q(1-\alpha)=N_1/N$ for $p$ gives the same expression.

And that would be your MLE, provided it falls in $[0,1]$; otherwise the constrained maximum is at the nearer endpoint.
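A minimal numerical check (with hypothetical counts and parameters): since the mixture success probability $p\alpha+q(1-\alpha)$ has MLE $N_1/N$, solving for $p$ gives $\hat p = (N_1/N - q(1-\alpha))/\alpha$, which we can compare against a brute-force grid search over the log-likelihood:

```python
import numpy as np

def mle_p(n1, n0, alpha, q):
    """Closed-form MLE p_hat = (N1/N - q*(1-alpha)) / alpha, clipped to [0, 1]."""
    p_hat = (n1 / (n1 + n0) - q * (1 - alpha)) / alpha
    return min(max(p_hat, 0.0), 1.0)

# Hypothetical counts and parameters.
alpha, q, n1, n0 = 0.7, 0.2, 620, 380

# Brute-force: evaluate log L(p) on a fine grid and take the argmax.
grid = np.linspace(0, 1, 100_001)
ll = (n1 * np.log(grid * alpha + q * (1 - alpha))
      + n0 * np.log((1 - grid) * alpha + (1 - q) * (1 - alpha)))
print(mle_p(n1, n0, alpha, q), grid[np.argmax(ll)])
```

Here $N_1/N = 0.62$, so $\hat p = (0.62 - 0.06)/0.7 = 0.8$, and the grid search lands on the same value.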