Estimator for binomial distribution


I'm working on the following problem from my introduction to mathematical statistics book.

We have an urn with a ratio of white balls to black balls of $\frac{p}{1-p}$. We draw balls one by one with replacement, continuing until we draw a white ball. Let $Y_{i}$ be the number of draws necessary. We repeat this process $n$ times, giving numbers $Y_{1},\ldots,Y_{n}$. Determine the maximum likelihood estimator for $p$.

I denote the estimator by $\hat{p}$, which should estimate the probability $p$ of drawing a white ball on any single draw. I take the log of the likelihood to find the estimator.

$\log \prod_{i=1}^{n} (1-\hat{p})^{n}\hat{p}=\sum_{i=1}^{n}\log\left((1-\hat{p})^{n}\hat{p}\right)=n^{2}\log(1-\hat{p})+n\log(\hat{p})$.

Now I differentiate this with respect to $\hat{p}$ and find $\frac{d}{d \hat{p}}=\frac{n}{\hat{p}}-\frac{n^{2}}{1-\hat{p}}$, and if I set this equal to $0$ I find $\hat{p}=\frac{n}{n-n^{2}}$, which I find hard to believe, since I would expect this to be related to the given ratio.

Any suggestions on where I go wrong?

Cheers!

Best Answer

Thanks to JimB I worked out the following solution:

We let $\hat{p}$ be an estimator for $p$ and write down the likelihood of the sample, $f(\hat{p})=\prod_{i=1}^{n}(1-\hat{p})^{Y_{i}-1}\hat{p}$, where each factor $(1-\hat{p})^{Y_{i}-1}\hat{p}$ is the probability that the $i$-th experiment needs exactly $Y_{i}$ draws ($Y_{i}-1$ black balls followed by one white ball). Maximizing the log-likelihood then yields the maximum likelihood estimator for $p$.

\begin{equation} \begin{split} \log f(\hat{p}) &= \log \prod_{i=1}^{n}(1-\hat{p})^{Y_{i}-1}\hat{p}\\ &= \sum_{i=1}^{n} \log\left((1-\hat{p})^{Y_{i}-1}\hat{p}\right)\\ &= \sum_{i=1}^{n} \log\left((1-\hat{p})^{Y_{i}-1}\right) + \sum_{i=1}^{n}\log(\hat{p})\\ &= \sum_{i=1}^{n} (Y_{i}-1) \log (1-\hat{p}) + n \log (\hat{p})\\ &= (n\bar{Y}-n)\log (1-\hat{p})+n \log (\hat{p})\\ &= n\bar{Y}\log(1-\hat{p})-n\log(1-\hat{p})+n\log(\hat{p}), \end{split} \end{equation}

where $\bar{Y}=\frac{1}{n}\sum_{i=1}^{n}Y_{i}$ is the sample mean of the draw counts.

Now we differentiate the log-likelihood with respect to $\hat{p}$ and set it equal to zero to find the maximum likelihood estimator for $p$.

\begin{equation} \begin{split} \frac{d \log f(\hat{p})}{d \hat{p}} &= \frac{n}{\hat{p}}+\frac{n}{1-\hat{p}}-\frac{n\bar{Y}}{1-\hat{p}} \\ &= \frac{n(1-\bar{Y})}{1-\hat{p}} +\frac{n}{\hat{p}}\\ &= \frac{n(1-\hat{p}\bar{Y})}{\hat{p}(1-\hat{p})}=0 \Rightarrow\\ \hat{p} \bar{Y} &= 1 \Rightarrow\\ \hat{p} &= \frac{1}{\bar{Y}}. \end{split} \end{equation}
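As a quick sanity check on the closed form (not part of the original derivation), a crude grid search over $\hat{p}$ should find the maximum of the log-likelihood at $1/\bar{Y}$. The sample `ys` below is made up for illustration.

```python
import math

def log_likelihood(p, ys):
    """Geometric log-likelihood: sum_i [(Y_i - 1) log(1 - p) + log p]."""
    n, s = len(ys), sum(ys)
    return (s - n) * math.log(1 - p) + n * math.log(p)

ys = [3, 1, 4, 2, 5]                      # hypothetical draw counts Y_1..Y_5
grid = [k / 1000 for k in range(1, 1000)]  # candidate values of p in (0, 1)
p_star = max(grid, key=lambda p: log_likelihood(p, ys))

closed_form = len(ys) / sum(ys)            # 1 / mean(Y) = 5/15 = 1/3
print(p_star, closed_form)                 # grid maximizer agrees up to grid spacing
```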

Thus, $\hat{p}=\frac{1}{\bar{Y}}$ is the maximum likelihood estimator for $p$: the reciprocal of the average number of draws needed to get a white ball.
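The estimator can also be checked by simulation (my own addition, using only the standard library): repeat the urn experiment $n$ times for a known $p$ and confirm that $1/\bar{Y}$ lands close to it.

```python
import random

def simulate_phat(p, n, seed=0):
    """Simulate n rounds of drawing with replacement until the first white
    ball (probability p per draw) and return the MLE p_hat = 1 / mean(Y_i)."""
    rng = random.Random(seed)
    total_draws = 0
    for _ in range(n):
        y = 1
        while rng.random() >= p:  # black ball drawn; draw again
            y += 1
        total_draws += y
    return n / total_draws        # 1 / Y-bar

p_hat = simulate_phat(p=0.3, n=100_000)
print(p_hat)  # close to the true p = 0.3
```

Each $Y_i$ is geometric with mean $1/p$, so by the law of large numbers $\bar{Y}\to 1/p$ and hence $\hat{p}=1/\bar{Y}\to p$.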