Let $X_1,X_2,\dots,X_n$ be a random sample with $X_i\sim\mathrm{Binomial}(m,p)$ for $i=1,\dots,n$, where $m\in\{1,2,3,\dots\}$ and $p\in(0,1)$. We assume $m$ is known and we are given the data $x_1,\dots,x_n\in\{0,\dots,m\}$.
Write up the log-likelihood function and find the MLE $\hat p_{ML}$ for $p$.
I'm not quite sure how to approach this. This is what I've tried:
I believe the likelihood function of a Binomial trial is given by
$$P_{X_i}(x;m)=\binom{m}{x}p^x(1-p)^{m-x}$$
From here I'm kind of stuck. I'm uncertain how to find/calculate the log-likelihood function.
I've understood finding the MLE as taking the derivative with respect to the parameter (here $p$, since $m$ is known), setting the equation equal to zero and solving for the parameter, like with most maximization problems. So finding the log-likelihood function seems to be my problem.
Edit: I might be misunderstanding it, but could the log-likelihood function simply be the log of the likelihood function? So $\log(P_{X_i}(x;m))$?
The likelihood is
$$L(p)=\prod_{i=1}^nP_p(X=x_i)=\prod_{i=1}^{n}{m\choose x_i}p^{x_i}(1-p)^{m-x_i}$$
The log-likelihood is thus
$$\log L(p)=\log\left(\prod_{i=1}^{n}{m\choose x_i}\right)+\log(p)\sum_{i=1}^nx_i+\log(1-p)\left(nm-\sum_{i=1}^nx_i\right)$$
Let $M=\log\left(\prod_{i=1}^{n}{m\choose x_i}\right)$ (which does not depend on $p$):
$$\log L(p)=M+\log(p)\sum_{i=1}^nx_i+\log(1-p)\left(nm-\sum_{i=1}^nx_i\right)$$
$$\log L(p)=M+n\log(p)\bar x+n\log(1-p)(m-\bar x)$$
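As a sanity check, the log-likelihood above can be evaluated directly in a few lines of Python (the sample data here is hypothetical; `lgamma` is used so the binomial coefficients don't overflow for large $m$):

```python
import math

def log_likelihood(p, xs, m):
    """Log-likelihood of p for i.i.d. Binomial(m, p) observations xs."""
    n = len(xs)
    s = sum(xs)  # sum of the x_i
    # M = log of the product of binomial coefficients; does not depend on p,
    # so it could be dropped when only the maximizer is needed.
    M = sum(math.lgamma(m + 1) - math.lgamma(x + 1) - math.lgamma(m - x + 1)
            for x in xs)
    return M + s * math.log(p) + (n * m - s) * math.log(1 - p)
```

This matches taking the log of the product of the individual binomial pmfs term by term.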
$$\dfrac{\partial\log L}{\partial p}=\frac{n\bar x}{p}-\frac{n(m-\bar x)}{1-p}$$
This last expression is zero if
$$\frac{\bar x}{p}=\frac{m-\bar x}{1-p}$$
$$\bar x(1-p)=(m-\bar x)p$$
$$\bar x-\bar xp=mp-\bar xp$$
$$\bar x=mp$$
Hence the ML estimator is:
$$\hat p=\dfrac{\bar x}{m}$$
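A quick numerical check (with a hypothetical sample) confirms that the log-likelihood, with the constant $M$ dropped, really does peak at $\bar x/m$:

```python
import math

def mle_p(xs, m):
    """Closed-form MLE: sample mean divided by m."""
    return sum(xs) / (len(xs) * m)

def ll(p, xs, m):
    """Log-likelihood up to the additive constant M."""
    s = sum(xs)
    return s * math.log(p) + (len(xs) * m - s) * math.log(1 - p)

# Grid search over p in (0, 1); the maximizer should be x-bar / m.
xs, m = [2, 3, 1, 4, 2], 5
grid = [i / 1000 for i in range(1, 1000)]
best = max(grid, key=lambda p: ll(p, xs, m))
```

Here $\bar x = 2.4$ and $m = 5$, so both the grid search and the closed form give $\hat p = 0.48$.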