I have $X_1,X_2,\dots,X_n$ as random samples from a binomial distribution, with probability function:
$$p_X(x) = \Pr(X=x) = {m \choose x}\alpha^x(1-\alpha)^{m-x},\quad x=0,1,2,\dots,m,$$ where $m$ is given and $\alpha \in (0,1)$ is an unknown parameter.
I want to show that the maximum likelihood estimator of $\alpha$ is given by the sample mean over $m$.
Now I imagine the first thing I must do is find the maximum likelihood estimator.
I believe this is found by writing down the joint distribution of my data and setting its first derivative to $0$; is this the right direction?
The likelihood function, assuming the samples are independent, is $$L(\alpha) = \prod_{i=1}^n p_X(x_i)= \prod_{i=1}^n {m \choose {x_i} } \alpha^{x_i} (1-\alpha)^{m-x_i}.$$ The next step is to take the logarithm of the likelihood function, $$ \log L(\alpha) = \sum_{i=1}^n\log{m \choose {x_i} }+\sum_{i=1}^n x_i \log\alpha +\sum_{i=1}^n(m-x_i) \log(1-\alpha), $$ and to set its derivative with respect to $\alpha$ to zero, that is, $$ \frac{d \log L(\alpha)}{d \alpha}=\frac{1}{\alpha}\sum_{i=1}^n x_i -\frac{1}{1-\alpha}\sum_{i=1}^n (m-x_i)=0. $$ Multiplying through by $\alpha(1-\alpha)$ gives $(1-\alpha)\sum_{i=1}^n x_i = \alpha\left(nm-\sum_{i=1}^n x_i\right)$, i.e. $\sum_{i=1}^n x_i = nm\alpha$. Hence the MLE is $$\hat \alpha=\frac1n\sum_{i=1}^n \frac{x_i}m=\frac{\bar x}{m},$$ the sample mean divided by $m$. (The second derivative of $\log L$ is negative on $(0,1)$, so this stationary point is indeed a maximum.)
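As a sanity check, the derivation above can be verified numerically: simulate binomial samples, compute the closed-form estimate $\bar x/m$, and compare it against a brute-force grid maximization of the log-likelihood (the binomial-coefficient terms are dropped since they do not depend on $\alpha$). This is just an illustrative sketch; the parameter values `m = 10`, `alpha_true = 0.3`, `n = 5000` are arbitrary choices, not from the question.

```python
import numpy as np

rng = np.random.default_rng(0)
m, alpha_true, n = 10, 0.3, 5000          # assumed values for illustration
x = rng.binomial(m, alpha_true, size=n)   # simulated sample x_1, ..., x_n

# Closed-form MLE from the derivation: sample mean divided by m.
alpha_hat = x.mean() / m

# Log-likelihood on a fine grid, omitting the log C(m, x_i) terms,
# which are constant in alpha and do not affect the maximizer.
alphas = np.linspace(1e-4, 1 - 1e-4, 10001)
loglik = x.sum() * np.log(alphas) + (m * n - x.sum()) * np.log(1 - alphas)
alpha_grid = alphas[np.argmax(loglik)]

print(alpha_hat, alpha_grid)  # both should be close to alpha_true = 0.3
```

The grid maximizer agrees with $\bar x/m$ up to the grid spacing, which is what the analytic argument predicts.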