Given $n$ observations $X_1, X_2, \dots, X_n$ from a random sample with Bernoulli probability function $$ \operatorname{Pr}(X=k) = p^k(1-p)^{1-k}, \text{ for }k=0,1. $$ Denote the method-of-moments estimator of $p$ by $\tilde{p}$ and the maximum likelihood estimator by $\hat{p}$.
Here is what I got so far.
For MOM, equating the first moment to the sample mean: $$ \operatorname{E}(X) = \mu_{\text{sample}} \;\longrightarrow\; \tilde{p} = \dfrac{\sum_{i=1}^n X_i}{n}. $$
For MLE, the log-likelihood is $$ l(p;x) = \sum_{i=1}^n X_i\log p + \sum_{i=1}^n (1 - X_i)\log(1-p). $$ Taking the derivative with respect to $p$ and setting it to zero gives $\hat{p} = \dfrac{\sum_{i=1}^n X_i}{n}$.
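As a sanity check on my algebra, here is a small simulation (not part of the assignment, just something I tried): it draws a Bernoulli sample, computes the sample mean, and separately maximizes the log-likelihood above by a crude grid search. The true $p = 0.3$ and the grid resolution are arbitrary choices for illustration.

```python
import random
import math

random.seed(0)

# Simulated Bernoulli(p = 0.3) sample; 0.3 is an arbitrary illustrative value.
n = 10_000
xs = [1 if random.random() < 0.3 else 0 for _ in range(n)]

# Method-of-moments estimate: the sample mean.
p_mom = sum(xs) / n

# Log-likelihood l(p; x) = sum(x_i) log p + (n - sum(x_i)) log(1 - p).
def loglik(p, xs):
    s = sum(xs)
    return s * math.log(p) + (len(xs) - s) * math.log(1 - p)

# MLE via grid search over (0, 1); the log-likelihood is concave,
# so the best grid point sits next to the true maximizer.
grid = [i / 1000 for i in range(1, 1000)]
p_mle = max(grid, key=lambda p: loglik(p, xs))

print(p_mom, p_mle)
```

Both numbers agree up to the grid resolution, which matches my conclusion that the two estimators are the same expression.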
It seems like the two methods give the same expression for the estimator; however, a hint for this problem says that $\tilde{p}$ and $\hat{p}$ are not the same. So I am confused about what I did wrong.
Any help would be greatly appreciated!