Let $X_1,\ldots,X_n$ be an i.i.d. sample of geometric$(p)$ random variables with unknown parameter $0<p<1$. I would like to find the maximum likelihood estimate of $p$.
With the pmf $P(X=k)=p(1-p)^k$ for $k \in \{1,2,3,\ldots\}$ and $0<p<1$, my result is the following: $\hat{p}=\left( \frac{1}{1+\frac{1}{n}\sum_{j=1}^n x_j} \right)$ if I assume that $\sum_{j=1}^n x_j\neq0$. Now, my question is: what happens if $\sum_{j=1}^n x_j=0$, and what is $\hat{p}$ in this case?
You have to choose between the two formulations of the geometric distribution:

1. $0$ is a possible outcome, with probability $p$. If so, then $k \in \{0,1,2,\ldots\}$ with $\mathbb P(X=k) = p(1-p)^k$. The sum of $n$ i.i.d. geometric random variables can be $0$, and the maximum likelihood estimator is $\hat{p}= \dfrac{1}{1+\frac{1}{n}\sum_i x_i}$.

2. $0$ is not a possible outcome. If so, then $k \in \{1,2,3,\ldots\}$ with $\mathbb P(X=k) = p(1-p)^{k-1}$. The sum of $n$ i.i.d. geometric random variables must be at least $n$, and the maximum likelihood estimator is $\hat{p}= \dfrac{1}{\frac{1}{n}\sum_i x_i}$.
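To see where these estimators come from, one can maximize the log-likelihood directly; for the first formulation:

```latex
% Log-likelihood for the support-$\{0,1,2,\ldots\}$ parameterization:
\ell(p) = n\log p + \Big(\textstyle\sum_{i=1}^n x_i\Big)\log(1-p),
\qquad
\ell'(p) = \frac{n}{p} - \frac{\sum_i x_i}{1-p} = 0
\;\Longrightarrow\;
\hat{p} = \frac{n}{n + \sum_i x_i} = \frac{1}{1+\frac{1}{n}\sum_i x_i}.
```

The second formulation is the same computation with $\sum_i x_i$ replaced by $\sum_i (x_i - 1)$, which yields $\hat{p} = \dfrac{1}{\frac{1}{n}\sum_i x_i}$.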
In either case, if the sum of the observations is the lowest possible ($0$ or $n$, respectively), then the maximum likelihood estimate is $\hat{p}=1$. I would say this even if you start with the open interval $(0,1)$; the alternative would be to say that the likelihood has no maximizer on $(0,1)$ and that $\hat{p}$ can only be taken arbitrarily close to $1$.
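As a quick numerical sanity check (a sketch, assuming NumPy and a made-up sample), one can compare the closed-form estimator for the support-$\{0,1,2,\ldots\}$ case against a grid search over the log-likelihood:

```python
import numpy as np

# Hypothetical sample of counts for the k in {0,1,2,...} parameterization.
x = np.array([0, 2, 1, 4, 0, 3])
n = len(x)

# Closed-form MLE: 1 / (1 + sample mean).
p_hat = 1.0 / (1.0 + x.mean())

# Grid search over the log-likelihood  n*log(p) + (sum x_i)*log(1-p).
grid = np.linspace(1e-6, 1 - 1e-6, 100_000)
loglik = n * np.log(grid) + x.sum() * np.log(1 - grid)
p_grid = grid[np.argmax(loglik)]

print(p_hat, p_grid)  # the two should agree up to the grid resolution
```

With all observations equal to $0$, `x.mean()` is $0$ and the same formula returns `p_hat = 1.0`, matching the boundary discussion above.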