A random variable $X$ follows the distribution given below: $$\Pr\left(X=x\mid\gamma\right)= \frac{{2 \choose x}\gamma^x\left(1-\gamma\right)^{2-x}}{1-\left(1-\gamma\right)^2}, \quad x \in \{1,2\}$$ A sample of $n$ i.i.d. random variables is observed. Estimate the parameter $\gamma$.
The log-likelihood function is $$ \begin{align*} \ln L &= \sum_{i=1}^n\ln{2 \choose x_i}+\sum_{i=1}^n x_i\ln\gamma+\sum_{i=1}^n(2-x_i)\ln\left(1-\gamma\right) - n\ln\left(1-\left(1-\gamma\right)^2\right) \end{align*} $$ The expression is rather ugly. How do I differentiate it with respect to $\gamma$? And does the restriction $x \in \{1,2\}$ have some significance?
$$\Pr[X = x \mid \gamma] = \begin{cases} 1 - \frac{\gamma}{2-\gamma}, & x = 1 \\ \frac{\gamma}{2-\gamma}, & x = 2. \end{cases} \tag{1}$$
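A quick numerical sanity check of the simplification in $(1)$ (this sketch is mine, not part of the original answer; the function name `pmf` is arbitrary): the truncated-binomial pmf from the question should reduce exactly to $1-\gamma/(2-\gamma)$ at $x=1$ and $\gamma/(2-\gamma)$ at $x=2$.

```python
from math import comb

def pmf(x, g):
    # Original truncated binomial: C(2,x) g^x (1-g)^(2-x) / (1 - (1-g)^2)
    return comb(2, x) * g**x * (1 - g)**(2 - x) / (1 - (1 - g)**2)

for g in (0.1, 0.5, 0.9):
    p = g / (2 - g)                       # claimed Pr[X = 2] from (1)
    assert abs(pmf(2, g) - p) < 1e-12     # matches gamma/(2-gamma)
    assert abs(pmf(1, g) - (1 - p)) < 1e-12
    assert abs(pmf(1, g) + pmf(2, g) - 1.0) < 1e-12  # pmf sums to 1
```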
So the original comment that $X$ is essentially Bernoulli distributed (specifically, $X - 1$ is Bernoulli) is correct, and the likelihood of $\gamma$ given a sample of size $n$ is straightforward to compute:
$$\mathcal L(\gamma \mid \boldsymbol x) = \left(1 - \frac{\gamma}{2-\gamma}\right)^{\sum \mathbb 1(x_i = 1)} \left(\frac{\gamma}{2-\gamma}\right)^{\sum \mathbb 1(x_i = 2)}. \tag{2}$$
For convenience, let us define the sufficient statistic $m = \sum \mathbb 1(x_i = 2)$, which simply counts the number of observations in our sample that have value $2$; then $n-m$ is the number of observations that have value $1$. Also, let $p = \gamma/(2-\gamma)$, so the likelihood is expressible in terms of $m$: $$\mathcal L(\gamma \mid m) = p^m (1-p)^{n-m}. \tag{3}$$ Now this should look much more familiar. The critical point that maximizes $\mathcal L$ corresponds to the choice $\hat p = m/n$, and by invariance, it follows that $$\hat \gamma = \frac{2m}{m+n} \tag{4}$$ where again, $m$ is defined as above. Alternatively we may express $(4)$ in terms of the sample total $s = \sum x_i$, since $s = m+n$: $$\hat \gamma = 2 - \frac{2n}{s} = 2 - \frac{2n}{\sum x_i}. \tag{5}$$
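As a sanity check of $(5)$ (a simulation sketch of my own, assuming the `random.random` threshold trick for Bernoulli draws; `gamma_true`, `gamma_hat` are hypothetical names): draw a large sample with a known $\gamma$, then verify the closed-form estimator approximately recovers it.

```python
import random

random.seed(0)
gamma_true = 0.6
p = gamma_true / (2 - gamma_true)   # Pr[X = 2] from (1)
n = 100_000

# X - 1 is Bernoulli(p), so draw 2 with probability p, else 1.
xs = [2 if random.random() < p else 1 for _ in range(n)]

s = sum(xs)                         # sample total, s = m + n
gamma_hat = 2 - 2 * n / s           # estimator (5)
print(round(gamma_hat, 3))          # close to gamma_true
```

With $n = 10^5$ the estimate lands within a few thousandths of $0.6$, consistent with the invariance argument above.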