Context
Given there are 2 groups that can be modelled as a geometric distribution as follows:
\begin{align*} f(x_i;p_1) &= p_1(1-p_1)^{x_i - 1}, \quad x_i = 1,2,\ldots, \quad 0<p_1 <1 \\ f(y_i;p_2) &= p_2(1-p_2)^{y_i - 1}, \quad y_i = 1,2,\ldots, \quad 0<p_2 <1 \end{align*}
Suppose that $p_1 = p_2 = p$. Plot the likelihood function for $p$ given that, for group 1, $n_1 = 60$ and $\sum x_i = 205$, and for group 2, $n_2 = 40$ and $\sum y_i = 215$.
My attempt
From what I understand, given that the random variables $X,Y \overset{iid}{\sim} \mathrm{Geometric}(p)$ and that $p_1= p_2$, we can find the joint probability function as follows.
\begin{align} f_{X,Y}(x_i, y_i) &= f_{X}(x_i)f_{Y}(y_i) \\ &= p(1-p)^{x_i-1}p(1-p)^{y_i-1} \\ &= p^2(1-p)^{x_i+y_i-2} \end{align}
From that, we can find the likelihood function for the joint distribution.
\begin{align} L(x_i,y_i \mid p) &= \prod_{i = 1}^{n}p^2(1-p)^{x_i+y_i-2} \\ &= (p^2)^{n}(1-p)^{\sum_{i = 1}^{n} (x_i + y_i - 2n)} \\ &= p^{2n}(1-p)^{\sum_{i = 1}^{n} (x_i + y_i - 2n)} \end{align}
The log-likelihood function would be as follows.
\begin{align} \ell(x_i, y_i | p) &= \ln{(p^{2n}(1-p)^{\sum_{i = 1}^{n} (x_i + y_i - 2n)})} \\ &= 2n\ln{(p)} + \left(\sum_{i = 1}^{n} (x_i + y_i - 2n)\right)\ln{(1-p)} \end{align}
My question
I am unsure how to plot the function. Since there are two groups with different sample sizes, I suspect the likelihood function I have calculated may be wrong. In particular, I am unsure whether the resulting likelihood and log-likelihood functions should involve the parameters $n_1$ and $n_2$. Any help would be appreciated in seeing whether I am missing something or misunderstanding the question.
When you combined the joint likelihood, you assumed that the $x_i$ and $y_i$ could be paired up, when in fact they cannot be in general, because the sample sizes $n_1$ and $n_2$ may be unequal. This is the source of your confusion.
Rather, it is more sensible to initially think of the joint likelihoods of the samples $(x_1, \ldots, x_{n_1})$ and $(y_1, \ldots, y_{n_2})$ as separate; e.g.
$$\mathcal L(p_1 \mid x_1, \ldots, x_{n_1}) \propto \prod_{i=1}^{n_1} p_1 (1 - p_1)^{x_i - 1} = \left(\frac{p_1}{1-p_1}\right)^{n_1} (1 - p_1)^{\sum x_i}$$ and similarly $$\mathcal L(p_2 \mid y_1, \ldots, y_{n_2}) \propto \left(\frac{p_2}{1-p_2}\right)^{n_2} (1 - p_2)^{\sum y_i}.$$ Note that when expressed in this manner, the likelihoods are functions of the sufficient statistics $\sum x_i$ and $\sum y_i$, the sample totals of each group, so instead of $x_1, \ldots, x_{n_1}$ and $y_1, \ldots, y_{n_2}$, we can just use $\sum x_i$ and $\sum y_i$.
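As a quick numerical sanity check of the sufficiency claim, here is a minimal Python sketch (the sample values below are made up for illustration): two samples with the same size and the same total give identical geometric log-likelihoods at any $p$.

```python
import numpy as np

# Two hypothetical samples with the same size (4) and total (10),
# but different individual values:
x_a = np.array([1, 2, 3, 4])
x_b = np.array([2, 2, 3, 3])

def geom_loglik(p, data):
    """Exact geometric log-likelihood: sum_i log( p (1-p)^(x_i - 1) )."""
    return np.sum(np.log(p) + (data - 1) * np.log(1 - p))

p = 0.4
# Both reduce to n log p + (sum - n) log(1-p) = 4 log(0.4) + 6 log(0.6)
assert np.isclose(geom_loglik(p, x_a), geom_loglik(p, x_b))
```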
Now the joint likelihood of both samples when $p_1 = p_2 = p$ is simply $$\begin{align} \mathcal L(p \mid \textstyle\sum x_i, \textstyle\sum y_i) &\propto \mathcal L(p \mid x_1, \ldots, x_{n_1}) \mathcal L (p \mid y_1, \ldots, y_{n_2}) \\ &= \left(\frac{p}{1-p}\right)^{n_1 + n_2} (1-p)^{\sum x_i + \sum y_i}. \end{align}$$ Then your log-likelihood is $$\ell(p \mid \textstyle\sum x_i, \textstyle\sum y_i) = (n_1 + n_2) \log p + (\textstyle \sum x_i + \textstyle \sum y_i - n_1 - n_2)\log (1-p).$$ I leave it as an exercise to show that this implies $$\hat p = \frac{n_1 + n_2}{\sum x_i + \sum y_i},$$ the reciprocal of the grand sample mean, which is of course what your intuition would expect the maximum likelihood estimate to be.
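To actually plot this with the given numbers ($n_1 = 60$, $n_2 = 40$, $\sum x_i = 205$, $\sum y_i = 215$), here is one way with NumPy/matplotlib (a sketch; the output file name `loglik.png` is an arbitrary choice):

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # headless backend; remove this line for interactive use
import matplotlib.pyplot as plt

# Summary statistics from the problem
n1, n2 = 60, 40
sum_x, sum_y = 205, 215
n, total = n1 + n2, sum_x + sum_y

def log_lik(p):
    """l(p) = (n1 + n2) log p + (sum x + sum y - n1 - n2) log(1 - p)."""
    return n * np.log(p) + (total - n) * np.log(1 - p)

p_grid = np.linspace(0.001, 0.999, 999)
ll = log_lik(p_grid)

p_hat = n / total                   # closed-form MLE: 100/420, about 0.238
p_grid_max = p_grid[np.argmax(ll)]  # grid maximiser should agree closely

plt.plot(p_grid, ll)
plt.axvline(p_hat, linestyle="--")
plt.xlabel("p")
plt.ylabel("log-likelihood")
plt.savefig("loglik.png")
```

If you want the likelihood itself rather than its logarithm, plotting `np.exp(ll - ll.max())` gives the likelihood rescaled to a maximum of 1, which avoids numerical underflow.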
The other error you made is writing $$\sum_{i=1}^n (x_i + y_i - 2n)$$ when in fact the $-1$ terms have already been summed, so it should have been $$\sum_{i=1}^n (x_i + y_i) - 2n.$$ With that correction, you can see that your likelihood is recovered as the special case $n_1 = n_2 = n$.
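To see the special case numerically, a small simulation (a sketch; `p_true = 0.3` and `n = 50` are arbitrary choices) checks that with $n_1 = n_2 = n$ the corrected exponent $\sum_i (x_i + y_i) - 2n$ reproduces the general log-likelihood:

```python
import numpy as np

rng = np.random.default_rng(0)
p_true = 0.3
n = 50  # equal group sizes: the special case n1 = n2 = n
x = rng.geometric(p_true, size=n)
y = rng.geometric(p_true, size=n)

def loglik_general(p, n1, n2, sx, sy):
    """General form: (n1 + n2) log p + (sx + sy - n1 - n2) log(1 - p)."""
    return (n1 + n2) * np.log(p) + (sx + sy - n1 - n2) * np.log(1 - p)

def loglik_special(p):
    """Special case, with the exponent written correctly as
    sum_i (x_i + y_i) - 2n, not sum_i (x_i + y_i - 2n)."""
    return 2 * n * np.log(p) + (np.sum(x + y) - 2 * n) * np.log(1 - p)

# The two agree at any p in (0, 1):
assert np.isclose(loglik_general(0.25, n, n, x.sum(), y.sum()),
                  loglik_special(0.25))
```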