The question is simple. We have two independent Poisson random variables with parameters $\theta_1$ and $\theta_2$, and a single observation of each. Without any restriction, the likelihood is $\frac{\theta_1^{x_1}}{x_1!} e^{-\theta_1} \cdot \frac{\theta_2^{x_2}}{x_2!} e^{-\theta_2}$, and I can find that the MLEs are $\hat{\theta}_1 = x_1, \hat{\theta}_2 = x_2$.
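To convince myself, here is a quick numeric check of the unconstrained MLEs (a sketch with assumed example observations $x_1 = 3$, $x_2 = 5$; since the likelihood factorizes, each parameter can be maximized separately over a grid):

```python
import math

# Assumed example observations (not from the problem statement).
x1, x2 = 3, 5

# Per-parameter log-likelihood, dropping the constant -log(x!) term.
def log_lik(theta, x):
    return x * math.log(theta) - theta

grid = [k / 100 for k in range(1, 1001)]  # candidate values 0.01 .. 10.00
theta1_hat = max(grid, key=lambda t: log_lik(t, x1))
theta2_hat = max(grid, key=lambda t: log_lik(t, x2))
print(theta1_hat, theta2_hat)  # 3.0 5.0
```

The grid maximizer lands exactly on $x_1$ and $x_2$, matching the analytic result.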
But now the restriction is $\theta_1 \leq \theta_2$. I don't know how to get MLE in this case.
This is a necessary step before I construct the likelihood ratio test, so I need the MLE under the restriction $\theta_1 \leq \theta_2$.
If $x_1 < x_2$, then the unconstrained maximum likelihood estimators already satisfy the restriction, so you can just use them.
If $x_1 \geq x_2$, this won't work. In this case we want the constrained maximum likelihood estimators to be equal to each other. This way $x_2 \leq \hat{\theta}_1 = \hat{\theta}_2 \leq x_1$, with each estimator as close as possible to its unconstrained maximum likelihood estimate.
If $x_1 \geq x_2$, we want $\hat{\theta}_1$ and $\hat{\theta}_2$ to be as close as possible to their unconstrained maximum likelihood estimates, because the closer they are, the higher the likelihood. The plot below shows the likelihood $\theta_1^{x_1} e^{-\theta_1} / x_1!$ for $x_1 = 3$: as $\theta_1$ gets closer to $3$, the likelihood increases. If we left a gap between $\hat{\theta}_1$ and $\hat{\theta}_2$, we would not be maximizing the likelihood, since we could increase it by moving $\hat{\theta}_1$ closer to $x_1$ or moving $\hat{\theta}_2$ closer to $x_2$.
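As a quick numeric illustration of this shape (a sketch using the same $x_1 = 3$), the likelihood can be evaluated at a few values of $\theta_1$:

```python
import math

def poisson_lik(theta, x=3):
    # Poisson likelihood theta^x * exp(-theta) / x!
    return theta ** x * math.exp(-theta) / math.factorial(x)

# The likelihood rises as theta approaches x = 3, then falls again.
for theta in (1.0, 2.0, 2.5, 3.0, 3.5, 4.0, 5.0):
    print(f"theta = {theta:.1f}: likelihood = {poisson_lik(theta):.4f}")
```

The printed values increase up to $\theta_1 = 3$ and decrease afterwards, which is exactly why shrinking the gap toward the unconstrained estimates raises the likelihood.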
Setting $\theta_1 = \theta_2$ we have likelihood $$\frac{\theta_1^{x_1+x_2}}{x_1!x_2!} e^{-2\theta_1},$$ which is maximized when $$\theta_1 = \frac{x_1+x_2}{2}.$$
So in the case $x_1 \geq x_2$, use $$\hat{\theta}_1 = \hat{\theta}_2 = \frac{x_1+x_2}{2}.$$
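Putting both cases together, a minimal sketch (the function name `constrained_mle` and the example observations $x_1 = 5$, $x_2 = 2$ are my own; the grid search is just a numeric cross-check, not part of the estimator):

```python
import math

def constrained_mle(x1, x2):
    """Hypothetical helper: MLE of (theta1, theta2) under theta1 <= theta2."""
    if x1 < x2:
        return float(x1), float(x2)  # unconstrained MLEs are already feasible
    m = (x1 + x2) / 2                # boundary solution theta1 = theta2
    return m, m

# Log-likelihood up to the constant -log(x1!) - log(x2!) terms.
def log_lik(t1, t2, x1, x2):
    return x1 * math.log(t1) - t1 + x2 * math.log(t2) - t2

# Cross-check against a grid search over the feasible region t1 <= t2.
x1, x2 = 5, 2
grid = [k / 20 for k in range(1, 201)]  # candidate values 0.05 .. 10.00
best = max(((a, b) for a in grid for b in grid if a <= b),
           key=lambda p: log_lik(p[0], p[1], x1, x2))
print(constrained_mle(x1, x2), best)  # (3.5, 3.5) (3.5, 3.5)
```

The grid maximizer over the feasible region agrees with the closed form $(x_1 + x_2)/2$ on the boundary $\theta_1 = \theta_2$.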