Here's the Pareto distribution:
$$F(x; \theta_1, \theta_2) = 1 - \Big(\frac{\theta_1 }{x}\Big)^{\theta_2}, \qquad \theta_1 \le x, \qquad \theta_1, \theta_2 > 0$$
I have been trying to find the maximum likelihood estimators for this, but the likelihood is complicated and so is its logarithm. I have tried many things, but in vain. How does one derive them?
Note that the definition of the Pareto distribution you are using is the cumulative distribution function, whereas in maximum likelihood estimation you need to use the probability density function, which for the Pareto distribution is given by $$f(x;\alpha,\beta) = \alpha {\beta^{\alpha}\over x^{\alpha+1}}, \qquad x \geq \beta, \quad \alpha,\beta > 0. $$ Compared to your equation, $\theta_1 = \beta$ and $\theta_2 = \alpha$.
The likelihood function of the Pareto distribution given a sample $x=(x_1,\dotsc, x_n)$ is given by $$\mathcal{L}(\alpha,\beta;x)=\prod\limits^n_{i=1}\alpha{\beta^\alpha\over x_i^{\alpha+1}}=\alpha^n \beta^{n\alpha}\prod\limits^n_{i=1}{1\over x_i^{\alpha+1}}. $$ Taking the log of the likelihood, we obtain $$\ln[\mathcal{L}(\alpha,\beta;x)] \equiv \ell(\alpha,\beta;x)=n \ln (\alpha) + n\alpha \ln(\beta) - (\alpha+1)\sum\limits^n_{i=1}\ln(x_i). $$ Since $\ln(\beta)$ is monotonically increasing, a larger $\beta$ always yields a higher likelihood, so we maximize the likelihood by taking $\hat{\beta}$ as large as possible. The constraint $\beta \leq x_i$ for all $i$ then forces $\hat{\beta} = \min\limits_i x_i$, the smallest observation in the sample.
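You can see the monotonicity in $\beta$ numerically. Here is a quick sketch (with a made-up sample and a fixed $\alpha$, both chosen just for illustration) that evaluates $\ell$ at increasing values of $\beta$ up to the sample minimum:

```python
import math

def loglik(alpha, beta, xs):
    """Log-likelihood of a Pareto(alpha, beta) sample xs."""
    n = len(xs)
    return (n * math.log(alpha)
            + n * alpha * math.log(beta)
            - (alpha + 1) * sum(math.log(x) for x in xs))

xs = [2.5, 3.0, 4.2, 6.1]   # hypothetical sample; min(xs) = 2.5
alpha = 2.0                  # arbitrary fixed alpha for the illustration

# ell increases as beta grows toward min(xs), so the MLE sits at the boundary
betas = [1.0, 1.5, 2.0, 2.49]
vals = [loglik(alpha, b, xs) for b in betas]
print(vals)  # strictly increasing sequence
```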
For $\alpha$, we set the partial derivative of $\ell$ with respect to $\alpha$ equal to $0$: \begin{align}{\partial \ell(\alpha,\beta) \over \partial \alpha} &= {n\over \alpha}+n \ln(\beta)- \sum\limits^n_{i=1}\ln(x_i)=0.\end{align} Therefore, $$\hat{\alpha}= {n \over \sum\limits^n_{i=1}\ln(x_i)-n\ln(\hat{\beta}) }. $$ I hope this helps.
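As a sanity check, here is a short sketch (parameter values and sample size are arbitrary choices) that simulates Pareto data by inverse-transform sampling, $X = \beta / U^{1/\alpha}$ with $U \sim \mathrm{Uniform}(0,1)$, and computes both estimators:

```python
import math
import random

random.seed(0)
alpha_true, beta_true, n = 3.0, 2.0, 100_000

# Inverse-CDF sampling: solving F(x) = u for x gives x = beta / (1-u)^(1/alpha);
# since 1-U is also Uniform(0,1), we can use u directly.
xs = [beta_true / random.random() ** (1.0 / alpha_true) for _ in range(n)]

# MLEs from the derivation above
beta_hat = min(xs)
alpha_hat = n / (sum(math.log(x) for x in xs) - n * math.log(beta_hat))

print(beta_hat, alpha_hat)  # both should be close to 2.0 and 3.0
```

For large $n$ both estimates land close to the true parameters; note that $\hat{\beta} \geq \beta$ always, since the sample minimum cannot fall below the support's lower bound.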