I have a set of data
$$X \sim N_2(0,\Sigma) \quad \text{with} \quad \Sigma = \begin{pmatrix} \sigma^2 & r\sigma^2 \\ r\sigma^2 & \sigma^2 \end{pmatrix}$$
and I am asked to find the maximum likelihood estimate of $r$.
First I find the log-likelihood of my data, which is
$$-n\log(2\pi) - 2n\log(\sigma) - \frac{n}{2} \log(1-r^2) - \frac{1}{2\sigma^2(1-r^2)}\sum_i (x_{i1}^2 - 2rx_{i1}x_{i2} + x_{i2}^2)$$
Then I differentiate this w.r.t. $r$, which gives
$$-\frac{n}{2}\bigg(\frac{-2r}{1-r^2}\bigg) - \frac{1}{2\sigma^2}\bigg(\frac{2r}{(1-r^2)^2}\bigg)\sum_i (x_{i1}^2 - 2rx_{i1}x_{i2} + x_{i2}^2) - \frac{1}{2\sigma^2(1-r^2)}\sum_i -2x_{i1}x_{i2}$$
Now the question says show that the MLE for $r$ is given as
$$\hat r = 2\frac{\sum_i x_{i1}x_{i2}}{\sum_i (x_{i1}^2 + x_{i2}^2)}$$
but I can't seem to get there. Can someone please put me out of my misery? I've spent far too long on this.
Edit: I had some mistypes. It's $x_{i2}^2$ not $2x_{i2}^2$
Write the log-likelihood (dropping the constant $-n\log(2\pi)$, and writing $x_i, y_i$ for $x_{i1}, x_{i2}$) as $$ \ell(r,\sigma^2)=-\frac{n}{2}\log(1-r^2)-\frac{1}{2\sigma^2}\frac{1}{1-r^2}\left(A-2rB\right) - n\log\sigma^2, $$ where $$ A = \sum_{i}x_i^2+y_i^2, \qquad B = \sum_{i}x_iy_i. $$ Then $\frac{\partial \ell}{\partial r}=0$ implies that $$ nr\sigma^2+\frac{1}{1-r^2}\left((r^2+1)B - Ar \right) =0 \tag{1}. $$ Also $\frac{\partial \ell}{\partial \sigma^2} = 0$ further implies $$ n\sigma^2 = \frac{1}{2(1-r^2)}\left(A - 2rB\right) \tag{2}. $$ Put $(2)$ into $(1)$ to get \begin{align} \frac{r}{2(1-r^2)}\left(A-2rB \right) + \frac{1}{1-r^2}\left((r^2+1)B - Ar \right) = 0, \end{align} or, multiplying through by $2(1-r^2)$, $$ rA - 2r^2B + 2r^2B + 2B - 2Ar = 0 \iff r=\frac{2B}{A}. $$
A couple more details on the derivation of $(1)$: \begin{align} \frac{\partial \ell}{\partial r}&=n\frac{r}{1-r^2} - \frac{1}{\sigma^2}\frac{r}{(1-r^2)^2}\left[ A-2rB \right] + \frac{1}{\sigma^2(1-r^2)}B \\ &=\frac{1}{\sigma^2(1-r^2)}\left(nr\sigma^2 +\frac{1}{1-r^2}\left[2r^2B - rA\right] + \frac{1-r^2}{1-r^2}B\right) \\ &=\frac{1}{\sigma^2(1-r^2)}\left(nr\sigma^2 + \frac{1}{1-r^2}\left[ (r^2 + 1)B - rA\right]\right). \end{align}
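Not part of the proof, but here is a quick numerical sanity check of the closed form $\hat r = 2B/A$ (a sketch using NumPy; the sample size, seed, and parameter values are arbitrary choices): simulate from $N_2(0,\Sigma)$ with a known $r$, compute $\hat r$, and confirm that the profile log-likelihood in $r$ (with $\sigma^2$ profiled out via $(2)$) peaks at the same value.

```python
import numpy as np

rng = np.random.default_rng(0)

# True parameters (arbitrary choices for the simulation).
n, sigma, r = 100_000, 2.0, 0.6
Sigma = sigma**2 * np.array([[1.0, r], [r, 1.0]])

# Draw n samples from N_2(0, Sigma).
X = rng.multivariate_normal(mean=[0.0, 0.0], cov=Sigma, size=n)
x, y = X[:, 0], X[:, 1]

# Closed-form MLE from the derivation: r_hat = 2B / A,
# with A = sum(x_i^2 + y_i^2) and B = sum(x_i * y_i).
A = np.sum(x**2 + y**2)
B = np.sum(x * y)
r_hat = 2 * B / A
print(f"true r = {r}, MLE r_hat = {r_hat:.4f}")

# Cross-check: the profile log-likelihood over a grid of r values
# (sigma^2 eliminated using equation (2)) should peak at r_hat.
grid = np.linspace(-0.99, 0.99, 1999)
sig2 = (A - 2 * grid * B) / (2 * n * (1 - grid**2))
ll = (-0.5 * n * np.log(1 - grid**2) - n * np.log(sig2)
      - (A - 2 * grid * B) / (2 * sig2 * (1 - grid**2)))
print("profile argmax:", grid[np.argmax(ll)])
```

With $n$ this large, $\hat r$ lands within a couple of standard errors of the true $r$, and the grid argmax agrees with $\hat r$ up to the grid spacing.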