I have a function $$l(\beta_0, \beta_1, \sigma^2) = -\frac{n \log(2\pi)}{2} - n \log \sigma - \frac{1}{2 \sigma^2} \sum_{i=1}^{n} (y_i-\beta_0 - \beta_1 x_i)^2$$ which is the log-likelihood function used for MLE in SLR (Simple Linear Regression), for the regression model $y = \beta_0 + \beta_1 x$.
Taking the partial derivative with respect to $\beta_1$ we have: $$\frac{\partial l}{\partial \beta_1} = \frac{1}{\sigma^2} \sum_{i=1}^{n} x_i(y_i-\beta_0-\beta_1x_i)$$
So to find the maximum of the log-likelihood we set this derivative to 0, which I got to the point of:
$$\sum_{i=1}^{n} x_i(y_i-\beta_0-\beta_1x_i) = 0$$
I am at a loss on how to proceed from here when trying to solve this equation; it doesn't seem to factor nicely. I'm wondering if anyone can give me a hint on how to continue.
You can try expanding it out:
$$\displaystyle \sum_{i=1}^n x_iy_i - \beta_0\sum_{i=1}^nx_i-\beta_1\sum_{i=1}^nx_i^2=0$$
And then solve for $\beta_1$ to get:
$$\displaystyle \beta_1=\frac{\sum_{i=1}^nx_iy_i-\beta_0\sum_{i=1}^nx_i}{\sum_{i=1}^n x_i^2}\space\space\space\space---(A)$$
But you still have a $\beta_0$ in the expression, so you need to take the partial derivative with respect to $\beta_0$ as well, set it to 0, and solve that system. Then plug the MLE you find for $\beta_0$ into what you found in (A).
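If you want to sanity-check the algebra numerically, the two score equations together are just a $2\times 2$ linear system (the normal equations) in $\beta_0$ and $\beta_1$. Here is a minimal sketch using NumPy; the data vectors `x` and `y` are made-up illustrative values, not from the question:

```python
# Sanity check: solve the two score equations
#   d l / d beta_0 = 0  and  d l / d beta_1 = 0
# simultaneously as a 2x2 linear system (the normal equations).
# The data below is an arbitrary illustrative example.
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])
n = len(x)

# Normal equations in matrix form:
#   [ n        sum(x)   ] [beta_0]   [ sum(y)   ]
#   [ sum(x)   sum(x^2) ] [beta_1] = [ sum(x*y) ]
A = np.array([[n, x.sum()],
              [x.sum(), (x ** 2).sum()]])
b = np.array([y.sum(), (x * y).sum()])
beta_0, beta_1 = np.linalg.solve(A, b)

# The solution agrees with the familiar closed forms:
#   beta_1 = sum((x - xbar)(y - ybar)) / sum((x - xbar)^2)
#   beta_0 = ybar - beta_1 * xbar
b1_closed = ((x - x.mean()) * (y - y.mean())).sum() / ((x - x.mean()) ** 2).sum()
b0_closed = y.mean() - b1_closed * x.mean()
print(beta_0, beta_1)
```

Since the MLE of $(\beta_0, \beta_1)$ under Gaussian errors coincides with ordinary least squares, the system's solution matches the usual OLS formulas, which is a good way to verify your hand derivation.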