$Y_1, Y_2, \ldots, Y_n$ are independent with $Y_i \sim \mathcal{N}(\mu_i, \sigma^2)$ (they cannot be identically distributed, since the means differ).
Through the log-likelihood calculation, taking the partial derivative with respect to a common $\mu$, I have gotten down to $\sum y_i = \sum \mu_i$, but I am not sure what I have found, or whether I have found anything at all. I have calculated the maximum likelihood estimators for normally distributed random variables with a common mean and variance, but not for a problem like this one where the means are not necessarily common.
I am not very familiar with questions that use the notation $\mu_i$ instead of just a single $\mu$, so I may be overcomplicating things. If that is the case, I sincerely apologize.
Any help is much appreciated.
Here you have $n+1$ parameters to estimate, $(\mu_1, \ldots, \mu_n, \sigma^2)$, so you take partial derivatives with respect to each of them. The likelihood and log-likelihood are
$$ \mathcal{L}(\vec{\mu}, \sigma^2) = \frac{1}{(2\pi \sigma^2)^{n/2}}\exp\left\{-\frac{\sum_{i=1}^n(y_i - \mu_i)^2}{2\sigma^2}\right\} $$
$$ \ell(\vec{\mu}, \sigma^2) = -\frac{n}{2}\ln(2\pi\sigma^2) - \frac{\sum_{i=1}^n(y_i - \mu_i)^2}{2\sigma^2}. $$
Setting each partial derivative to zero:
$$ \frac{\partial \ell}{\partial \mu_i} = \frac{y_i - \mu_i}{\sigma^2} = 0 \quad\Rightarrow\quad \hat{\mu}_i = y_i $$
$$ \frac{\partial \ell}{\partial \sigma^2} = -\frac{n}{2\sigma^2} + \frac{\sum_{i=1}^n(y_i - \mu_i)^2}{2\sigma^4} = 0 \quad\Rightarrow\quad \hat{\sigma}^2 = \frac{\sum_{i=1}^n(y_i - \hat{\mu}_i)^2}{n}. $$
Note that substituting $\hat{\mu}_i = y_i$ gives $\hat{\sigma}^2 = 0$: with one mean parameter per observation, the fitted model passes through every data point exactly, so the MLE of $\sigma^2$ is degenerate here. To confirm that the critical point is indeed a maximum (rather than a saddle point), check that the Hessian matrix of the log-likelihood is negative definite there.
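If it helps to see the derivation numerically, here is a minimal sketch (my own illustration, not part of the original derivation): it evaluates the log-likelihood above on simulated data and checks that perturbing any $\mu_i$ away from $y_i$ can only decrease it, and that the residual sum of squares at $\hat{\mu}_i = y_i$ is zero.

```python
import numpy as np

rng = np.random.default_rng(0)
# Simulated sample: each observation has its own mean, common sigma.
true_means = np.array([1.0, 2.0, 3.0, 4.0])
sigma = 0.5
y = rng.normal(loc=true_means, scale=sigma)
n = y.size

def loglik(mu, s2):
    """Log-likelihood l(mu, sigma^2) for independent N(mu_i, sigma^2) data."""
    return -0.5 * n * np.log(2 * np.pi * s2) - np.sum((y - mu) ** 2) / (2 * s2)

s2 = sigma ** 2
best = loglik(y, s2)  # log-likelihood at mu_hat_i = y_i

# Moving any single mu_i away from y_i strictly lowers the log-likelihood.
for eps in (0.1, -0.1, 1.0):
    perturbed = y.copy()
    perturbed[0] += eps
    assert loglik(perturbed, s2) < best

# Residual sum of squares at the MLE is zero, so sigma^2-hat degenerates to 0.
rss_at_mle = np.sum((y - y) ** 2) / n
print(rss_at_mle)  # prints 0.0
```

This makes the degenerate-variance point concrete: because each $y_i$ estimates its own mean, the residuals vanish identically, which is why a common-variance estimate needs more than one observation per mean in practice.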