I need to show that $\underline{\hat{\mu}}=\underline{\bar{y}}$:
$$L(y_1, \dots, y_n) = \prod _{i=1}^n f(y_i;\underline{\mu},\underline{\Sigma})$$
$$ = \prod _{i=1}^n \frac{1}{\bigl(\sqrt{2 \pi}\bigr)^p |\Sigma|^{1/2}}e^{-\frac{1}{2}(y_i -\mu)'\Sigma ^{-1}(y_i -\mu)} $$
Taking the log:
$$\ell(\mu,\Sigma) = -np \log{\bigl(\sqrt{2\pi}\bigr)}-\frac{n}{2}\log{|\Sigma|}-\frac{1}{2}\sum_{i=1}^n(y_i -\mu)'\Sigma ^{-1}(y_i -\mu)$$
Maximising: differentiating with respect to $\mu$ and setting the gradient to zero (the gradient is $\Sigma^{-1}\sum_{i=1}^n(y_i-\mu)$, and the invertible factor $\Sigma^{-1}$ cancels):
$$ 0 = \frac{-1}{2} \times 2 \sum^{n}_{i=1} (y_i -\mu)$$
Then, I do not know how to proceed. I presume
$$ n\mu =\sum^{n}_{i=1} y_i $$
You have done everything correctly. You've reached
$$0 = \frac{-1}{2} \times 2 \sum^{n}_{i=1} (y_i -\mu)$$
or, more formally (since the maximum is attained at the MLE),
$$0 = \frac{-1}{2} \times 2 \sum^{n}_{i=1} (y_i - \hat{\mu})$$
where $\hat{\mu}$ is the MLE. Then
$$0 = \sum^{n}_{i=1} (y_i - \hat{\mu}),$$
i.e.
$$0 = \sum^{n}_{i=1} y_i -\sum^{n}_{i=1} \hat{\mu}.$$
But $\hat{\mu}$ is a constant being summed $n$ times, so
$$0 = \sum^{n}_{i=1} y_i - n \hat{\mu}.$$
Finally,
$$\hat{\mu} = \frac{1}{n} \sum^{n}_{i=1} y_i = \bar{y}$$
is the MLE: the sample mean of the data is the maximum likelihood estimate of $\mu$.
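If it helps build intuition, here is a quick numerical sanity check (a sketch using simulated data and NumPy; the variable names and the simulated $\mu$, $\Sigma$ are just illustrative). It evaluates the log-likelihood, up to its constant term, at the sample mean $\bar{y}$ and at randomly perturbed candidates, and confirms $\bar{y}$ always wins:

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 200, 3
# Simulated data from a multivariate normal with an arbitrary mean.
y = rng.normal(size=(n, p)) + np.array([1.0, -2.0, 0.5])

# Any fixed positive-definite Sigma works for checking the mu-argmax.
Sigma = np.cov(y, rowvar=False)
Sigma_inv = np.linalg.inv(Sigma)

def log_likelihood(mu):
    """Log-likelihood under N(mu, Sigma), dropping the mu-free constant:
    -(1/2) * sum_i (y_i - mu)' Sigma^{-1} (y_i - mu)."""
    diffs = y - mu
    return -0.5 * np.einsum('ij,jk,ik->', diffs, Sigma_inv, diffs)

y_bar = y.mean(axis=0)

# The sample mean attains a log-likelihood at least as large as any
# perturbed candidate, as the derivation above predicts.
for _ in range(100):
    candidate = y_bar + rng.normal(scale=0.1, size=p)
    assert log_likelihood(y_bar) >= log_likelihood(candidate)

print(np.round(y_bar, 3))
```

This works because, for fixed $\Sigma$, the log-likelihood is a concave quadratic in $\mu$ with its unique maximum at $\bar{y}$, so no perturbation can improve on it.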