What is the Maximum Likelihood Estimator of the variance of a sample $X_1\dots X_n$ of i.i.d. variables with normal distribution $X_i\sim \mathcal{N}(\mu_0,\sigma^2)$ when the mean $\mu_0$ is known? Show that this estimator is unbiased.
I've worked out the general case $X_i\sim \mathcal{N}(\mu,\sigma^2)$ with unknown mean $\mu$, using the likelihood function. In that case, the maximum likelihood estimator of the variance turns out to be biased. I'm stuck on how to use the known value of the mean to obtain an unbiased estimator.
The log-likelihood of a Gaussian sample $X_1 \dots X_n$ with $X_i\sim \mathcal{N}(\mu_0,\sigma^2)$ is \begin{equation} L = -\frac{n}{2} \ln(2\pi) - \frac{n}{2} \ln\sigma^2 - \frac{1}{2\sigma^2} \sum_{i=1}^n \left(X_i - \mu_0\right)^2 \, , \end{equation} where $\sigma^2$ is the only parameter to estimate. Its derivative with respect to $\sigma^2$, \begin{equation} \frac{\partial L}{\partial \sigma^2} = -\frac{n}{2\sigma^2} + \frac{1}{2\sigma^4} \sum_{i=1}^n \left(X_i - \mu_0\right)^2 \, , \end{equation} vanishes at $\sigma^2 = \hat{\sigma}^2_n$, with \begin{equation} \hat{\sigma}^2_n = \frac{1}{n} \sum_{i=1}^n \left(X_i - \mu_0\right)^2 \, . \end{equation} One checks (e.g. from the sign of the second derivative) that $\sigma^2 = \hat{\sigma}^2_n$ corresponds to a maximum of the log-likelihood, so $\hat{\sigma}^2_n$ defines the maximum likelihood estimator. By linearity of expectation, \begin{equation} E(\hat{\sigma}^2_n) = \frac{1}{n} \sum_{i=1}^n E\left(\left(X_i - \mu_0\right)^2\right) . \end{equation} Since $\mu_0 = E(X_i)$, each term satisfies $E\left(\left(X_i - \mu_0\right)^2\right) = \operatorname{Var}(X_i) = \sigma^2$. Hence $E(\hat{\sigma}^2_n) = \sigma^2$: the estimator is unbiased.
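As a quick sanity check, here is a small Monte Carlo sketch of the known-mean estimator (the particular values of `mu0`, `sigma2`, `n`, `trials`, and the helper name `mle_known_mean` are illustrative assumptions, not part of the derivation):

```python
import random

random.seed(0)

mu0, sigma2 = 2.0, 4.0   # known mean and true variance (illustrative values)
n, trials = 10, 20000    # small sample size, many Monte Carlo repetitions

def mle_known_mean(xs, mu0):
    # MLE of the variance when the mean mu0 is known:
    # (1/n) * sum of (x_i - mu0)^2
    return sum((x - mu0) ** 2 for x in xs) / len(xs)

estimates = []
for _ in range(trials):
    xs = [random.gauss(mu0, sigma2 ** 0.5) for _ in range(n)]
    estimates.append(mle_known_mean(xs, mu0))

avg = sum(estimates) / trials
print(avg)  # should be close to sigma2 = 4.0, consistent with unbiasedness
```

The average of the estimator over many samples settles near the true variance, as the expectation computed above predicts.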
For comparison, when the mean is unknown, the log-likelihood \begin{equation} L = -\frac{n}{2} \ln(2\pi) - \frac{n}{2} \ln\sigma^2 - \frac{1}{2\sigma^2} \sum_{i=1}^n \left(X_i - \mu\right)^2 \end{equation} is maximized with respect to both parameters $(\mu,\sigma^2)$. The maximum likelihood estimators are \begin{equation} \begin{aligned} &\tilde{\mu}_n = \overline{X}_n = \frac{1}{n} \sum_{i=1}^n X_i\, ,\\ &\tilde{\sigma}^2_n = \frac{1}{n} \sum_{i=1}^n \left(X_i - \overline{X}_n\right)^2\, , \end{aligned} \end{equation} and the variance estimator $\tilde{\sigma}^2_n$ is biased: $E(\tilde{\sigma}^2_n) = \frac{n-1}{n}\sigma^2 \neq \sigma^2$, because estimating the mean from the same sample removes one degree of freedom.
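The bias in the unknown-mean case can be seen with the same kind of Monte Carlo sketch (again, the numeric values and the helper name `mle_unknown_mean` are illustrative assumptions):

```python
import random

random.seed(0)

mu, sigma2 = 2.0, 4.0    # true mean and variance (illustrative values)
n, trials = 10, 20000    # small n makes the (n-1)/n factor clearly visible

def mle_unknown_mean(xs):
    # MLE of the variance when the mean is also estimated from the sample:
    # (1/n) * sum of (x_i - xbar)^2
    xbar = sum(xs) / len(xs)
    return sum((x - xbar) ** 2 for x in xs) / len(xs)

avg = sum(
    mle_unknown_mean([random.gauss(mu, sigma2 ** 0.5) for _ in range(n)])
    for _ in range(trials)
) / trials

print(avg)  # close to (n-1)/n * sigma2 = 0.9 * 4.0 = 3.6, not 4.0
```

With $n = 10$ the average lands near $3.6$ rather than $4.0$, matching the factor $\frac{n-1}{n}$ derived above; multiplying $\tilde{\sigma}^2_n$ by $\frac{n}{n-1}$ recovers the usual unbiased sample variance.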