Let $X_1, \dots, X_n$ be independent draws from $N(\mu,\sigma^2)$.
It is a well-known fact that the MLE $$ S_n^2= \frac 1 n\sum_{i=1}^n(X_i - \overline X_n)^2 $$ is a biased estimator of $\sigma^2$.
If we suppose $\mu = 0$ is known, then I find it is unbiased. Am I wrong? Thanks!
If you somehow know the value of $\mu,$ regardless of whether it's $0$ or not, then you should use that instead of the estimate $\bar X_n,$ and in that case the MLE of $\sigma^2$ is $\displaystyle \frac 1 n \sum_{i=1}^n (X_i-\mu)^2,$ and that is unbiased. To see that it's unbiased, just observe that $\operatorname E((X_i-\mu)^2) = \sigma^2.$
But $\displaystyle \frac 1 n \sum_{i=1}^n (X_i-\bar X)^2 $ still has expected value $\frac{n-1} n \sigma^2.$ If $\mu$ is known, then that estimator is still biased but is not the MLE.
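The factor $\frac{n-1} n$ comes from the standard decomposition
$$ \sum_{i=1}^n (X_i-\bar X)^2 = \sum_{i=1}^n (X_i-\mu)^2 - n(\bar X-\mu)^2. $$
Taking expectations, and using $\operatorname E((X_i-\mu)^2)=\sigma^2$ together with $\operatorname E((\bar X-\mu)^2) = \operatorname{Var}(\bar X) = \sigma^2/n,$ we get
$$ \operatorname E\left(\sum_{i=1}^n (X_i-\bar X)^2\right) = n\sigma^2 - n\cdot\frac{\sigma^2} n = (n-1)\sigma^2, $$
and dividing by $n$ gives $\frac{n-1} n \sigma^2.$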
To say that $\mu$ is “known” simply means the family of distributions you're working with is $\left\{ N(\mu,\sigma^2): \sigma^2>0 \right\}$ rather than $\left\{ N(\mu,\sigma^2) : \sigma^2>0\ \&\ \mu\in\mathbb R \right\}.$ The likelihood function is then a function of $\sigma$ or of $\sigma^2$ rather than of two variables, one of which is $\mu.$
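If you want to see the two expectations side by side, here is a quick Monte Carlo sketch (my own check, not part of the argument above; the parameter values are arbitrary):

```python
import random

# Compare the two estimators of sigma^2 for draws from N(mu, sigma^2)
# when mu happens to be known.
random.seed(0)
mu, sigma, n, reps = 3.0, 2.0, 5, 200_000  # so sigma^2 = 4

sum_known, sum_centered = 0.0, 0.0
for _ in range(reps):
    x = [random.gauss(mu, sigma) for _ in range(n)]
    xbar = sum(x) / n
    # MLE when mu is known: (1/n) * sum (x_i - mu)^2 -- unbiased for sigma^2
    sum_known += sum((xi - mu) ** 2 for xi in x) / n
    # Plug-in with xbar: (1/n) * sum (x_i - xbar)^2 -- mean (n-1)/n * sigma^2
    sum_centered += sum((xi - xbar) ** 2 for xi in x) / n

print(sum_known / reps)     # close to 4.0  (= sigma^2)
print(sum_centered / reps)  # close to 3.2  (= (n-1)/n * sigma^2 with n = 5)
```

With $n=5$ the bias is large enough to see clearly: the second average settles near $\frac 4 5 \cdot 4 = 3.2$ rather than $4.$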