I am trying to derive the bias and variance of $\hat{\mu}_{MLE} = \frac{-\sum_{i=1}^{n}{\ln(x_i)}}{2\sigma^2 n}$, where $x_i \sim \mathrm{LogNorm}(\mu, \sigma^2)$. So far I have $$\mathrm{Bias} = \mu\left(\frac{1-2\sigma^2}{2\sigma^2}\right),$$ but I am at a bit of a loss for the variance calculation. I'm stuck at $$c \times \mathbb{E}\left[\left(\sum_{i=1}^{n}{\ln(x_i)}\right)^2\right],$$ where $c$ is some gnarly constant that I've decided to ignore for now. Can anyone offer some help?
Edit: I might have figured something out, but I could be wrong. If $x \sim \mathrm{LogNorm}(\mu, \sigma^2)$, then by definition $x = \mathrm{e}^y$, where $y \sim \mathrm{N}(\mu^* = \ln(\mu) - \frac{\sigma^2}{2},\ \sigma^{*2} = ...)$
The lognormal likelihood for a sample with known $\sigma$ is $$\mathcal L(\mu \mid \boldsymbol x, \sigma) \propto \exp \left(-\frac{1}{2\sigma^2} \sum_{i=1}^n (\mu - \log x_i)^2\right),$$ so the log-likelihood is $$\ell(\mu \mid \boldsymbol x, \sigma) = -\frac{1}{2\sigma^2} \sum_{i=1}^n (\mu - \log x_i)^2,$$ which has derivative $$\frac{\partial \ell}{\partial \mu} = -\frac{1}{\sigma^2} \sum_{i=1}^n (\mu - \log x_i),$$ whose critical point is at $$\hat\mu = \frac{1}{n} \sum_{i=1}^n \log x_i.$$ This should be your MLE, not what you wrote. In fact, your estimator would not make sense when $\sigma$ is very large: the $2\sigma^2 n$ in the denominator would force the estimate of $\mu$ toward zero regardless of the data.
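As a quick sanity check (a simulation sketch, not part of the derivation; assumes NumPy), note that if $x \sim \mathrm{LogNorm}(\mu, \sigma^2)$ then $\log x \sim \mathrm{N}(\mu, \sigma^2)$, so the sample mean of the logs should recover $\mu$:

```python
import numpy as np

# Monte Carlo check: for x ~ LogNormal(mu, sigma^2), log(x) ~ N(mu, sigma^2),
# so the MLE (1/n) * sum(log x_i) should be close to mu for large n.
rng = np.random.default_rng(0)
mu, sigma, n = 1.5, 2.0, 100_000  # illustrative parameter choices

x = rng.lognormal(mean=mu, sigma=sigma, size=n)
mu_hat = np.log(x).mean()  # the MLE derived above

print(mu_hat)  # should be close to mu = 1.5, within roughly sigma/sqrt(n)
```

Since $\log x_i$ are i.i.d. $\mathrm{N}(\mu, \sigma^2)$, this estimator is unbiased with variance $\sigma^2/n$, which the simulated spread of `mu_hat` across seeds would also reflect.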