I'm trying to find the MLE of the multivariate normal distribution MVN$(\mu, \Sigma)$, i.e. $N_k(\mu, \Sigma)$, based on a random sample $X_i$, $1\le i \le n$.
It was easy to get $\widehat{\mu}= \bar{X}$ and $\hat{\Sigma} = \frac{1}{n} \sum_i (X_i - \bar{X})(X_i - \bar{X})'$ by using matrix differentiation.
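As a quick numerical sanity check of these closed-form estimates (a sketch assuming NumPy; the sample size, parameters, and variable names here are my own, not from the question):

```python
import numpy as np

# Hypothetical example: draw a large sample from a bivariate normal
rng = np.random.default_rng(0)
mu = np.array([1.0, -2.0])
Sigma = np.array([[2.0, 0.5],
                  [0.5, 1.0]])
n = 10_000
X = rng.multivariate_normal(mu, Sigma, size=n)  # shape (n, k)

# MLEs: the sample mean and the biased (1/n) sample covariance
mu_hat = X.mean(axis=0)
centered = X - mu_hat
Sigma_hat = centered.T @ centered / n

# With n this large, both estimates should be close to the truth
print(mu_hat)
print(Sigma_hat)
```

With `n = 10_000` the estimates land close to $(\mu, \Sigma)$; note the $1/n$ (not $1/(n-1)$) divisor, since the MLE of $\Sigma$ is the biased sample covariance.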
However, to be rigorous I need to justify the following:

1. The log-likelihood function is continuous and differentiable with respect to the parameters $(\mu, \Sigma)$.
2. The log-likelihood function goes to $-\infty$ as the parameters approach the boundary of the parameter space.

As far as I know, negative definiteness of the second derivative of the log-likelihood (the observed information), together with condition #2, is sufficient for the existence of a unique MLE.
Here's my opinion.
About #1: Since the log-likelihood is quadratic in $\mu$, it is clearly continuous and differentiable w.r.t. $\mu$.
However, I can't explain why the determinant of the covariance matrix, which appears in the log-likelihood, is continuous and differentiable w.r.t. $\Sigma$.
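Regarding this point, one standard argument (a sketch, not the only route): the determinant is a polynomial in the entries of $\Sigma$, hence continuous and smooth everywhere; on the open cone of positive definite matrices it is strictly positive, so $\log|\Sigma|$ is smooth there as well. Its differential follows from Jacobi's formula $d|\Sigma| = |\Sigma|\operatorname{tr}(\Sigma^{-1}\,d\Sigma)$:

$$
d\log|\Sigma| = \operatorname{tr}\!\left(\Sigma^{-1}\,d\Sigma\right),
\qquad
\frac{\partial}{\partial \Sigma}\log|\Sigma| = \Sigma^{-1}
\quad (\Sigma \succ 0),
$$

where the gradient formula treats the entries of $\Sigma$ as unconstrained (the symmetric-parameterization version differs by a factor on the off-diagonal terms).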
About #2: I also understand this for $\mu$, but not for $\Sigma$. In other words, I wonder how to show that $-\frac{n}{2}\log|\Sigma|-\frac{1}{2}\sum_i (X_i-\bar{X})'\Sigma^{-1}(X_i-\bar{X})$ goes to $-\infty$ as $\Sigma$ approaches the boundary of the set of positive definite matrices.
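One way to make this precise (a sketch, assuming $S=\sum_i (X_i-\bar{X})(X_i-\bar{X})'$ is positive definite, which holds almost surely when $n > k$): write the quadratic term as a trace and bound the log-likelihood by a function of the eigenvalues $\lambda_1,\dots,\lambda_k$ of $\Sigma$. Using $\operatorname{tr}(\Sigma^{-1}S)\ge \lambda_{\min}(S)\operatorname{tr}(\Sigma^{-1})$,

$$
\ell(\Sigma) = -\frac{n}{2}\log|\Sigma| - \frac{1}{2}\operatorname{tr}(\Sigma^{-1}S)
\;\le\; -\frac{n}{2}\sum_{j=1}^k \log\lambda_j - \frac{\lambda_{\min}(S)}{2}\sum_{j=1}^k \frac{1}{\lambda_j}.
$$

Each summand $-\frac{n}{2}\log\lambda_j - \frac{\lambda_{\min}(S)}{2\lambda_j}$ tends to $-\infty$ both as $\lambda_j \to 0^+$ (the $1/\lambda_j$ term dominates) and as $\lambda_j \to \infty$ (the $-\log\lambda_j$ term dominates), so $\ell(\Sigma) \to -\infty$ as $\Sigma$ approaches the boundary of the positive definite cone in either direction.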
Thanks for any comment in advance.
This article (much of which was written by me) explains how to find the MLE of $\Sigma$.
Sufficient conditions for the existence and uniqueness of the MLE are not necessary conditions; thus it may be unnecessary to prove that these sufficient conditions hold.