Deriving the log-likelihood of a multivariate normal distribution with respect to $\mu$

I have a sample from each of $K$ groups, $k \in \{1,2,\dots,K\}$. I need to find the maximum likelihood estimates of $\mu^{(k)}$ and of $V$. Looking here I can see how the entire right-hand part becomes $Nd$, where $N=n_1+\dots+n_K$. However, that does not help me find an exact expression for the MLE of $\mu^{(k)}$ or of $V$.

There is 1 best solution below.

Let me reformulate the problem, to check that I have understood it correctly.
You have $K$ independent samples, where sample $i$ consists of $n_i$ i.i.d. observations $\mathbf{x}^{(i)}_1,\dots,\mathbf{x}^{(i)}_{n_i}$ with $\mathbf{X}^{(i)}_j \sim \mathcal{N}_d (\mu^{(i)},V)$ (in particular, your data are not identically distributed across groups), and you want to compute each mean and the common covariance matrix. Write $N = n_1+\dots+n_K$ for the total number of observations.
From independence of the observations it follows that the log-likelihood function is
$$ \begin{split} \ell(\mu^{(1)},\dots,\mu^{(K)},V) &= \log \prod_{i=1}^{K}\prod_{j=1}^{n_i} f_{\mathbf{X}^{(i)}_j}\big(x^{(i)}_j\big) = \sum_{i=1}^{K}\sum_{j=1}^{n_i}\bigg(-\frac{d}{2}\log(2\pi)- \frac{1}{2}\log|V|-\frac{1}{2}(x^{(i)}_j-\mu^{(i)})^T V^{-1}(x^{(i)}_j-\mu^{(i)})\bigg) \\ &= -\frac{Nd}{2}\log(2\pi)-\frac{N}{2}\log|V|-\frac{1}{2}\sum_{i=1}^{K}\sum_{j=1}^{n_i}(x^{(i)}_j-\mu^{(i)})^T V^{-1}(x^{(i)}_j-\mu^{(i)}) \end{split} $$
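As a quick sanity check (not part of the original derivation), here is a minimal NumPy/SciPy sketch that evaluates this closed-form log-likelihood and compares it with summing `scipy.stats.multivariate_normal.logpdf` over all observations; the dimension, group sizes, and generated data are arbitrary illustrative choices.

```python
import numpy as np
from scipy.stats import multivariate_normal

rng = np.random.default_rng(0)
d = 3                                   # dimension
sizes = [5, 8, 4]                       # group sizes n_1, ..., n_K (arbitrary)
mus = [rng.normal(size=d) for _ in sizes]            # one mean per group
A = rng.normal(size=(d, d))
V = A @ A.T + d * np.eye(d)                           # a common SPD covariance matrix
X = [rng.multivariate_normal(mu, V, size=ni) for mu, ni in zip(mus, sizes)]

# Closed-form log-likelihood from the formula above
N = sum(sizes)
Vinv = np.linalg.inv(V)
quad = sum(((x - mu) @ Vinv * (x - mu)).sum() for x, mu in zip(X, mus))
ll_formula = -0.5 * N * d * np.log(2 * np.pi) - 0.5 * N * np.linalg.slogdet(V)[1] - 0.5 * quad

# Reference: sum of the log-densities of every observation
ll_reference = sum(multivariate_normal(mu, V).logpdf(x).sum() for x, mu in zip(X, mus))
print(ll_formula, ll_reference)         # the two values should agree
```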
To compute the MLE for the means and covariance matrix we will differentiate and equate the derivatives to $0$.
Means
Recall that for a symmetric matrix $A$ that does not depend on $\mathbf{z}$, $\frac{\partial(\mathbf{z^TAz})}{\partial \mathbf{z}}=2\mathbf{Az}$. Only the terms of group $i$ involve $\mu^{(i)}$, hence
$$ \frac{\partial \ell}{\partial \mu^{(i)}} = \sum_{j=1}^{n_i}V^{-1}\big(x^{(i)}_j-\mu^{(i)}\big) = 0 \Rightarrow \hat{\mu}^{(i)} = \frac{1}{n_i}\sum_{j=1}^{n_i} x^{(i)}_j, $$
i.e. the sample mean of group $i$.
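In code this is just a per-group average. A minimal sketch, assuming each group's observations are stored as the rows of an array `X[i]` (that layout and the generated data are my own illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(1)
d, sizes = 3, (5, 8, 4)
X = [rng.normal(size=(ni, d)) for ni in sizes]   # illustrative data, one array per group

# MLE of mu^(i): the sample mean of group i
mu_hat = [x.mean(axis=0) for x in X]
```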
Covariance matrix
We will need several results from linear algebra:
- $\log|V| = -\log|V^{-1}|$;
- $(x-\mu)^T V^{-1}(x-\mu) = \operatorname{tr}\big((x-\mu)(x-\mu)^T V^{-1}\big)$, since a scalar equals its own trace and the trace is invariant under cyclic permutations;
- $\frac{\partial \log|A|}{\partial A} = (A^{-1})^T$ and $\frac{\partial \operatorname{tr}(BA)}{\partial A} = B^T$ (treating all entries of $A$ as free).
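These identities are easy to verify numerically; here is a small finite-difference sketch for the determinant rule (the test matrix and step size are arbitrary choices of mine):

```python
import numpy as np

rng = np.random.default_rng(2)
B = rng.normal(size=(3, 3))
A = B @ B.T + 3 * np.eye(3)        # a symmetric positive definite test matrix
eps = 1e-5

# Central finite differences of log|A| with respect to each entry A[i, j]
grad = np.zeros_like(A)
for i in range(3):
    for j in range(3):
        E = np.zeros_like(A)
        E[i, j] = eps
        grad[i, j] = (np.linalg.slogdet(A + E)[1] - np.linalg.slogdet(A - E)[1]) / (2 * eps)

print(np.max(np.abs(grad - np.linalg.inv(A).T)))   # should be close to zero
```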
The log-likelihood function can then be rewritten as
$$ \ell =-\frac{Nd}{2}\log(2\pi)+\frac{N}{2}\log|V^{-1}|-\frac{1}{2}\sum_{i=1}^{K}\sum_{j=1}^{n_i}\operatorname{tr}\Big(\big(x^{(i)}_j-\mu^{(i)}\big)\big(x^{(i)}_j-\mu^{(i)}\big)^T V^{-1}\Big). $$
Differentiating with respect to $V^{-1}$, and using the symmetry of $V$ and of each outer product $\big(x^{(i)}_j-\mu^{(i)}\big)\big(x^{(i)}_j-\mu^{(i)}\big)^T$ to drop the transposes, we end up with
$$ \frac{\partial \ell}{\partial V^{-1}} = \frac{N}{2}V -\frac{1}{2}\sum_{i=1}^{K}\sum_{j=1}^{n_i}\big(x^{(i)}_j-\mu^{(i)}\big)\big(x^{(i)}_j-\mu^{(i)}\big)^T = 0 \Longrightarrow \hat{V} = \frac{1}{N}\sum_{i=1}^{K}\sum_{j=1}^{n_i}\big(x^{(i)}_j-\hat{\mu}^{(i)}\big)\big(x^{(i)}_j-\hat{\mu}^{(i)}\big)^T, $$
i.e. the pooled within-group sample covariance with denominator $N$.
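Putting both estimators together, a minimal self-contained sketch (again with arbitrary illustrative data; the variable names are my own):

```python
import numpy as np

rng = np.random.default_rng(3)
d, sizes = 3, (5, 8, 4)
X = [rng.normal(size=(ni, d)) for ni in sizes]   # illustrative data, one array per group
N = sum(sizes)

# Group means (MLE)
mu_hat = [x.mean(axis=0) for x in X]

# Pooled covariance MLE: average outer product of the within-group residuals
V_hat = sum((x - mu).T @ (x - mu) for x, mu in zip(X, mu_hat)) / N
print(V_hat)
```

Note that the MLE divides by $N$; dividing by $N-K$ instead would give the usual unbiased pooled covariance estimator.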