Solving Lagrange equation with linear constraint


Rolls of a die with unknown bias:
To find the maximum likelihood solution, we maximize the sum of the log-likelihoods of the individual data points $x_i \in \{1,2,\dots,6\}$ with respect to the parameters $\lambda_1,\dots,\lambda_6$:

$L = \sum_{k=1}^6 N_k \log \lambda_k + \nu \Big(\sum_{k=1}^6 \lambda_k - 1 \Big)$

where $\nu$ is a Lagrange multiplier enforcing the constraint $\sum_k \lambda_k = 1$.

Differentiating $L$ with respect to $\lambda_k$ and $\nu$, setting the derivatives equal to $0$, and solving for $\lambda_k$ should lead to:

$\hat{\lambda}_k = \frac {N_k}{\sum_{m=1}^6 N_m}$

However, I can't see how. Differentiating with respect to $\lambda_k$ leaves

$\frac {N_k} {\lambda_k} + \nu \stackrel{!}{=} 0$

and with respect to $\nu$ gives me

$\sum_{k=1}^6 \lambda_k - 1 \stackrel{!}{=} 0$

The first problem is that I cannot eliminate $\nu$ from the first equation, since it no longer appears in the second equation.

Best answer:

From the first equation, $\lambda_k = -\frac{N_k}{\nu}$. Substituting this into the second equation gives $\sum_{k=1}^6 -\frac{N_k}{\nu} = 1$, so $\nu = -\sum_{m=1}^6 N_m$. Substituting back yields the expected result, $\hat{\lambda}_k = \frac{N_k}{\sum_{m=1}^6 N_m}$.
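A quick numerical sanity check of this result (the counts $N_k$ below are made up for illustration): the closed-form estimate equals $-N_k/\nu$ with $\nu = -\sum_m N_m$, satisfies the constraint, and attains a higher log-likelihood than, e.g., the uniform distribution.

```python
import math

# Hypothetical observed counts for faces 1..6 (not from the original post)
N = [12, 8, 10, 9, 7, 14]
total = sum(N)

# Closed-form MLE: lambda_k = N_k / sum_m N_m
lam_hat = [n / total for n in N]

# The multiplier that solves the constraint: nu = -sum_m N_m
nu = -total
# Each lambda_k = -N_k / nu recovers the same estimate
assert all(abs(l - (-n / nu)) < 1e-12 for l, n in zip(lam_hat, N))
# The constraint sum_k lambda_k = 1 holds
assert abs(sum(lam_hat) - 1.0) < 1e-12

def log_lik(lam):
    """Log-likelihood sum_k N_k * log(lambda_k) from the post."""
    return sum(n * math.log(l) for n, l in zip(N, lam))

# The MLE should score at least as high as any other valid distribution,
# e.g. the uniform one (strictly higher here, since the counts are non-uniform)
uniform = [1 / 6] * 6
assert log_lik(lam_hat) > log_lik(uniform)
```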