This is the derivation of the maximum likelihood estimates $\hat\pi_j$ for a multinomial distribution.
The multinomial log-likelihood function, with the constraint $\pi_c = 1 - \sum_{j=1}^{c-1}\pi_j$ substituted in, is
$$l(\pi)=\sum_j n_j\log\pi_j.$$
Differentiating with respect to $\pi_j$ for $j\ne c$ (the term $n_c\log\pi_c$ depends on $\pi_j$ through the constraint, contributing $-n_c/\pi_c$) and setting the result to zero gives
$$\frac{\partial l(\pi)}{\partial \pi_j}=\frac{n_j}{\pi_j}-\frac{n_c}{\pi_c}=0.$$
Then
$$\sum_j\hat\pi_j = 1 = \frac{\hat\pi_c\sum_j n_j}{n_c} = \frac{\hat\pi_c\, n}{n_c},$$
so
$$\hat\pi_c = \frac{n_c}{n},\qquad \hat\pi_j=\frac{n_j}{n}.$$
I don't understand why
$$1 = \frac{\hat\pi_c\sum_j n_j}{n_c}.$$
I'm looking for help. Thank you.
If we set the derivative of $l$ to zero and solve for $\pi_j$ with $j\in\{1,\dots,c-1\}$, then $$\hat{\pi}_j=\frac{n_j\,\hat\pi_c}{n_c}.$$ Also note that, trivially, $\hat\pi_c=\frac{n_c\hat\pi_c}{n_c}$, so the same formula holds for $j=c$. Therefore $$1=\sum_j\hat\pi_j=\hat\pi_c+\sum^{c-1}_{j=1}\hat\pi_j=\frac{n_c\hat\pi_c}{n_c}+\sum^{c-1}_{j=1}\frac{n_j\hat\pi_c}{n_c}=\sum_j\frac{n_j\hat\pi_c}{n_c}=\frac{\hat\pi_c\sum_jn_j}{n_c}.$$
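As a quick numerical sanity check (using made-up counts $n = (5, 3, 2)$, so any numbers here are illustrative, not from the question), you can verify that $\hat\pi_j = n_j/n$ does maximize the log-likelihood over the probability simplex:

```python
import numpy as np

# Hypothetical counts n_j for c = 3 categories (made-up data for illustration).
n = np.array([5.0, 3.0, 2.0])
N = n.sum()

# Closed-form MLE from the derivation above: pi_hat_j = n_j / n.
pi_hat = n / N

def log_lik(pi):
    """Multinomial log-likelihood l(pi) = sum_j n_j * log(pi_j)."""
    return float(np.sum(n * np.log(pi)))

# Sanity check: no random point on the simplex should beat pi_hat.
rng = np.random.default_rng(0)
candidates = rng.dirichlet(np.ones(len(n)), size=1000)
assert all(log_lik(p) <= log_lik(pi_hat) for p in candidates)
print(pi_hat)  # [0.5 0.3 0.2]
```

Every randomly drawn probability vector gives a log-likelihood no larger than the one at $\hat\pi = (0.5, 0.3, 0.2)$, consistent with the closed-form result.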