I am trying to derive the Fisher information matrix for the multinomial distribution. I know its pmf is:
$$f(x_1,\dots,x_k;n,p_1,\dots,p_k) = \frac{\Gamma\!\left(\sum_i x_i+1\right)}{\prod_i\Gamma(x_i+1)}\prod_{i=1}^k p_i^{x_i}$$
To simplify the calculation, instead of computing $I(n,P)$ directly I use $I(n,P) = nI(1,P)$, where the single-trial pmf is $$f(x_1,\dots,x_k;1,p_1,\dots,p_k) = \prod_{i=1}^k p_i^{x_i}$$ and the Fisher information is $$ I(n,P) = -E\!\left[\frac{\partial ^2}{\partial P^2} \log f(X;n,P)\,\middle|\,P\right]$$
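Here is a sketch of how I believe the constraint should be handled: treat $p_1,\dots,p_{k-1}$ as the free parameters and eliminate $p_k = 1-\sum_{j<k}p_j$, so that for a single trial ($x$ a one-hot vector):

```latex
\log f = \sum_{i=1}^{k} x_i \log p_i,
\qquad
\frac{\partial \log f}{\partial p_j}
  = \frac{x_j}{p_j} - \frac{x_k}{p_k},
\qquad
-\frac{\partial^2 \log f}{\partial p_j \partial p_l}
  = \frac{x_j}{p_j^2}\,\delta_{jl} + \frac{x_k}{p_k^2},
\quad j,l = 1,\dots,k-1.
```

Taking expectations with $E[x_i] = p_i$ would then give $I(1,P)_{jl} = \delta_{jl}/p_j + 1/p_k$, i.e. a diagonal term plus a rank-one term from the eliminated parameter.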
I am not getting the right answer with this calculation, and I think the problem is that I am not imposing the equality constraint $\sum_i p_i = 1$ within the pmf. I would appreciate it if someone could help me with the derivation.
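For concreteness, here is a small numeric sanity check I put together (my own sketch, assuming the reduced parametrization with $p_k = 1-\sum_{j<k}p_j$) of the answer I believe the derivation should produce, $I(1,P) = \mathrm{diag}(1/p_j) + (1/p_k)\,\mathbf{1}\mathbf{1}^T$:

```python
import numpy as np

# Example probabilities; the last entry p[-1] is the eliminated parameter p_k.
p = np.array([0.2, 0.3, 0.5])
k = len(p)

def score(i):
    """Score d/dp_j log f(x; p), j = 1..k-1, when the single-trial
    outcome is x = e_i (one-hot): score_j = x_j/p_j - x_k/p_k."""
    s = np.zeros(k - 1)
    if i < k - 1:
        s[i] = 1.0 / p[i]
    else:
        s[:] = -1.0 / p[-1]
    return s

# Fisher information for one trial as E[score score^T],
# averaging over the k possible outcomes with their probabilities.
I1 = sum(p[i] * np.outer(score(i), score(i)) for i in range(k))

# Closed form I expect: diag(1/p_j) + (1/p_k) * ones matrix.
expected = np.diag(1.0 / p[:-1]) + np.ones((k - 1, k - 1)) / p[-1]
print(np.allclose(I1, expected))  # → True
```

(Multiplying `I1` by $n$ would then give $I(n,P)$, using the iid decomposition above.)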