Fisher information matrix of the binomial distribution


The definition of the Fisher information matrix from the book "Methods of Information Geometry" by Shun-ichi Amari is as follows: let $S=\{p_{\xi}\}$ be an $n$-dimensional statistical model, where $\xi\in E\subseteq \mathbb{R}^n$ is a parameter. The Fisher information matrix is the $n\times n$ matrix $G(\xi)=[g_{ij}(\xi)]_{n\times n}$, where $g_{ij}(\xi)=E_{\xi}[\partial_{i}\ell_{\xi}\,\partial_{j}\ell_{\xi}]$; here $E_{\xi}$ denotes expectation with respect to the distribution $p_{\xi}$, and we write $\partial_{i}=\frac{\partial}{\partial\xi^i}$ and $\ell_{\xi}=\log p(x,\xi)$.

I want to calculate the Fisher information matrix for the binomial distribution, whose probability mass function is $P(x,\xi)= {n \choose x}p^x(1-p)^{n-x}$, $x=0,1,2,\dots,n$.

My doubt is this: I would think there are two parameters here, $n$ and $p$, yet in the solutions I have seen, people compute the Fisher information treating only $p$ as a parameter. Can someone explain why? By my reasoning we should get a $2\times 2$ Fisher information matrix, but in those solutions people obtain a $1\times 1$ matrix.
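For reference, here is a minimal numerical sketch (my own, not from Amari's book) that follows the convention in those solutions: $n$ is held fixed, so $p$ is the only parameter and the Fisher information is the scalar $E_p[(\partial_p \ell)^2]$, which for the binomial should equal $n/(p(1-p))$.

```python
import math

def binom_pmf(x, n, p):
    """Binomial probability mass function P(x; n, p)."""
    return math.comb(n, x) * p**x * (1 - p)**(n - x)

def fisher_info_numeric(n, p):
    """E_p[(d/dp log P)^2], computed by summing over all outcomes x = 0..n.

    The score is d/dp [x log p + (n - x) log(1 - p)] = x/p - (n - x)/(1 - p);
    n is treated as a fixed, known constant, not a parameter.
    """
    total = 0.0
    for x in range(n + 1):
        score = x / p - (n - x) / (1 - p)
        total += binom_pmf(x, n, p) * score**2
    return total

n, p = 10, 0.3
print(fisher_info_numeric(n, p))  # expectation of the squared score
print(n / (p * (1 - p)))          # closed form n / (p(1-p)); the two agree
```

The two printed values agree, confirming the $1\times 1$ result $I(p) = n/(p(1-p))$ under the fixed-$n$ convention.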