I was working on a statistical mechanics problem about a biased coin: finding the variance of the number of heads in $N$ throws:
When a biased coin is flipped, the outcome is heads with probability $p$ and tails with probability $1-p$. If this coin is flipped $N$ times, the probability that the total number of heads is $n$ is: $$p(n)=\binom{N}{n}p^n(1-p)^{N-n}$$
The most likely value of $n$ is $n = pN$, but there are fluctuations about this most likely value. Denote $n = Np + s$, and suppose that $N \gg 1$. In this limit, $p(n)$, regarded as a function of $s$, approaches a Gaussian with mean zero and some variance $\sigma^2_p$; that is,
$$\ln p(n)= \text{constant}-\frac{s^2}{2\sigma^2_p}+O(s^4) \tag 1$$
where “constant” means a term independent of $s$. Calculate $\sigma^2_p$ using the Stirling approximation and the approximations $s \ll pN$ and $s \ll (1−p)N.$
I have worked it the standard way and got the correct answer $\sigma^2_p=Np(1-p)$. However, there is another, quicker way to do this:
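For reference, the standard Stirling route can be summarized as follows (my own sketch, under the stated approximations): write
$$\ln p(n)=\ln N!-\ln n!-\ln(N-n)!+n\ln p+(N-n)\ln(1-p),$$
apply Stirling, $\ln n!\approx n\ln n-n$, and differentiate twice with respect to $n$:
$$\partial_n^2\ln p(n)=-\frac{1}{n}-\frac{1}{N-n}.$$
Evaluating at the maximum $n=Np$ (valid since $s\ll pN$ and $s\ll(1-p)N$) gives $-\frac{1}{Np(1-p)}$, and comparing with $(1)$ yields $\sigma^2_p=Np(1-p)$.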
observe from $(1)$ that for large $N$: $$\partial_n^2\ln p(n)=-\dfrac{1}{\sigma^2_p}$$
The left-hand side can be approximated by a discrete second difference (unit step in $n$):
$$\partial_n^2 \ln p(n) \approx \ln p(n+1)-2\ln p(n)+\ln p(n-1) = -\frac{1}{\sigma^2_p}$$
Using $\frac{p(n+1)}{p(n)}=\frac{N-n}{n+1}\frac{p}{1-p}$ and $\frac{p(n-1)}{p(n)}=\frac{n}{N-n+1}\frac{1-p}{p}$, the second difference becomes $\ln\frac{n(N-n)}{(n+1)(N-n+1)}\approx-\frac{1}{n}-\frac{1}{N-n}=-\frac{N}{n(N-n)}$; hence $\sigma^2_p= \frac{n(N-n)}{N}$.
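This finite-difference shortcut is easy to check numerically. The sketch below (my own, not from the problem) builds the log of the binomial pmf via `math.lgamma`, takes the discrete second difference at $n = Np$, and compares the implied variance to $\frac{n(N-n)}{N}$ and $Np(1-p)$:

```python
import math

def log_pmf(N, n, p):
    """Log of the binomial probability p(n) = C(N, n) p^n (1-p)^(N-n)."""
    return (math.lgamma(N + 1) - math.lgamma(n + 1) - math.lgamma(N - n + 1)
            + n * math.log(p) + (N - n) * math.log(1 - p))

N, p = 10_000, 0.3
n = int(N * p)  # evaluate at the most likely value n = Np

# Discrete second difference (unit step) approximates d^2/dn^2 ln p(n),
# which the Gaussian form (1) says equals -1/sigma^2.
d2 = log_pmf(N, n + 1, p) - 2 * log_pmf(N, n, p) + log_pmf(N, n - 1, p)
sigma2 = -1 / d2

print(sigma2)            # close to 2100
print(n * (N - n) / N)   # n(N-n)/N = 2100.0
print(N * p * (1 - p))   # Np(1-p), also 2100
```

The three printed values agree to a fraction of a percent, which is as close as one can expect given the $O(1/N)$ difference between $n(N-n)$ and $(n+1)(N-n+1)$.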
Then comes the part I don't understand:
This expression should be evaluated at $n = pN$,
hence $\sigma^2_p=Np(1-p)$
Why should this expression be evaluated at $n = pN$? This is very unclear to me.