I have a very simple question about the following proof, so you may not need to read all of it; the question itself is right at the end. However, I had to include a word-for-word copy of the problem and its solution to give the necessary context:
When doing a maximum likelihood fit, we often take a ‘Gaussian approximation’. This problem works through the case of a measurement from a Binomial distribution: $$L(n;p,N)=\cfrac{N!}{n!(N-n)!}p^n(1-p)^{N-n}$$ $p_0$ is defined to be the value of $p$ which gives the maximum of the likelihood, at which the likelihood has a value $L_0$.
$\fbox{$\color{purple}{\text{The objective is to find the value of the standard deviation $\sigma$ for this Gaussian approximation:}}$}$ The solution goes as follows:
Taking natural logs gives $$\ln(L) = n\ln(p) + (N-n)\ln(1-p)- \ln\left(\frac{N!}{n!(N-n)!}\right)$$ The first derivative is $$\frac{\mathrm{d}\ln(L)}{\mathrm{d}p}=\frac{n}{p}-\frac{N-n}{1-p}$$ The maximum $p_0$ is given when $\cfrac{\mathrm{d}\ln(L)}{\mathrm{d}p}=0$ $$\implies \frac{n}{p_0}=\frac{N-n}{1-p_0}\implies n-np_0=p_0N-p_0n\implies \fbox{$\color{blue}{p_0=\frac{n}{N}}$}\tag{1}$$ Taking the second derivative gives $$\frac{\mathrm{d^2}\ln(L)}{\mathrm{d}p^2}=-\frac{n}{p^2}-\frac{N-n}{(1-p)^2}$$ The Taylor expansion to second order is $$\ln(L)\approx\ln(L_0) + \frac{\mathrm{d}\ln(L)}{\mathrm{d}p}\bigg|_{p=\color{blue}{p_0}}\times\frac{\left(p-\color{blue}{p_0}\right)}{1!} + \frac{\mathrm{d^2}\ln(L)}{\mathrm{d}p^2}\bigg|_{p=\color{blue}{p_0}}\times\frac{\left(p-\color{blue}{p_0}\right)^2}{2!}$$ $$\implies\ln(L)\approx\ln(L_0) + {\left(\frac{n}{\color{blue}{p_0}}-\frac{N-n}{1-\color{blue}{p_0}}\right)}\left(p-\color{blue}{p_0}\right)-{\left(\frac{n}{\color{blue}{p_0}^2}+\frac{N-n}{(1-\color{blue}{p_0})^2}\right)}\frac{(p-\color{blue}{p_0})^2}{2}$$ $$\require{enclose}\implies\ln(L)\approx\ln(L_0) + \enclose{updiagonalstrike}{\underbrace{\left(\frac{n}{\color{blue}{\frac{n}{N}}}-\frac{N-n}{1-\color{blue}{\frac{n}{N}}}\right)}_{=0}\left(p-\color{blue}{\frac{n}{N}}\right)}-\frac12{\left(\frac{n}{\left(\color{blue}{\frac{n}{N}}\right)^2}+\frac{N-n}{\left(1-\color{blue}{\frac{n}{N}}\right)^2}\right)}\left(p-\color{blue}{\frac{n}{N}}\right)^2$$ $$\implies\ln(L)\approx\ln(L_0) -\frac12{\left(\frac{N^2}{n}+\frac{N^2}{N-n}\right)}\left(p-\frac{n}{N}\right)^2$$ $$\implies\ln(L)\approx\ln(L_0) -\left(\frac{N^3\left(p-\frac{n}{N}\right)^2}{2n(N-n)}\right)$$ $$\implies\ln\left(\frac{L}{L_0}\right)\approx -\left(\frac{N^3\left(p-\frac{n}{N}\right)^2}{2n(N-n)}\right)$$ $$\implies L\approx L_0\exp{\left(-\frac{\left(p-\frac{n}{N}\right)^2}{\left(2\times\frac{n(N-n)}{N^3}\right)}\right)}$$ So $$\sigma=\sqrt{\frac{n(N-n)}{N^3}}$$ This does not at first sight correspond to the 
standard deviation obtained from the binomial distribution. This is because that is in terms of $n$, whereas the result derived above is in terms of $p$. $\color{red}{\text{However, since in the approximation used here}}$ $\color{red}{p\approx\frac{n}{N}}$ $\color{red}{\mathrm{,}}$ $\color{red}{\text{then the standard deviation in terms of $n$ will be $N$ times the above, i.e}}$ $$\color{red}{\sigma =N\sqrt{\frac{n(N-n)}{N^3}}} =\sqrt{\frac{n(N-n)}{N}}=\sqrt{N\frac{n}{N}\left(1-\frac{n}{N}\right)}=\sqrt{Np\left(1-p\right)}$$ as required.
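As a sanity check on the derivation above, here is a minimal numerical sketch (my own, not part of the original problem, with assumed example values $N=100$, $n=30$) comparing the exact binomial likelihood to the Gaussian approximation $L_0\exp\left(-\frac{(p-p_0)^2}{2\sigma^2}\right)$ with $\sigma=\sqrt{\frac{n(N-n)}{N^3}}$:

```python
import math

# Assumed example values, not from the original problem
N, n = 100, 30

p0 = n / N                                # maximum-likelihood estimate p_0
sigma = math.sqrt(n * (N - n) / N**3)     # derived standard deviation in p

def likelihood(p):
    """Exact binomial likelihood L(n; p, N)."""
    return math.comb(N, n) * p**n * (1 - p)**(N - n)

L0 = likelihood(p0)                       # value of the likelihood at its maximum

def gaussian_approx(p):
    """Gaussian approximation L0 * exp(-(p - p0)^2 / (2 sigma^2))."""
    return L0 * math.exp(-(p - p0)**2 / (2 * sigma**2))

# Near the peak p0 = 0.3, the two curves agree closely
for p in (0.25, 0.28, 0.30, 0.32, 0.35):
    print(f"p = {p}: exact = {likelihood(p):.6f}, gaussian = {gaussian_approx(p):.6f}")
```

The agreement is tight near $p_0$ and degrades in the tails, which is exactly what a second-order Taylor expansion about the maximum should give.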
I have completely understood all of the above proof apart from the part marked $\color{red}{\mathrm{red}}$.
First of all, $p_0=\frac{n}{N}$ as given in $(1)$, so what is the justification for writing $\color{red}{p\approx\frac{n}{N}}$?
Secondly, what on earth is the author talking about when they say that the standard deviation
"$\color{red}{\text{will be $N$ times the above}}$"?
Since when is multiplying $\sigma$ by $N$ justified?
It was not given that $p_0 = \frac{n}{N}$; it was shown using the maximum likelihood method. That is the justification. Since this is the MLE, it makes sense that $p\approx p_0$.
It looks like he is saying $$\sigma^2 =\text{Var}(p) = \text{Var}(p_0) = \text{Var}\left(\frac{n}{N}\right) = \frac{1}{N^2}\text{Var}(n).$$ This implies $\text{Var}(n) = N^2\text{Var}(p)$, and finally $$\text{SD}(n) =N\times\text{SD}(p) = N \sigma. $$
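A short simulation makes the scaling concrete (my own illustrative sketch, with assumed values $N=100$, $p=0.3$): draw many binomial counts $n$, and compare $\text{SD}(n)$ with $N\times\text{SD}(n/N)$ and with $\sqrt{Np(1-p)}$.

```python
import random
import statistics

# Assumed example values, not from the answer above
random.seed(0)
N, p, trials = 100, 0.3, 20000

# Each count n is a sum of N Bernoulli(p) draws, i.e. n ~ Binomial(N, p)
counts = [sum(random.random() < p for _ in range(N)) for _ in range(trials)]
p_hats = [c / N for c in counts]          # the MLE p0 = n/N for each trial

sd_n = statistics.pstdev(counts)          # empirical SD of n
sd_p = statistics.pstdev(p_hats)          # empirical SD of n/N

print(f"SD(n) = {sd_n:.4f}")
print(f"N * SD(n/N) = {N * sd_p:.4f}")
print(f"sqrt(N p (1-p)) = {(N * p * (1 - p)) ** 0.5:.4f}")
```

Because $n = N\times\frac{n}{N}$, multiplying a random variable by the constant $N$ multiplies its standard deviation by $N$ exactly; the simulation just confirms that this matches $\sqrt{Np(1-p)}$, the binomial standard deviation in terms of $n$.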