I am working on a problem of modelling a rubber molecule as a one-dimensional chain of $N=N_{+}+N_{-}$ links, each of length $a$, where $N_{+}$ links point in the positive $x$-direction and $N_{-}$ links point in the negative $x$-direction.
It is trivial to prove that:
$$L = a(N_{+} - N_{-})$$
where $L$ is the overall length of the rubber molecule. We can also see that the number of ways of arranging the links to achieve a length $L$ is given by:
$$\Omega(L)=\frac{N!}{N_{+}!N_{-}!}=\binom{N}{N_{+}}$$
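(For small $N$ this counting formula is easy to verify by brute force; a quick Python sketch, taking $a = 1$ so that $L$ is just the signed step sum:)

```python
from itertools import product
from math import comb

# Enumerate all 2^N chains of +1/-1 steps (a = 1) and count how many
# reach each total length L; compare against the binomial coefficient.
N = 10
counts = {}
for chain in product((+1, -1), repeat=N):
    L = sum(chain)
    counts[L] = counts.get(L, 0) + 1

# From L = N_+ - N_- and N = N_+ + N_-: N_+ = (N + L) / 2.
for L, count in sorted(counts.items()):
    n_plus = (N + L) // 2
    assert count == comb(N, n_plus)
```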
However, I am asked to prove that the entropy (defined by $S = k_{B}\ln(\Omega(L))$) can be written approximately as:
$$S\approx N k_{B}\left[\ln(2)-\frac{L^{2}}{2N^{2} a^{2}}\right]\tag{1}$$
By Stirling's approximation ($\ln n! \approx n\ln n - n$) we have:
$$S \approx -N k_{B}\left[\frac{N_{+}}{N}\ln\left(\frac{N_{+}}{N}\right)+\left(1-\frac{N_{+}}{N}\right)\ln\left(1-\frac{N_{+}}{N}\right)\right]$$
But I cannot see a way of relating this to the approximation for entropy given. I tried a Taylor expansion of the logarithms but the algebra quickly became messy and didn't provide anything which looked like $(1)$.
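As a sanity check, the exact $\ln\Omega$ does track the claimed approximation numerically (a quick Python sketch, taking $a = 1$ and working in units of $k_B$; the residual discrepancy is a subleading $O(\ln N)$ Stirling correction):

```python
from math import comb, log

# Compare the exact entropy S/k_B = ln C(N, N_plus) with the target
# approximation N*(ln 2 - L^2 / (2 N^2 a^2)), taking a = 1.
N = 1000
for n_plus in (500, 520, 550):
    L = 2 * n_plus - N          # from L = a*(N_plus - N_minus), a = 1
    exact = log(comb(N, n_plus))
    approx = N * (log(2) - L**2 / (2 * N**2))
    print(f"L={L:4d}  exact={exact:10.3f}  approx={approx:10.3f}")
```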
Using Stirling's approximation, $$ S = k_B \ln{N \choose N_+} \approx k_B N \,H\left(p\right) $$
where $H(p)$ is the binary entropy function (in nats) and $$p=\frac{N_+}{N}=\frac{1}{2}\left(1+\frac{L}{aN}\right)$$
Now, if we make the (additional) assumption $\frac{L}{aN} \ll 1$, we can Taylor-expand $H(p)$ around $p=1/2$, so that $$H(p)\approx \ln 2 - \frac{(1-2p)^2}{2} = \ln 2 - \frac{L^2}{2 (aN)^2} $$
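This second-order expansion is easy to check numerically (a short Python sketch; the sample values of $p$ are arbitrary):

```python
from math import log

def H(p):
    """Binary entropy function in nats."""
    return -p * log(p) - (1 - p) * log(1 - p)

# Compare H(p) against its second-order Taylor expansion around p = 1/2.
for p in (0.5, 0.51, 0.55, 0.6):
    taylor = log(2) - (1 - 2 * p) ** 2 / 2
    print(f"p={p:.2f}  H={H(p):.6f}  taylor={taylor:.6f}")
```

The agreement is excellent near $p = 1/2$ and degrades as $(p - 1/2)^4$, which is exactly why the $\frac{L}{aN} \ll 1$ assumption is needed.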
Alternatively, as pointed out in the comments, one can use the CLT approximation of the binomial distribution by a Gaussian, so $$2^{-N}{N \choose N_+} \approx \sqrt{\frac{2}{\pi N}} \exp\left(-\frac{2(N_+-N/2)^2}{N}\right)$$
Taking the logarithm and dropping the subleading $\tfrac{1}{2}\ln$ prefactor gives $$ \ln {N \choose N_+} \approx N \left(\ln 2 - 2\left(\frac{N_+}{N}-\frac{1}{2}\right)^2\right) = N \left( \ln 2 - \frac{L^2}{2 (aN)^2}\right)$$
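The quality of this Gaussian approximation is also easy to probe directly (a short Python sketch; $N = 100$ is an arbitrary choice):

```python
from math import comb, exp, pi, sqrt

# Check the Gaussian (CLT) approximation to the symmetric binomial:
#   2^{-N} C(N, k) ~ sqrt(2/(pi*N)) * exp(-2*(k - N/2)^2 / N)
N = 100
for k in (50, 55, 60):
    exact = comb(N, k) / 2**N
    gauss = sqrt(2 / (pi * N)) * exp(-2 * (k - N / 2) ** 2 / N)
    print(f"k={k}  exact={exact:.6f}  gauss={gauss:.6f}")
```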