I'm reading a paper by Edelman and Kostlan (https://arxiv.org/abs/math/9501224, page 8 at the bottom) about real random polynomials for my bachelor's thesis. They use the identity mentioned in the title.
Although I suspect this is elementary, I couldn't figure out why it holds. My idea was to take the logarithm and use the Taylor expansion of $\ln(1+x)$. I get \begin{align} \ln \left( \left(1+\frac{x}{n} \right)^n \right)&=n \cdot \ln \left(1+\frac{x}{n} \right)\\ &= n \cdot \left(\frac{x}{n} - \frac{x^2}{2n^2}+\frac{x^3}{3n^3}- \dots \right)\\ &= x-\frac{x^2}{2n} + O \left(\frac{1}{n^2} \right). \end{align}
But now I get \begin{align} \left(1+\frac{x}{n}\right)^n &= e^{\ln \left( \left(1+\frac{x}{n} \right)^n \right)}\\ &= e^{x-\frac{x^2}{2n} + O \left(\frac{1}{n^2} \right)}\\ &= e^{x \cdot \left(1-\frac{x}{2n}+O \left(\frac{1}{n^2} \right) \right)}, \end{align} which is not at all what I wanted. I also tried Taylor's theorem, but could not find a function that would work, and the binomial theorem, but that ended in a mess (maybe there is a certain way of index shifting that I missed). I would greatly appreciate help or a hint.
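For what it's worth, the identity does seem to check out numerically. Assuming the identity in the title is $\left(1+\frac{x}{n}\right)^n = e^x\left(1-\frac{x^2}{2n}+O\left(\frac{1}{n^2}\right)\right)$, a quick Python sketch (the names `x` and `residual` are mine):

```python
import math

def residual(x, n):
    """Difference between (1 + x/n)^n and the claimed approximation
    e^x * (1 - x^2/(2n)); should shrink like 1/n^2."""
    exact = (1 + x / n) ** n
    approx = math.exp(x) * (1 - x**2 / (2 * n))
    return exact - approx

x = 1.5
r1 = residual(x, 100)
r2 = residual(x, 200)
# If the error is O(1/n^2), doubling n should divide the residual by about 4.
print(r1 / r2)  # ratio close to 4
```

Doubling $n$ indeed divides the residual by roughly $4$, consistent with an $O(1/n^2)$ error term.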
What you did is right; now continue by expanding $e^y$ around $y=x$. You have $e^{y+\Delta y}=e^y + e^y \,\Delta y + O(\Delta y^2)$. Replace $y$ with $x$ and $\Delta y$ with $-x^2/(2n)+O(1/n^2)$.
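Spelled out, and assuming the title's identity is $\left(1+\frac{x}{n}\right)^n = e^x\left(1-\frac{x^2}{2n}+O\left(\frac{1}{n^2}\right)\right)$, the hint completes the computation above by splitting the exponential and expanding $e^t = 1 + t + O(t^2)$ at $t=0$:
\begin{align}
\left(1+\frac{x}{n}\right)^n &= e^{x-\frac{x^2}{2n}+O\left(\frac{1}{n^2}\right)}\\
&= e^x \cdot e^{-\frac{x^2}{2n}+O\left(\frac{1}{n^2}\right)}\\
&= e^x \left(1-\frac{x^2}{2n}+O\left(\frac{1}{n^2}\right)\right),
\end{align}
where the last step uses that $t = -\frac{x^2}{2n}+O\left(\frac{1}{n^2}\right)$ satisfies $t^2 = O\left(\frac{1}{n^2}\right)$ (for fixed $x$).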