Let us define the moment-generating function $\Gamma$ below:
$$\Gamma(\lambda) = \langle e^{\lambda x}\rangle = \int_{-\infty}^\infty dx \, e^{\lambda x} P(x)$$
where the angle brackets $\langle \cdot \rangle$ denote an average, $\lambda$ is treated as a constant, and $x$ is a random variable. $P(x)$ is the Gaussian distribution, defined by
$$P(x) = \frac{1}{\sqrt{2\pi\sigma^2}} e^{-(x - \langle x \rangle)^2 / 2\sigma^2}$$
where $\sigma$ is the standard deviation and $\langle x \rangle$ is the mean of $x$.
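(For reference, with this Gaussian $P(x)$ the integral can be done in closed form by completing the square in the exponent; this is a standard result:
$$\Gamma(\lambda) = \frac{1}{\sqrt{2\pi\sigma^2}}\int_{-\infty}^\infty dx\, e^{\lambda x}\, e^{-(x - \langle x \rangle)^2 / 2\sigma^2} = e^{\lambda \langle x \rangle + \lambda^2 \sigma^2 / 2}.$$
)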
I now want to define the form of $x$. If I have a sequence of random variables $x_1, x_2, x_3, \ldots, x_N$, I can require $x$ to have the form:
$$ x = \frac{1}{N} \sum_{i=1}^N x_i$$
If I put this $x$ into $\Gamma(\lambda)$,
$$\Gamma(\lambda) = \langle e^{\lambda x} \rangle = \langle e^{\frac{\lambda}{N} \sum_{i=1}^N x_i} \rangle$$
Somehow, I am supposed to turn this result into
$$\langle e^{\frac{\lambda}{N} \sum_{i=1}^N x_i} \rangle = \langle e^{\frac{\lambda}{N} x_1} \rangle ^N$$
How do I prove this last line? How does the $N$ come out of the average?
Suppose the $x_i$ are independent and identically distributed, and let $G(x)$ be any function satisfying $G(x+y) = G(x)G(y)$ for all $x, y$ (the exponential has exactly this property). Then the expectation $$\langle G(x_1 + x_2 + \cdots + x_N)\rangle$$ factors as $$\langle G(x_1)G(x_2)\cdots G(x_N)\rangle,$$ and, due to independence, this becomes $$\langle G(x_1) \rangle \langle G(x_2) \rangle \cdots \langle G(x_N) \rangle.$$ Since the $x_i$ are identically distributed, each expectation is equal, and the product is simply $\langle G(x_1) \rangle^N$.
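This factorization can be checked numerically. The sketch below assumes a toy discrete distribution (the values and probabilities are arbitrary choices for illustration) and compares $\langle e^{t(x_1 + \cdots + x_N)}\rangle$, computed exactly by summing over the joint distribution of $N$ independent copies, against $\langle e^{t x_1}\rangle^N$:

```python
from itertools import product
import math

# A toy discrete distribution (arbitrary choice for illustration;
# the factorization holds for any distribution whose MGF exists).
values = [-1.0, 0.0, 2.0]
probs = [0.2, 0.5, 0.3]

def mgf_single(t):
    """E[e^{t X}] for one variable, computed exactly."""
    return sum(p * math.exp(t * v) for p, v in zip(probs, values))

def mgf_sum(t, N):
    """E[e^{t (X_1 + ... + X_N)}], summing over the joint distribution
    of N independent, identically distributed copies."""
    total = 0.0
    for outcome in product(range(len(values)), repeat=N):
        p_joint = math.prod(probs[i] for i in outcome)  # independence
        s = sum(values[i] for i in outcome)
        total += p_joint * math.exp(t * s)
    return total

t, N = 0.7, 4
print(mgf_sum(t, N), mgf_single(t) ** N)  # equal to floating-point precision
```

Here $t$ plays the role of $\lambda/N$ in the question's notation; the two quantities agree because the joint probability of independent variables is the product of the individual probabilities.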
In more traditional statistical notation, and with the choice of function $G = \exp$, we have for two independent (but not necessarily identically distributed) random variables $X_1, X_2$ $$M_{X_1+X_2}(t) = \operatorname{E}[e^{t(X_1 + X_2)}] = \operatorname{E}[e^{tX_1} e^{tX_2}] \overset{\text{ind}}{=} \operatorname{E}[e^{tX_1}]\operatorname{E}[e^{tX_2}] = M_{X_1}(t) M_{X_2}(t), $$ where $M_X(t)$ is the moment generating function of $X$. Then if $X_1$ and $X_2$ are identically distributed, their MGFs are equal.