While reading "Pattern Recognition and Machine Learning" (Bishop), I came across the following equation (Equation 2.134, Chapter 2):$$ - \lim_{N \rightarrow \infty } \frac{1}{N} \sum_{n=1}^{N} \frac{\partial}{\partial \theta} \ln p(x_n|\theta) = \mathbb{E}_x \left[ - \frac{\partial}{\partial \theta} \ln p(x|\theta) \right]$$
I thought that since $\lim_{N \rightarrow \infty } \frac{1}{N} = 0$, the left-hand side should simply become $0$. How does the limit on the LHS become the expectation on the RHS?
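I did try a quick numerical experiment (a sketch of my own, using NumPy, with an exponential distribution as an arbitrary stand-in for the summands): the running average $\frac{1}{N}\sum_{n=1}^{N} x_n$ seems to approach the expectation $\mathbb{E}[x]$ rather than $0$, presumably because the sum itself grows with $N$. But I don't understand why this holds in general.

```python
import numpy as np

rng = np.random.default_rng(0)

# Samples from an exponential distribution with mean 2.0;
# the 1/N factor goes to 0, but the sum grows like N,
# so the average tends to E[x] = 2.0 rather than 0.
for N in [10, 1_000, 100_000]:
    x = rng.exponential(scale=2.0, size=N)
    print(N, x.mean())
```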