I am reading *Explaining the Gibbs Sampler*. What I have understood so far is that this sampler allows us to generate $X_1, \ldots, X_m$ with density $f(x)$ without actually knowing what $f(x)$ is.
The author says that if we wanted to calculate the mean of $f(x)$ we could use this fact,
$$\lim_{m \rightarrow \infty} \frac 1 m \sum_{i=1}^m X_i = \int_{-\infty}^\infty x f(x) \, dx = \operatorname EX.$$
I've never seen the first equality in any text when the topic of expectation of a random variable is introduced.
How exactly is the first equality true?
This is the Strong Law of Large Numbers (SLLN): if $X_1, X_2, \ldots$ are i.i.d. with $\operatorname E|X| < \infty$, then $\frac 1 m \sum_{i=1}^m X_i \to \operatorname EX$ almost surely as $m \to \infty$. In words, the empirical mean of the sample converges to the true mean of the distribution. One technical caveat: draws from a Gibbs sampler are not independent (they form a Markov chain), so strictly speaking the result invoked here is the ergodic theorem, the Markov-chain analogue of the SLLN; the limit is the same.
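You can see both ideas at once in a small simulation. The sketch below (my own toy example, not from the paper) runs a Gibbs sampler for a bivariate normal with correlation $\rho$: the full conditionals $X \mid Y \sim N(\rho y, 1-\rho^2)$ and $Y \mid X \sim N(\rho x, 1-\rho^2)$ are all we sample from, so the joint density $f$ is never evaluated, yet the running average of the $X_i$ still converges to $\operatorname EX = 0$:

```python
import random

random.seed(0)

# Toy Gibbs sampler for a bivariate normal with correlation rho.
# Both marginals are standard normal, so E[X] = 0. We only ever
# draw from the full conditionals, never from the joint density.
rho = 0.5
sd = (1 - rho**2) ** 0.5  # conditional standard deviation

m = 200_000
x, y = 0.0, 0.0
total = 0.0
for _ in range(m):
    x = random.gauss(rho * y, sd)  # draw X | Y = y
    y = random.gauss(rho * x, sd)  # draw Y | X = x
    total += x

print(total / m)  # ergodic average, close to E[X] = 0
```

With $m = 200{,}000$ sweeps the average typically lands within a few thousandths of $0$; the convergence is slower than for i.i.d. draws because successive Gibbs samples are correlated.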