I've been thinking about the following. I know from the strong law of large numbers that if $X_1,X_2,\ldots$ is an infinite sequence of independent and identically distributed copies of an integrable $X:\Omega\to\mathbb{R}$, then
$$\lim_{n\to\infty} \frac{1}{n} \sum_{i=1}^n X_i = \mathbb{E}[X] \quad \text{almost surely.}$$
Suppose we have a sequence of convex functions $f_1, f_2, \ldots$ converging pointwise from below to $f:\mathbb{R}\to\mathbb{R}$, i.e. $f_i \leq f$ for every $i$. Assume that $f_i(X_i)$ and $f(X)$ are integrable. Is it then true that
$$\lim_{n\to\infty} \frac{1}{n} \sum_{i=1}^n f_i(X_i) = \mathbb{E}[f(X)]$$
holds?
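Before attempting a proof, here is a quick numerical sanity check on one concrete instance (my own choice of $f_i$ and of the distribution of $X$, purely for illustration, so this is not evidence for the general claim): take $f(x) = x^2$, $f_i(x) = x^2 - 1/i$ (convex, $f_i \leq f$, $f_i \to f$ pointwise) and $X \sim \mathcal{N}(0,1)$, so that $\mathbb{E}[f(X)] = 1$.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

# X_1, ..., X_n i.i.d. standard normal, so E[f(X)] = E[X^2] = 1
x = rng.standard_normal(n)
i = np.arange(1, n + 1)

# f_i(x) = x^2 - 1/i: convex, f_i <= f, and f_i -> f pointwise
terms = x**2 - 1.0 / i

# running averages (1/n) * sum_{i<=n} f_i(X_i)
running_avg = np.cumsum(terms) / i
print(running_avg[-1])  # close to E[f(X)] = 1 in this example
```

In this toy case the running average does settle near $1$, which is consistent with (but of course does not prove) the conjectured limit.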
Edit: I added the convexity and integrability assumptions above.
My attempt at some theorems:
Theorem 1: $\lim_{n\to\infty}\frac{1}{n}\sum_{i=1}^n\mathbb{E}[f_i(X_i)] = \mathbb{E}[f(X)]$
Proof: The negative part $\min\{0,f_i\}$ of each $f_i$ is Lipschitz continuous; otherwise the epigraph of $f_i$ would not be convex. Using this and the fact that $f_i \leq f$, for every $i$,
$$\left|f_i(X)\right| \leq \sup_{j\geq 1}|f_j(0)| + C|X| + |f(X)|,$$
where $C = \sup_i C_i$ and $C_i$ is the Lipschitz constant of $\min\{0,f_i\}$. Note that $C < \infty$, since $\min\{0,f_i\}$ converges pointwise to $\min\{0,f\}$, which is Lipschitz continuous; similarly $\sup_j |f_j(0)| < \infty$, since $f_j(0) \to f(0)$. The right-hand side is integrable, so by the dominated convergence theorem $\mathbb{E}[f_i(X)] \to \mathbb{E}[f(X)]$, and since the Cesàro averages of a convergent sequence converge to the same limit, we have:
$$\lim_{n\to\infty}\frac{1}{n}\sum_{i=1}^n\mathbb{E}[f_i(X)] = \mathbb{E}[f(X)].$$
Since $X_i$ and $X$ are identically distributed, $\mathbb{E}[f_i(X_i)] = \mathbb{E}[f_i(X)]$, and substituting this into the limit above gives the desired result. QED.
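In the toy example above (again my own choice, not part of the question), Theorem 1 can be checked in closed form: with $f_i(x) = x^2 - 1/i$ and $X \sim \mathcal{N}(0,1)$ we get $\mathbb{E}[f_i(X)] = 1 - 1/i$ exactly, so the Cesàro mean is $1 - H_n/n$ with $H_n$ the $n$-th harmonic number, which tends to $1 = \mathbb{E}[f(X)]$.

```python
import numpy as np

# E[f_i(X)] = 1 - 1/i exactly for f_i(x) = x^2 - 1/i, X ~ N(0,1),
# so the Cesaro mean in Theorem 1 equals 1 - H_n/n.
for n in (10, 1_000, 100_000):
    i = np.arange(1, n + 1)
    cesaro = np.mean(1.0 - 1.0 / i)
    print(n, cesaro)  # approaches E[f(X)] = 1 as n grows
```

Since $H_n/n \to 0$, the Cesàro mean converges to $1$ even though every individual $\mathbb{E}[f_i(X)]$ sits strictly below it.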
Theorem 2: $\lim_{n\to\infty} \frac{1}{n} \sum_{i=1}^n f_i(X_i) = \mathbb{E}[f(X)]$ with probability one.
Proof: Define $h_i(X_i) = f(X_i) - f_i(X_i)$. Since $f_i(X_i) + h_i(X_i) = f(X_i)$ is an i.i.d. copy of $f(X)$, the strong law of large numbers gives
$$\lim_{n\to\infty} \frac{1}{n}\sum_{i=1}^n \bigl(f_i(X_i) + h_i(X_i)\bigr) = \mathbb{E}[f(X)]$$
with probability one. It therefore remains to show that $\lim_{n\to\infty} \frac{1}{n}\sum_{i=1}^n h_i(X_i) = 0$ with probability one. Observe that $\lim_{i\to\infty} h_i(X_i) = 0$ almost surely, because $f_i$ converges pointwise to $f$, and the Cesàro averages of a sequence converging to $0$ converge to $0$ as well. Combining the two limits, $\lim_{n\to\infty} \frac{1}{n}\sum_{i=1}^n f_i(X_i) = \mathbb{E}[f(X)]$ with probability one. QED.
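The decomposition in this proof can also be watched numerically on a toy example (once more my own choice of $f_i$, not from the question): with $f(x) = x^2$, $f_i(x) = (1 - 1/i)x^2 - 1/i$ and $X_i \sim \mathcal{N}(0,1)$ i.i.d., the error term is $h_i(X_i) = X_i^2/i + 1/i \to 0$ almost surely, and its Cesàro mean should wash out of the running average.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200_000
x = rng.standard_normal(n)
i = np.arange(1, n + 1)

# f(x) = x^2 and f_i(x) = (1 - 1/i)*x^2 - 1/i: convex, f_i <= f, f_i -> f.
# Decomposition from the proof: f_i(X_i) = f(X_i) - h_i(X_i),
# where h_i(X_i) = f(X_i) - f_i(X_i) = X_i^2 / i + 1/i -> 0 a.s.
slln_part = np.cumsum(x**2) / i              # SLLN term: -> E[f(X)] = 1
h_part = np.cumsum(x**2 / i + 1.0 / i) / i   # Cesaro mean of h_i(X_i): -> 0
avg_f = slln_part - h_part                   # (1/n) * sum_{i<=n} f_i(X_i)
print(avg_f[-1], h_part[-1])
```

In this run the Cesàro mean of $h_i(X_i)$ is tiny and the running average of $f_i(X_i)$ sits near $1$, matching the two limits used in the proof; it says nothing, of course, about cases where the pointwise convergence of $f_i$ is much slower or non-uniform.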