Can we get simultaneous convergence of the integral using simple functions?


Let $Y$ be an integrable nonnegative random variable on some probability space $X$, and let $F:[0,\infty) \to \mathbb R$ be a continuous function.

Suppose that $F(Y) \in L^1(X)$. (I am fine with assuming $Y$ is bounded as well.)

Do there always exist simple functions $Y_n \ge 0$ on $X$ such that

$$ E(Y)=\lim_{n \to \infty} E(Y_n) \qquad \text{and} \qquad E(F(Y))=\lim_{n \to \infty} E(F(Y_n)) $$ both hold simultaneously?

Taking $Y_n$ to be increasing, we get the first equality from the monotone convergence theorem. But if $F$ is not increasing, then $F(Y_n)$ won't necessarily be increasing.
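As a numerical sketch of the question (not a proof), one can check both limits empirically for the standard dyadic simple functions $Y_n = \min(2^{-n}\lfloor 2^n Y\rfloor, n)$. The concrete choices $Y \sim \mathrm{Exp}(1)$ and $F = \cos$ (continuous but not monotone) are assumptions for illustration only; here $E(Y) = 1$ and $E(\cos Y) = 1/2$.

```python
import numpy as np

rng = np.random.default_rng(0)
Y = rng.exponential(1.0, 200_000)  # Y ~ Exp(1): nonnegative, integrable
F = np.cos                         # continuous but not monotone

def dyadic_simple(Y, n):
    """Standard increasing simple-function approximation:
    Y_n = min(floor(2^n * Y) / 2^n, n), so 0 <= Y_n <= Y."""
    return np.minimum(np.floor(Y * 2**n) / 2**n, n)

for n in (1, 2, 4, 8):
    Yn = dyadic_simple(Y, n)
    # Monte Carlo estimates of E[Y_n] and E[F(Y_n)]; they approach
    # E[Y] = 1 and E[cos Y] = 1/2 as n grows.
    print(n, Yn.mean(), F(Yn).mean())
```

Even though $F(Y_n)$ is not monotone in $n$, both expectations converge in this example, which is what the two answers below establish in general.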

There are 2 answers below.

BEST ANSWER

Under the assumption that $Y$ is bounded: there exists a positive real $M$ such that $0\le Y \le M$ on $X.$ Let $Y_n$ be the usual simple functions approximating $Y.$ We then have $0\le Y_n\le M$ on all of $X.$ Furthermore, $Y_n\to Y$ uniformly on $X;$ recall that this holds precisely because $Y$ is bounded. Now $F$ is uniformly continuous on $[0,M],$ and from this it follows that $F\circ Y_n\to F\circ Y$ uniformly on $X.$ Since the measure on $X$ is finite, uniform convergence implies convergence of the integrals, and the result follows.
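A quick numerical check of the two uniform-convergence claims, under assumed concrete choices not in the original: $M = 3$, $Y$ uniform on $[0,M]$, and $F(y) = \sin(y^2)$, which is continuous but not monotone. Once $n \ge M$ the cap in $Y_n = \min(2^{-n}\lfloor 2^n Y\rfloor, n)$ is inactive and $\sup|Y - Y_n| \le 2^{-n}$; since $F$ is Lipschitz on $[0,M]$ (constant at most $2M = 6$ here), $\sup|F(Y) - F(Y_n)|$ shrinks at the same rate.

```python
import numpy as np

rng = np.random.default_rng(1)
M = 3.0
Y = M * rng.random(100_000)        # bounded: 0 <= Y <= M

def F(y):
    return np.sin(y * y)           # continuous, not monotone on [0, M]

for n in (2, 4, 8, 12):
    Yn = np.minimum(np.floor(Y * 2**n) / 2**n, n)
    # For n >= M the cap min(., n) never triggers, so the sup-norm
    # error of Y_n is below 2^{-n}; F's Lipschitz bound then controls
    # the sup-norm error of F(Y_n).
    print(n, np.abs(Y - Yn).max(), np.abs(F(Y) - F(Yn)).max())
```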

SECOND ANSWER

You don't need $Y$ to be bounded. First, we can assume WLOG that $F$ is non-negative and $F(0)=0$. Let $(\varepsilon_n)$ be a sequence decreasing to $0$ and $A_n := \{Y \le n \}$. Since $F$ is uniformly continuous on $[0,n]$ for every $n$, there exists $\delta_n \in (0, \varepsilon_n]$ (so in particular $\delta_n \to 0$) such that if $|x-y| < \delta_n$ and $0 \le x,y \le n$, then $|F(x)-F(y)| < \varepsilon_n$.

Take $Y_n \ge 0$ to be a simple function such that $Y_n \le Y$, $Y_n = 0$ on $A_n^c$, and $|Y-Y_n| < \delta_n$ on $A_n$ (for instance, $Y_n = 2^{-k}\lfloor 2^{k} Y\rfloor \mathbf{1}_{A_n}$ with $2^{-k} < \delta_n$). Then we have

$$ 0 \le \mathbb{E}[Y-Y_n] = \mathbb{E}[(Y - Y_n)1_{A_n}] + \mathbb{E}[Y1_{A_n^c}] < \delta_n + \mathbb{E}[Y1_{A_n^c}]. $$

The first term goes to $0$ by our choice of $\delta_n$, and the second term goes to $0$ by the monotone convergence theorem, so $(Y_n) \rightarrow Y$ in $L^1$. Similarly, we compute

$$\mathbb{E}[|F(Y)-F(Y_n)|] = \mathbb{E}[|F(Y)-F(Y_n)|1_{A_n}] + \mathbb{E}[|F(Y)-F(0)|1_{A_n^c}] < \varepsilon_n + \mathbb{E}[|F(Y)|1_{A_n^c}]$$

and again the first term goes to $0$ by our choice of $\varepsilon_n$, while the second term goes to $0$ by the monotone convergence theorem (since $|F(Y)|\mathbf{1}_{A_n} \uparrow |F(Y)| \in L^1$, the tail $\mathbb{E}[|F(Y)|1_{A_n^c}]$ vanishes).
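The truncated construction above can also be checked numerically. The concrete choices here are assumptions for illustration: $Y \sim \mathrm{Exp}(1)$ (unbounded but integrable) and $F(y) = \cos(y) - 1$, which is continuous with $F(0) = 0$ as the answer requires.

```python
import numpy as np

rng = np.random.default_rng(2)
Y = rng.exponential(1.0, 200_000)  # unbounded but integrable

def F(y):
    return np.cos(y) - 1.0         # continuous with F(0) = 0

for n in (1, 2, 4, 8):
    An = Y <= n                    # A_n = {Y <= n}
    # Y_n: dyadic approximation on A_n (mesh 2^{-n} plays the role of
    # delta_n), zero off A_n, and Y_n <= Y everywhere.
    Yn = np.where(An, np.floor(Y * 2**n) / 2**n, 0.0)
    # Empirical L^1 errors E|Y - Y_n| and E|F(Y) - F(Y_n)|: the on-A_n
    # part shrinks by uniform continuity, the tail because
    # E[Y 1_{A_n^c}] and E[|F(Y)| 1_{A_n^c}] both vanish.
    print(n, np.abs(Y - Yn).mean(), np.abs(F(Y) - F(Yn)).mean())
```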