Probabilistic Proof That An Absolutely Continuous Function is Differentiable Almost Everywhere


Consider the probability space $([0,1), \mathcal{B}, \lambda)$ where $\mathcal{B}$ is the Borel $\sigma$-algebra and $\lambda$ is the uniform measure. Let $A_{i,n} = [(i-1)2^{-n}, i2^{-n})$ for $i \in \{1, 2, \dots, 2^n\}$ and $n \in \{0, 1, 2, \dots\}$. Let $\mathcal{F}_n = \sigma(A_{i,n} : i \in \{1, 2, \dots, 2^n\})$. Let $x(\cdot)$ be an absolutely continuous function on the unit interval and let $h_{i,n} = 2^n(x(i2^{-n}) - x((i-1)2^{-n}))$. Define $X_n = \sum_{i=1}^{2^n} h_{i,n} \mathbb{1}_{A_{i,n}}$.
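As a quick numerical illustration of the construction (a sketch, not part of the exercise; the concrete choice $x(t) = t^2$ is my own), $X_n$ is simply the dyadic difference quotient of $x$ over the interval containing $t$, and for differentiable $x$ it converges to $x'(t)$:

```python
import numpy as np

def X_n(x, n, t):
    """Evaluate X_n at t in [0, 1): X_n is constant on each dyadic interval
    A_{i,n} = [(i-1)2^{-n}, i2^{-n}), with value
    h_{i,n} = 2^n (x(i 2^{-n}) - x((i-1) 2^{-n}))."""
    i = int(np.floor(t * 2**n)) + 1   # index of the dyadic interval containing t
    return 2**n * (x(i * 2.0**-n) - x((i - 1) * 2.0**-n))

x = lambda t: t**2                    # absolutely continuous, with x'(t) = 2t
vals = [X_n(x, n, 0.3) for n in (2, 5, 10, 15)]
# vals tends to x'(0.3) = 0.6 as n grows
```

For this smooth choice of $x$ the convergence is visible already for moderate $n$; the point of the exercise is that mere absolute continuity suffices for a.e. convergence.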

I have already shown that $\{X_n\}$ is a $\mathcal{F}_n$-martingale. The next steps are:

(i) Show that $\{X_n\}$ is uniformly integrable.

(ii) There exists an integrable $h:[0,1) \rightarrow \mathbb{R}$ such that $x(t) - x(s) = \int_s^t h(u)du$ for all $0 \leq s \leq t < 1$.

(iii) Using the Lebesgue differentiation theorem, conclude that $\frac{dx}{dt} = h$ a.e.

I know that once I have (i), $X_n$ will converge a.s. and in $L^1$ to $X_\infty$. I can use this to get (ii) and then use (ii) to get (iii). I know it is simple, but I'm stuck on (i).


Accepted answer:

For brevity, write $\Delta_{i,n} := x(i 2^{-n})-x((i-1) 2^{-n})$. Since $x$ is absolutely continuous, it is in particular of bounded variation, and therefore

$$c := \sup_{n \in \mathbb{N}} \sum_{i=1}^{2^n} |\Delta_{i,n}| < \infty.$$
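To spell this out: each dyadic partition is one particular partition of the unit interval, so its variation sum is dominated by the total variation $V(x)$ of $x$, which is finite because absolutely continuous functions have bounded variation:

$$\sum_{i=1}^{2^n} |\Delta_{i,n}| \leq \sup_{0 = u_0 < u_1 < \dots < u_m = 1} \sum_{j=1}^{m} |x(u_j) - x(u_{j-1})| = V(x) < \infty,$$

so indeed $c \leq V(x)$.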

In order to prove uniform integrability, we have to show that, for any given $\varepsilon > 0$,

$$\sup_{n \in \mathbb{N}} \mathbb{E}(|X_n| \cdot 1_{|X_n|>r}) \leq \varepsilon \tag{1}$$

for $r=r(\varepsilon)$ sufficiently large. Since $|X_n| = 2^n |\Delta_{i,n}|$ on $A_{i,n}$ and $\lambda(A_{i,n}) = 2^{-n}$,

$$ \mathbb{E}(|X_n| \cdot 1_{|X_n|>r}) = \sum_{i: 2^n |\Delta_{i,n}|>r} 2^n |\Delta_{i,n}| \cdot 2^{-n} = \sum_{i: 2^n |\Delta_{i,n}|>r} |\Delta_{i,n}|. \tag{2}$$

On the other hand, each index $i$ with $2^n |\Delta_{i,n}| > r$ contributes more than $r 2^{-n}$ to $\sum_{i=1}^{2^n} |\Delta_{i,n}|$, so a Markov-type counting argument gives

$$2^{-n} \text{card}\{i: 2^n |\Delta_{i,n}| > r\} \leq r^{-1} \sum_{i=1}^{2^n} |\Delta_{i,n}| \leq r^{-1} c. \tag{3}$$

Since $x$ is absolutely continuous, for the given $\varepsilon > 0$ there exists $\delta>0$ such that for every finite collection of disjoint intervals $(s_0,t_0), \dots, (s_k,t_k)$ with $0 \leq s_0 < t_0 \leq s_1 < t_1 \leq \ldots \leq s_k < t_k \leq 1$,

$$\sum_{i=0}^k (t_i-s_i) < \delta \Rightarrow \sum_{i=0}^k |x(t_i)-x(s_i)| < \varepsilon. \tag{4}$$

If we set $t_{i,n} := \frac{i}{2^n}$, then $(3)$ shows

$$\sum_{i: 2^n |\Delta_{i,n}|>r} |t_{i,n}-t_{i-1,n}| = 2^{-n} \text{card} \{i: 2^n |\Delta_{i,n}|>r\} \leq \frac{c}{r} < \delta$$

for $r > r_0:= \frac{c}{\delta}$. The intervals $(t_{i-1,n}, t_{i,n})$ appearing in this sum are disjoint, so $(4)$ applies and yields

$$\mathbb{E}(|X_n| \cdot 1_{|X_n|>r}) =\sum_{i: 2^n |\Delta_{i,n}|>r} |\Delta_{i,n}| < \varepsilon$$

for all $n$ and all $r > r_0$. As $r_0$ does not depend on $n$, this proves $(1)$ and finishes the proof.
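As a numerical sanity check (an illustrative sketch; the test function $x(t)=\sqrt{t}$ is my own choice, picked because its derivative blows up at $0$, so the $X_n$ are unbounded yet, by the above, uniformly integrable): for a fixed large $r$, the tail expectation $\mathbb{E}(|X_n| \cdot 1_{|X_n|>r})$ stays small uniformly in $n$.

```python
import numpy as np

def tail_expectation(x, n, r):
    """E(|X_n| 1_{|X_n| > r}) = sum of |Delta_{i,n}| over the indices i
    with 2^n |Delta_{i,n}| > r, as in identity (2)."""
    pts = np.linspace(0.0, 1.0, 2**n + 1)  # dyadic grid t_{i,n} = i / 2^n
    deltas = np.abs(np.diff(x(pts)))       # |Delta_{i,n}|, i = 1, ..., 2^n
    return deltas[2**n * deltas > r].sum()

x = np.sqrt                                # absolutely continuous, x' unbounded near 0
r = 50.0
tails = [tail_expectation(x, n, r) for n in range(1, 18)]
# sup over n stays small: only the few dyadic intervals near 0 contribute
```

Raising $r$ pushes $\sup_n$ of these tails toward $0$, which is exactly statement $(1)$.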