Differentiation of an infinite series of functions


I want to show:

If

$$f(x) :=\sum \limits_{n=0}^{\infty} g_n(x)$$

where the series converges for every $x \in [a,b]$ and each $g_n$ is a nondecreasing function on $[a,b]$, then

$$f'(x) = \sum \limits_{n=0}^{\infty} g_n'(x)$$ almost everywhere.

Thoughts:

Since the $g_n$'s are nondecreasing, $f$ is monotone, and by Lebesgue's theorem on the differentiability of monotone functions, $f$ is differentiable a.e. on $[a,b]$. The nondecreasing property is crucial; without it I can state counterexamples.
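
For concreteness, one classical example of such a failure is Weierstrass's nowhere-differentiable function (a standard counterexample; stated here with the classical parameter condition, which Hardy later weakened to $ab \ge 1$):

```latex
% A uniformly convergent series of smooth (but non-monotone) terms:
\[
  f(x) = \sum_{n=0}^{\infty} a^{n}\cos(b^{n}\pi x),
  \qquad 0 < a < 1,\ b \text{ an odd integer},\ ab > 1 + \tfrac{3\pi}{2},
\]
% Here f is nowhere differentiable, so f' = \sum_n g_n' holds on no set
% at all, let alone almost everywhere.
```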

The statement is obviously true in the finite case. However, I couldn't pass to the limit; I am totally stuck here. Could you please help me with this one?

Best answer:

This is Fubini's theorem on differentiation.

Hint: Let $S_n(x) = \sum_{i=0}^n g_i(x)$. Except on a null set, we have $$ S'_{n-1}(x) \leq S'_n(x) \leq f'(x), $$ because $S_n - S_{n-1} = g_n$ and $f - S_n = \sum_{i > n} g_i$ are both nondecreasing (the latter as a pointwise limit of nondecreasing partial sums), so their derivatives are nonnegative a.e. This implies that $\lim_{n\rightarrow \infty} S'_n(x) = \sum_{n=0}^\infty g'_n(x)$ converges almost everywhere, with limit $\leq f'(x)$. Now try to construct a subsequence $S'_{n_k}(x)$ converging to $f'(x)$, and it's done.

You may need to use monotonicity multiple times in the proof.
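
Expanding the hint, here is a sketch of the usual way to choose the subsequence (the standard proof of Fubini's differentiation theorem; the bound $2^{-k}$ is just one convenient choice):

```latex
% Step 0 (normalisation): replace g_n by g_n - g_n(a). This changes no
% derivative, keeps every g_n nondecreasing, and makes g_n >= 0, hence
% 0 <= S_n <= f on [a,b].
%
% Step 1: since S_n(b) -> f(b), choose indices n_1 < n_2 < ... with
\[
  f(b) - S_{n_k}(b) < 2^{-k}.
\]
% Step 2: each h_k := f - S_{n_k} is nondecreasing and nonnegative, so
% for every x in [a,b]
\[
  0 \le h_k(x) \le h_k(b) < 2^{-k},
  \qquad\text{hence}\qquad
  \sum_{k=1}^{\infty} h_k(x) \ \text{converges on } [a,b].
\]
% Step 3: apply the a.e.-convergence argument from the hint to the
% convergent series \sum_k h_k of nondecreasing functions: the series
% of termwise derivatives converges a.e., so its terms tend to zero,
% i.e. h_k'(x) = f'(x) - S_{n_k}'(x) -> 0 for a.e. x, and therefore
\[
  \lim_{k\to\infty} S_{n_k}'(x) = f'(x) \quad \text{a.e.}
\]
% Combined with the monotone convergence of S_n'(x), the full sequence
% S_n'(x) converges to f'(x) almost everywhere.
```

Note how monotonicity is used twice: once to get $S'_{n-1} \le S'_n \le f'$ a.e., and again when the same a.e.-convergence argument is applied to the series $\sum_k h_k$.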