Finding a case in which the dominated convergence theorem doesn't hold when ignoring its assumptions


An exercise in a measure theory textbook I own asks: give an example showing that the conclusion of the dominated convergence theorem (DCT) may cease to hold if we drop the domination assumption, even if we additionally assume that $\mu(X) < \infty$. I'm unsure how to find such an example; here is what I've got so far:

The domination assumption in the DCT states that there is an integrable $g: X \to \mathbb{R}$ such that $|f_n(x)| \le g(x)$ for every $n \in \mathbb{N}$; hence, without it, we could have $|f_n(x)| = \infty$ for some $n \in \mathbb{N}$. With this in mind, if I take $f_n(x) = \frac{n}{n-1}\,1_X(x)$ (for $n \ge 2$), then $\lim_{n\to\infty} \int_X f_n \, d\mu = \int_X 1_X \, d\mu$, which satisfies the theorem, since $\lim_{n\to\infty} f_n(x) = f(x) = 1_X(x)$. I'm confused about what I've done wrong and what I could do to produce the example the exercise is asking for. If I'm not on the right track here, where would be a better place to start? Any help will be greatly appreciated.
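A quick sanity check on this attempt (spelling out a step the question leaves implicit): the proposed sequence never actually drops the domination hypothesis. For $n \ge 2$,
$$|f_n(x)| = \tfrac{n}{n-1}\,1_X(x) \le 2 \cdot 1_X(x),$$
and $g = 2 \cdot 1_X$ is integrable precisely because $\mu(X) < \infty$, so the DCT still applies and its conclusion cannot fail for this choice of $f_n$.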


There is 1 best solution below

BEST ANSWER

The intuition for how you can fail to have convergence of integrals when you do have a.e. convergence is either "compressing an area down to a null set" or "moving an area to infinity". In either of these cases you can have a.e. convergence to zero without the integrals going to zero. The latter is impossible with a finite measure space, but the former is still possible.
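For concreteness, here is the standard instance of the "compression" picture; the particular space and sequence are my choice, since the answer leaves them implicit. Take $X = [0,1]$ with Lebesgue measure $\lambda$, so $\lambda(X) = 1 < \infty$, and set
$$f_n = n\,1_{(0,1/n)}, \qquad n \in \mathbb{N}.$$
Then $f_n(x) \to 0$ for every $x \in [0,1]$, yet
$$\int_X f_n \, d\lambda = n \cdot \lambda\big((0,\tfrac1n)\big) = n \cdot \tfrac1n = 1 \quad \text{for all } n,$$
so $\lim_{n\to\infty} \int_X f_n \, d\lambda = 1 \neq 0 = \int_X \lim_{n\to\infty} f_n \, d\lambda$. And no integrable dominating function can exist here: any $g$ with $g \ge f_n$ for all $n$ satisfies $g(x) \ge \sup_n f_n(x) \ge \frac{1}{2x}$ for $x \in (0, \tfrac12)$, and $\int_0^{1/2} \frac{dx}{2x} = \infty$.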