Integration/measure theory "paradox"?

I have encountered the following "paradox." Consider a dense countable subset of $\mathbb{R}$, e.g. $\mathbb{Q}$. Because the set is countable we may parametrise it by $\mathbb{Q} = \{ a_n \}_{n=1}^\infty$. Then consider the function (for some $\epsilon >0$) $$\sum_{n=1}^\infty \chi_{[a_n, a_n + \epsilon/2^n)} $$ where $\chi$ is the indicator function. Because the set $\mathbb{Q}$ is dense, this function converges to infinity everywhere. But its integral according to Lebesgue measure is $$\int_{\mathbb{R}} \sum_{n=1}^\infty \chi_{[a_n, a_n + \epsilon/2^n)} d\mu = \sum_{n=1}^\infty \int_{\mathbb{R}} \chi_{[a_n, a_n + \epsilon/2^n)} d\mu = \sum_{n=1}^\infty \frac{\epsilon}{2^n} = \epsilon$$ where we have commuted summation and integral using B. Levi's theorem on monotone convergence. Where is my mistake?

EDIT: Soon after posting this I realised that my intuition that the function converges to infinity everywhere is wrong. Indeed, the tail intervals satisfy $\mu\left(\bigcup_{n\geq N} [a_n, a_n + \epsilon/2^n)\right) \leq \epsilon/2^{N-1} \to 0$, so by the Borel–Cantelli lemma the set of points lying in infinitely many of the intervals is a null set, and the sum is finite almost everywhere.
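A quick numerical sketch of this resolution (the enumeration of $\mathbb{Q}\cap[0,1]$ by increasing denominator and the choice $\epsilon = 0.1$ are illustrative assumptions, not from the question): merging the first intervals of the construction shows their union has total length well below $\epsilon$, so most of $[0,1]$ is never covered even though the $a_n$ are dense.

```python
from fractions import Fraction

def rationals_01(max_den):
    """Enumerate the distinct rationals in [0, 1] by increasing denominator."""
    seen, out = set(), []
    for q in range(1, max_den + 1):
        for p in range(0, q + 1):
            f = Fraction(p, q)
            if f not in seen:
                seen.add(f)
                out.append(float(f))
    return out

def covered_length(eps, max_den):
    """Length of the union of the intervals [a_n, a_n + eps/2^n)."""
    a = rationals_01(max_den)
    # n-th interval has length eps / 2^(n+1), n = 0, 1, ..., summing to <= eps
    intervals = sorted((x, x + eps / 2 ** (n + 1)) for n, x in enumerate(a))
    total, cur_lo, cur_hi = 0.0, None, None
    for lo, hi in intervals:
        if cur_hi is None or lo > cur_hi:  # disjoint from current run: close it
            if cur_hi is not None:
                total += cur_hi - cur_lo
            cur_lo, cur_hi = lo, hi
        else:                              # overlapping: extend current run
            cur_hi = max(cur_hi, hi)
    if cur_hi is not None:
        total += cur_hi - cur_lo
    return total

# The union of ALL the intervals has measure <= eps = 0.1, so at least
# 90% of [0, 1] is never covered, despite {a_n} being dense in [0, 1].
print(covered_length(0.1, 50))
```

Adding more rationals only nudges the covered length upward toward a limit that is still at most $\epsilon$; it never approaches full measure.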


As another fun example, you might consider a function like $$x\mapsto \sum_{n\in \mathbb N} \frac{\varepsilon_n}{\sqrt{|x-a_n|}}.$$ One can easily find a sequence $(\varepsilon_n)_{n\in\mathbb N}$ of positive numbers such that this sum converges... in $L^1$ (on a bounded interval, say, since each individual term fails to be integrable over all of $\mathbb{R}$), even though every term blows up at the corresponding $a_n$.
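A sketch of one such choice (restricting to $[0,1]$ with $\{a_n\}$ an enumeration of $\mathbb{Q}\cap[0,1]$, and taking $\varepsilon_n = 2^{-n}$; both are assumptions not spelled out in the answer): for any $a\in[0,1]$, $$\int_0^1 \frac{dx}{\sqrt{|x-a|}} = 2\sqrt{a} + 2\sqrt{1-a} \leq 2\sqrt{2},$$ so by monotone convergence $$\int_0^1 \sum_{n\in\mathbb N} \frac{\varepsilon_n}{\sqrt{|x-a_n|}}\, dx \leq 2\sqrt{2}\sum_{n\in\mathbb N} 2^{-n} < \infty.$$ Hence the sum is finite almost everywhere and defines an $L^1$ function on $[0,1]$, despite being unbounded in every subinterval.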