Any hint for this measure theory problem from Halmos?


I was reading Halmos book 'Measure Theory' and I'm really stuck with this one. Could anyone please give me a hint?

Let $A\subseteq\mathbb{R}$ be a Lebesgue measurable set and $D$ a dense subset of $A$. Suppose $\mu(A\,\triangle\,(A+d)) = 0$ for all $d\in D$, where $\triangle$ denotes symmetric difference and $A+d$ is the translate $\{a + d : a \in A\}$. Show that $\mu(A) = 0$ or $\mu(A^c) = 0$.

I've tried thinking of $A$ as an open interval and using properties of Lebesgue measure, and I also tried applying the Lebesgue Density Theorem, without success.


Answer 1 (score 7)

Suppose both $A$ and $A^c$ have positive measure. By the Lebesgue Density Theorem, choose $x$, $y$ and $r > 0$ such that $A$ fills at least $99$% of the measure of $(x - r, x + r)$ and $A^c$ fills at least $99$% of the measure of $(y - r, y + r)$. Now choose $d \in D$ such that $|x + d - y| < r/100$. Then $A + d$ fills most of an interval around $y$ on which $A$ is mostly absent, so $(A + d) \setminus A$ has positive measure, contradicting $\mu(A\,\triangle\,(A+d)) = 0$.
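As a numerical sanity check on the claim (not a proof), here is a small sketch: for $A = [0,1]$, which is neither null nor conull, the symmetric difference $A\,\triangle\,(A+d)$ has measure about $2|d| > 0$ for small $d > 0$. The grid window and step size below are arbitrary choices for the illustration.

```python
import numpy as np

# Grid on a window containing both A = [0, 1] and its small translates.
xs = np.linspace(-2, 3, 500001)
dx = xs[1] - xs[0]

def measure_sym_diff(d):
    """Riemann-sum approximation of mu(A symdiff (A + d)) for A = [0, 1]."""
    in_A = (xs >= 0) & (xs <= 1)        # indicator of A
    in_Ad = (xs >= d) & (xs <= 1 + d)   # indicator of A + d
    return np.sum(in_A ^ in_Ad) * dx    # points in exactly one of the two sets

print(measure_sym_diff(0.1))  # ≈ 0.2, i.e. 2|d|: translation is detected
print(measure_sym_diff(0.0))  # 0.0: the trivial translate agrees exactly
```

This is exactly the phenomenon the argument above exploits: unless $A$ is null or conull, some translate $A + d$ must stick out of $A$ on a set of positive measure.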

Answer 2 (score 0)

All equalities between sets below are modulo null sets. Suppose $\mu(A) > 0$ and let $G = \{d : A + d = A\}$. Then $G$ is a closed additive subgroup of the reals, so if it is proper it must be discrete, and hence cannot contain a dense set such as $D$.
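The dichotomy used here is the standard classification of closed subgroups of the line (a fact assumed, not proved, in the answer), which can be sketched as:

```latex
% Classification of closed additive subgroups of (R, +):
% either G is all of R, or G is discrete, i.e. G = aZ for some a >= 0.
G \le (\mathbb{R}, +) \text{ closed}
  \;\Longrightarrow\;
  G = \mathbb{R} \quad\text{or}\quad G = a\mathbb{Z} \text{ for some } a \ge 0.
% Sketch: let a = inf { g in G : g > 0 }.  If a = 0, then G has elements
% arbitrarily close to 0, so G is dense; being closed, G = R.
% If a > 0, one checks a in G (G is closed) and G = aZ (division with
% remainder: any g in G lies in [na, (n+1)a), and g - na in G forces
% g = na by minimality of a).
```

So once $G$ is proper it is of the form $a\mathbb{Z}$, which is nowhere dense, ruling out density of $D$.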