If $m(A)=0$, does this imply that $m(\log(A)) = 0$ when $A\subset \mathbb{R}_{+}$?


The question is as follows: Let $A\subset \mathbb{R}_{+}$ and let $\log(A)=\{\log(t): t\in A\}$. (a) If $m(A)=0$, is it true that $m(\log(A)) = 0$? (b) If $m(A) < \infty$, is it true that $m(\log(A)) < \infty$? Here $m$ denotes the Lebesgue measure.

For part (a), I think it is true. Substituting $u = e^t$ (so $du = e^t\,dt = u\,dt$), \begin{align*} m(\log(A)) &= \int 1_{\log(A)}(t)\, dt \\ &= \int 1_A(e^t)\, dt \\ &= \int \frac{1_A(u)}{u}\, du \\ &= \int_A \frac{1}{u}\, du = 0, \end{align*} since $A \subset \mathbb{R}_{+}$ (so $1/u$ is defined and nonnegative on $A$) and the integral of a nonnegative measurable function over a set of measure zero vanishes.
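As a numerical sanity check of the change-of-variables identity $m(\log(A)) = \int_A \frac{1}{u}\,du$ (not a proof, and the sample intervals below are an arbitrary choice), here is a small Python sketch comparing the two sides for a finite union of disjoint intervals:

```python
import math

def measure_log_image(intervals):
    """m(log(A)) for A a finite union of disjoint intervals in (0, inf).

    log is strictly increasing, so it maps (a, b) to (log a, log b)
    and keeps disjoint intervals disjoint.
    """
    return sum(math.log(b) - math.log(a) for a, b in intervals)

def integral_one_over_u(intervals, n=100_000):
    """Midpoint Riemann sum approximating the integral of 1/u over A."""
    total = 0.0
    for a, b in intervals:
        h = (b - a) / n
        total += sum(h / (a + (k + 0.5) * h) for k in range(n))
    return total

# An arbitrary sample set A, given as disjoint intervals in (0, inf).
A = [(0.5, 1.0), (2.0, 3.0), (10.0, 11.0)]
print(measure_log_image(A), integral_one_over_u(A))  # the two agree
```

Of course this only checks the identity on sets of positive measure; the point of part (a) is that the right-hand side is $0$ whenever $m(A)=0$.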

Part (b) I think is false. My counterexample is the set

$$ A = \bigcup_{n\geq 1} (\frac{n}{2^n}, \frac{n+1}{2^n}) $$

which has $m(A) \leq \sum_{n\geq 1} 2^{-n} = 1 < \infty$ (in fact strictly less, since the intervals overlap: the $n=2$ interval sits inside the $n=1$ interval). On the other hand, $\log$ maps the $n$-th interval to $(\log n - n\log 2,\ \log(n+1) - n\log 2)$, which has length $\log(1+\tfrac{1}{n})$, and for $n \geq 2$ these images are pairwise disjoint. Hence $m(\log(A)) \geq \sum_{n\geq 2} \log(1+\tfrac{1}{n}) = \infty$, since the series diverges.
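As a quick numerical illustration (taking the disjointness of the log-images for $n \geq 2$ as given), the partial sums of the lengths telescope to $\log\frac{N+1}{2}$ and grow without bound:

```python
import math

# For A = union over n >= 1 of (n/2^n, (n+1)/2^n), log maps the n-th
# interval to (log n - n log 2, log(n+1) - n log 2), of length
# log(1 + 1/n). For n >= 2 these images are pairwise disjoint, so
# m(log(A)) dominates the partial sums below, which telescope to
# log((N+1)/2) and hence diverge.
def partial_measure(N):
    """Sum of the lengths of the log-images for n = 2, ..., N."""
    return sum(math.log(1 + 1 / n) for n in range(2, N + 1))

print(partial_measure(10))     # log(11/2), about 1.70
print(partial_measure(10**6))  # about 13.1, still growing
```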