I've been trying to work through the following problem.
Let $\mu$ be a measure defined on $(\mathbb{R},\mathscr{L})$, where $\mathscr{L}$ is the $\sigma$-algebra of all Lebesgue measurable subsets of $\mathbb{R}$. Suppose that there exists a $K<\infty$ such that $\int_{\mathbb{R}}e^{nx}~d\mu(x)\leq K$ for each $n\in\mathbb{Z}^{+}$. Prove that $\mu((0,\infty))=0$.
I'm really not sure what I should be aiming for here. I was thinking of defining the sequence of functions $f_{n}(x)=e^{nx}$ and studying the integrals $\int_{\mathbb{R}}f_{n}~d\mu$ for each $n\in\mathbb{Z}^{+}$, and possibly seeing where one of the convergence theorems takes me. However, I don't see how this would recover the measure of the given interval. Any help is greatly appreciated!
The key idea here is that $e^{nx} \to \infty$ as $n \to \infty$ for any fixed $x > 0$, and does so uniformly on any interval bounded away from $0$.
So if we consider an interval $I = [a, b]$ with $a > 0$, we get $$K \ge \int_{\mathbb{R}} e^{nx} \, d\mu(x) \ge \int_I e^{nx} \, d\mu(x) \ge e^{an} \mu(I)$$
for every $n$. What can you conclude from this?
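If it helps, here is one way to finish from that estimate (this just spells out the hint; the only extra ingredient is countable subadditivity). Since $a > 0$, the bound $\mu(I) \le K e^{-an}$ forces $\mu(I) = 0$ on letting $n \to \infty$, so $\mu([a,b]) = 0$ whenever $0 < a \le b$. Then
$$\mu\big((0,\infty)\big) = \mu\left(\bigcup_{k=1}^{\infty}\left[\tfrac{1}{k},\, k\right]\right) \le \sum_{k=1}^{\infty} \mu\left(\left[\tfrac{1}{k},\, k\right]\right) = 0.$$
(Continuity from below works equally well here, since the intervals $[\tfrac{1}{k}, k]$ increase to $(0,\infty)$.)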