Given a probability space $(\Omega, \mathcal{F}, P)$ and a continuous random variable $X: \Omega \to I$, where $I$ is an interval of $\mathbb{R}$, I'm trying to show that the expected value $$E[X] = \int\limits_{\Omega} X \, dP$$ must lie in $I$.
This seems trivial, but I'm struggling with the case where $I = \left\{ x : a < x < b \right\}$ (i.e. with strict inequalities). While $$a \leq \int\limits_{\Omega} X \, dP \leq b$$ is obvious, I still cannot figure out how to prove $$a < \int\limits_{\Omega} X \, dP < b.$$
Since the events $(X\leq b-\tfrac{1}{n})$ increase to $(X<b)=\Omega$, continuity of measures from below gives $P(X\leq b-\tfrac{1}{n})\to 1$, so there must exist some $c<b$ such that $P(X\leq c)\geq \frac{1}{2}$.
Split $\Omega$ into the events $(X\leq c)$ and $(X>c)$ and integrate over each of them separately. You can bound $X$ by $c$ on the first integral and by $b$ on the second, so using $P(X\leq c)\geq\frac{1}{2}$ and $c<b$ we see:
$$E[X]\leq P(X\leq c)\,c+ P(X>c)\,b = b - P(X\leq c)(b-c) \leq b - \tfrac{1}{2}(b-c) = \frac{c+b}{2}<b.$$
You can use an entirely similar argument, with some $c'>a$ satisfying $P(X\geq c')\geq\frac{1}{2}$, to show that $E[X]>a$.
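As a quick numerical sanity check of the splitting bound (not part of the proof), here is a sketch with the illustrative choice $(a,b)=(0,1)$ and $X\sim\mathrm{Beta}(2,5)$; the distribution and sample size are my own assumptions, picked only to make the inequalities visible on the empirical distribution:

```python
# Sanity check of E[X] <= P(X<=c)*c + P(X>c)*b <= (c+b)/2 < b,
# with the illustrative (assumed) choice (a, b) = (0, 1), X ~ Beta(2, 5).
import random

random.seed(0)
a, b = 0.0, 1.0
samples = [random.betavariate(2, 5) for _ in range(100_000)]

mean = sum(samples) / len(samples)

# Pick c < b with P(X <= c) >= 1/2: the empirical median works.
c = sorted(samples)[len(samples) // 2]
p = sum(1 for x in samples if x <= c) / len(samples)  # p >= 1/2

bound = p * c + (1 - p) * b       # the splitting bound from the answer
assert a < mean < b               # the conclusion being proved
assert mean <= bound <= (c + b) / 2 < b   # the chain of inequalities
```

Both assertions hold for the empirical distribution for the same reason as in the proof: each sample at most $c$ contributes at most $c$, each remaining sample contributes less than $b$, and $p\geq\frac{1}{2}$ forces the bound below $\frac{c+b}{2}$.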