I’m going through a proof for Lebesgue’s Monotone Convergence Theorem.
It makes sense to me why $$\lim_{n \to \infty} \int f_n d\mu = \sup_{n\in \mathbb{N}}\int f_n d\mu \leq \int fd\mu$$ My problem lies in the opposite inequality. My textbook states that this is equivalent to showing $$I_{\mu}(s)\leq \sup_{n \in \mathbb{N}}\int f_n d\mu \quad \forall s \in SM(\mathcal{E})^+ : s \leq f$$ and that it is enough to show $$\alpha I_\mu(s) \leq \sup_{n \in \mathbb{N}}\int f_n d\mu$$ for any $\alpha \in (0,1)$. This part confuses me: why is it enough to look at $\alpha I_\mu(s)$?
Here, $SM(\mathcal{E})^+$ denotes the set of positive simple measurable functions.
If $x$ and $y$ are nonnegative real numbers, then $x\leq y$ iff $\alpha x\leq y$ for all $\alpha\in (0,1)$. One direction is immediate, since $\alpha x\leq x\leq y$. For the other, suppose $x\leq y$ is false. Then $x>y\geq 0$, so $x>0$ and $\alpha x>y$ whenever $\alpha>y/x$; since $y/x<1$, we can pick such an $\alpha$ in $(0,1)$, so $\alpha x\leq y$ fails for that $\alpha$.
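Applied to your setting with $x = I_\mu(s)$ and $y = \sup_n \int f_n\,d\mu$ (both nonnegative), the lemma can be summarized in one line; equivalently, one can let $\alpha \to 1^-$:

```latex
\alpha I_\mu(s) \leq \sup_{n \in \mathbb{N}} \int f_n \, d\mu
\quad \forall \alpha \in (0,1)
\;\Longrightarrow\;
I_\mu(s) = \sup_{\alpha \in (0,1)} \alpha I_\mu(s)
\leq \sup_{n \in \mathbb{N}} \int f_n \, d\mu.
```

Taking the supremum over all simple $s \leq f$ then gives $\int f \, d\mu \leq \sup_n \int f_n \, d\mu$, which is the inequality you wanted.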