Let $f, g$ be Riemann-integrable functions on $[0,1]$.
Suppose that for all $a < b$ with $a, b \in [0, 1]$ there exist $t_1, t_2 \in [a, b]$ satisfying $g(t_2) \leq f(t_1)$.
Prove that $\int_0^1 g(x)dx \leq \int_0^1 f(x)dx.$
If, by way of contradiction, we have $\int_0^1 f(x)\,dx < \int_0^1 g(x)\,dx$, then I'd expect, by Darboux's definition of the integral (which we've already proved to be equivalent to Riemann's), that there exists some interval $[a, b] \subseteq [0, 1]$ with
$$\sup_{x \in [a,b]} f(x) < \inf_{x \in [a,b]} g(x),$$ and this would contradict the existence of such $t_1, t_2$.
My main struggle, assuming this intuition is correct, is expressing it rigorously using the definitions. I'd appreciate some guidance, thanks.
For each natural number $n$, divide $[0,1]$ into $n$ equal subintervals. For each $1\leq k\leq n$, by hypothesis there exist $t^n_k, s^n_k\in \left[\frac{k-1}{n},\frac{k}{n}\right]$ such that $g(s^n_k)\leq f(t^n_k)$; therefore $$\sum_{k=1}^n\frac{1}{n}g(s^n_k)\leq \sum_{k=1}^n\frac{1}{n}f(t^n_k).$$ Both sides are Riemann sums over tagged partitions whose mesh tends to $0$, so letting $n\to\infty$ gives $\int_0^1 g(x)\,dx \leq \int_0^1 f(x)\,dx$, as desired (this is exactly the definition of the Riemann integral).