If $f(a) < g(a)$ for some $a$, then there exists $\delta > 0$ such that $f(x) \leq g(x)$ for every $x \in (a-\delta, a+\delta)$.


If $f,g:\mathbb{R} \to \mathbb{R}$ are continuous, $a \in \mathbb{R}$ and $f(a) < g(a)$, then there exists $\delta > 0$ such that $f(x) \leq g(x)$ for every $x \in (a-\delta, a+\delta)$. I'm studying real analysis and I'm not sure how to apply the definition of continuity to solve this problem. Any hint would be appreciated.


BEST ANSWER

WLOG you can assume that $f\equiv0$ (if not, take $G(x)=g(x)-f(x)$).
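To spell out why this reduction is harmless (a one-line check, using only that a difference of continuous functions is continuous):

$$G(a)=g(a)-f(a)>0, \qquad\text{and}\qquad f(x)\le g(x)\iff G(x)\ge 0,$$

so it suffices to find $\delta>0$ with $G>0$ on $(a-\delta,a+\delta)$ for a continuous $G$ satisfying $G(a)>0$, which is exactly the case $f\equiv 0$.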

Applying the continuity of $g$ at $a$: given $\varepsilon=g(a)>0$, there is $\delta>0$ such that $|g(x)-g(a)|<\varepsilon=g(a)$ for all $x\in(a-\delta,a+\delta)$. Hence $-g(a)<g(x)-g(a)<g(a)$ and, in particular, $g(x)>0$ for $x\in (a-\delta,a+\delta)$. Since $f\equiv 0$ after the reduction, this says $f(x)<g(x)$ on $(a-\delta,a+\delta)$, which is even stronger than the required $f(x)\leq g(x)$.
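If you prefer to avoid the reduction, the same estimate can be run directly on $f$ and $g$; here is a sketch, where the particular choice $\varepsilon=\tfrac{g(a)-f(a)}{2}$ is one convenient option, not forced. Continuity of $f$ and of $g$ at $a$ gives $\delta_f,\delta_g>0$ with $|f(x)-f(a)|<\varepsilon$ and $|g(x)-g(a)|<\varepsilon$ on the corresponding intervals, so for $\delta=\min(\delta_f,\delta_g)$ and $x\in(a-\delta,a+\delta)$,

$$f(x)<f(a)+\varepsilon=\frac{f(a)+g(a)}{2}=g(a)-\varepsilon<g(x),$$

which again gives the strict inequality $f(x)<g(x)$, and in particular $f(x)\le g(x)$.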