This question is related to this one. Given two functions $f,g\colon \mathbb{R}^n \to \mathbb{R}$, given that at a certain point $x_0 \in \mathbb{R}^n$ we have $f(x_0) \geq g(x_0)$, and given an open subset $A \subseteq \mathbb{R}^n$ containing $x_0$, under what conditions on the partial derivatives of $f$ and $g$ on $A$ can I conclude that $f(x) \geq g(x)$ for every $x \in A$? When is this inequality strict?
This question came up when I was trying to prove that $x^2 + y^2 > xy$ for all $(x,y) \in \mathbb{R}^2$ with $(x,y) \neq (0,0)$. It seemed visually obvious that the function $x^2 + y^2$ grows faster than $xy$, so the two could never catch up again after the origin, but I had difficulty formalizing this thought.
Also, is there a name for this result/test/condition, even if only on the one variable case?
While dodging your actual question, here is a direct proof that $x^2+y^2 > xy$ away from the origin.
Since $(x-y)^2 \ge 0$, we always have $$x^2+y^2 \ge 2xy.$$ If $xy > 0$, this gives $x^2+y^2 \ge 2xy > xy$. If instead $xy \le 0$, then $x^2+y^2 > 0 \ge xy$, because $(x,y) \neq (0,0)$. Either way, $x^2+y^2 > xy$ away from the origin. (The quadratic mean–geometric mean inequality $\sqrt{\tfrac{x^2+y^2}{2}} \ge \sqrt{xy}$ encodes the same fact, but it only applies when $xy \ge 0$, so the case split is needed anyway.)
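An equivalent way to see the strict inequality is to complete the square: $x^2+y^2-xy = (x-\tfrac{y}{2})^2 + \tfrac{3}{4}y^2$, a sum of squares that vanishes only at the origin. A quick symbolic sanity check of this identity, assuming `sympy` is available:

```python
import sympy as sp

x, y = sp.symbols("x y", real=True)

# Claim: x^2 + y^2 - xy = (x - y/2)^2 + (3/4) y^2,
# a sum of squares, hence strictly positive away from (0, 0).
difference = x**2 + y**2 - x*y
sum_of_squares = (x - y/2)**2 + sp.Rational(3, 4)*y**2

# The two expressions agree identically.
assert sp.simplify(difference - sum_of_squares) == 0
```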
To give a small idea for your question, in one variable the following comparison test holds: if $f(a) \ge g(a)$ and $f'(t) \ge g'(t)$ for all $t \ge a$, then $f(t) \ge g(t)$ for all $t \ge a$. Indeed, $h = f - g$ satisfies $h(a) \ge 0$ and $h' \ge 0$ on $[a,\infty)$, so $h$ is nondecreasing there by the mean value theorem. If moreover $f'(t) > g'(t)$ for $t > a$, the inequality is strict for $t > a$.
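A small numeric illustration of this comparison test, using a classic example of my own choosing (not from the answer above): $f(t) = e^t$, $g(t) = 1+t$, $a = 0$. Here $f(0) = g(0) = 1$ and $f'(t) = e^t \ge 1 = g'(t)$ for $t \ge 0$, so the test predicts $e^t \ge 1+t$ on $[0,\infty)$:

```python
import math

def f(t):
    return math.exp(t)

def g(t):
    return 1 + t

# f(0) = g(0) and f'(t) = e^t >= 1 = g'(t) for t >= 0,
# so f should dominate g on [0, infinity); spot-check on a grid.
for i in range(1000):
    t = i * 0.01  # sample t in [0, 10)
    assert f(t) >= g(t)
```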