If $f(x) = \dfrac1{x}-x$, prove that $f(x)$ is multiplicatively subadditive in $(0, 1)$, that is,
if $a, b \in (0, 1)$, then $f(a)+f(b) < f(ab)$.
For intuition, consider the endpoints: as $a \to 0^+$, $f(a) \to \infty$, and as $b \to 1^-$, $f(b) \to 0$. Under those same conditions $ab \to 0$ as well, but since $ab < a$, the product $ab$ is closer to $0$ than $a$ is, so $f(ab)$ blows up faster than $f(a)$. This suggests the inequality should hold, but it is not a proof.
To show one expression is larger than another, it is often helpful to consider the difference:
$$\begin{align}
f(ab) - f(a) - f(b) &= \frac{1}{ab} - \frac{1}{a} - \frac{1}{b} + a + b - ab\\
&= \left(\frac{1}{ab} - \frac{1}{a} - \frac{1}{b} + 1\right) - \left(1 - a - b + ab\right)\\
&= \left(\frac{1}{a} - 1\right)\left(\frac{1}{b} - 1\right) - (1-a)(1-b)\\
&= \frac{(1-a)(1-b)}{ab} - (1-a)(1-b)\\
&= (1-a)(1-b)\left(\frac{1}{ab} - 1\right)\\
&> 0,
\end{align}$$
where the last step uses $0 < a, b < 1$, so each factor $1-a$, $1-b$, and $\frac{1}{ab} - 1$ is strictly positive.
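As a quick sanity check, the factorization above can be verified numerically. The sketch below (the sampling range is my own choice, kept away from the endpoints to avoid floating-point cancellation) confirms that $f(ab) - f(a) - f(b)$ agrees with $(1-a)(1-b)\left(\frac{1}{ab}-1\right)$ and is strictly positive on random samples from $(0,1)$:

```python
import random

def f(x):
    # f(x) = 1/x - x
    return 1 / x - x

random.seed(0)
for _ in range(10_000):
    # sample away from 0 and 1 to sidestep floating-point cancellation
    a = random.uniform(0.001, 0.999)
    b = random.uniform(0.001, 0.999)
    diff = f(a * b) - f(a) - f(b)
    factored = (1 - a) * (1 - b) * (1 / (a * b) - 1)
    # the difference matches the factored form up to rounding...
    assert abs(diff - factored) <= 1e-6 * max(1.0, abs(diff))
    # ...and is strictly positive, as the proof claims
    assert diff > 0
print("all checks passed")
```

This is only empirical evidence, of course; the algebraic factorization is what makes the argument a proof.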