The problem is:
If $f$ and $g$ are increasing, then is $f \cdot g$ also increasing?
First, I wrote down some basic definitions and assumptions:
- Assume $a < b$
- $f$ increasing means $f(a) < f(b)$ whenever $a < b$ (and similarly for $g$)
Then I tried using case analysis:
Case 1: Assume $f(a), f(b), g(a), g(b)$ are all positive. Since $0 < f(a) < f(b)$ and $0 < g(a) < g(b)$, multiplying the inequalities gives
$$f(a)g(a) < f(b)g(a) < f(b)g(b),$$
which means $f \cdot g$ is increasing in this case.
Case 2: Assume $f(a), f(b), g(a), g(b)$ are all negative. Then $f(a) < f(b) < 0$ and $g(a) < g(b) < 0$, so $|f(a)| > |f(b)| > 0$ and $|g(a)| > |g(b)| > 0$, and therefore
$$f(a)g(a) = |f(a)||g(a)| > |f(b)||g(b)| = f(b)g(b),$$
which means $f \cdot g$ is not increasing in this case (in fact it is decreasing).
Case 3: Assume $f(a), f(b)$ are positive and $g(a), g(b)$ are negative. Here the comparison can go either way: with $f(a)=1$, $f(b)=2$, $g(a)=-3$, $g(b)=-1$ we get $f(a)g(a) = -3 < -2 = f(b)g(b)$, but with $f(a)=1$, $f(b)=10$, $g(a)=-2$, $g(b)=-1$ we get $f(a)g(a) = -2 > -10 = f(b)g(b)$. So in this case $f \cdot g$ need not be increasing.
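The case observations above can be sanity-checked numerically on a grid of sample points. Below is a minimal sketch; the sample functions and the helper `is_increasing_on` are my own choices for illustration, not part of the problem:

```python
import math

def is_increasing_on(h, xs):
    """Check h(x_i) < h(x_{i+1}) along the sorted sample points xs."""
    return all(h(a) < h(b) for a, b in zip(xs, xs[1:]))

xs = [x / 10 for x in range(1, 31)]  # sample points in (0, 3]

# Case 1: both factors positive and increasing -> product increasing
f1 = lambda x: x            # positive on (0, 3]
g1 = lambda x: math.exp(x)  # positive everywhere
print(is_increasing_on(lambda x: f1(x) * g1(x), xs))   # True

# Case 2: both factors negative and increasing -> product decreasing
f2 = lambda x: -1 / x       # negative and increasing on (0, 3]
print(is_increasing_on(lambda x: f2(x) * f2(x), xs))   # False (product is 1/x^2)
```

Of course this only checks finitely many points, so it can suggest a counterexample but never prove monotonicity.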
A few problems with my approach:
The case analysis is quite cumbersome, and I still miss the cases where the functions are equal to $0$ or have mixed signs (e.g. $f(a) < 0 < f(b)$).
There is no real proof here; I mostly worked with numerical examples and general logic. Is there an algebraic way to start from the inequality $f(a) < f(b)$ and manipulate it so that we end up with $f(a)g(a) < f(b)g(b)$? I couldn't get anywhere with that approach.
The statement is false, and a single example will prove it. Take $f\colon\mathbb{R}\longrightarrow\mathbb{R}$ defined by $f(x)=-e^{-x}$ and take $g=f$. Then $f$ and $g$ are increasing. But $(f\cdot g)(x)=e^{-2x}$, and therefore $f\cdot g$ is decreasing.
Of course, this is an extreme example: both $f$ and $g$ are increasing, but $f\cdot g$ is decreasing. If you consider the simpler example in which $f(x)=g(x)=x$, then $f\cdot g$ is neither increasing nor decreasing. (Of course, as I did before, I am assuming that the domain of $f$ and of $g$ is $\mathbb{R}$.)
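Both examples can be verified numerically on a grid of sample points. This is only a sanity check on finitely many points, using a hypothetical classifier `trend_on` of my own:

```python
import math

def trend_on(h, xs):
    """Classify h on the sorted sample points xs as increasing, decreasing, or neither."""
    diffs = [h(b) - h(a) for a, b in zip(xs, xs[1:])]
    if all(d > 0 for d in diffs):
        return "increasing"
    if all(d < 0 for d in diffs):
        return "decreasing"
    return "neither"

xs = [x / 10 for x in range(-30, 31)]  # sample points in [-3, 3]

f = lambda x: -math.exp(-x)  # the counterexample above
print(trend_on(f, xs))                      # increasing
print(trend_on(lambda x: f(x) * f(x), xs))  # decreasing

g = lambda x: x              # the simpler example
print(trend_on(lambda x: g(x) * g(x), xs))  # neither (x^2 dips at 0)
```

The last line illustrates the second example: $x^2$ decreases on $[-3, 0]$ and increases on $[0, 3]$, so it is neither increasing nor decreasing on the whole interval.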