I have the following function $f(x)>0$: $$f(x)=g(x)\,(1-k(x))$$ $$\text{with } g(x)>0,\ 0<k(x)\le 1 \text{ and } x \in \Bbb R^+.$$ I know that $g(x)$ is linear and $k(x)$ is an increasing function (so $(1-k(x))$ is decreasing), with the following properties: $$g'>0 \text{ and } g''=0,$$ $$k'>0, \text{ and either } k''>0 \text{ or } k''<0.$$
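For concreteness, here is one instance satisfying all of these hypotheses (my own illustrative choice, not part of the original problem): $$g(x)=x,\qquad k(x)=1-e^{-x},\qquad f(x)=g(x)(1-k(x))=x\,e^{-x},$$ where $g'=1>0$, $g''=0$, $k'=e^{-x}>0$ and $k''=-e^{-x}<0$, so this example sits in the $k''<0$ branch.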
My question is: Is it possible to prove that $f(x)$ has either no maximum or only one maximum (which is global)?
For this problem it seems quite intuitive to me that there has to be only one maximum, because $g(x)$ increases at a constant rate while $(1-k(x))$ is decreasing. Based on the fact that $(1-k(x))$ must reach zero at some value of $x$, $f(x)$ cannot increase forever, so after it reaches a maximum it has to decrease; it is also possible that $f(x)$ is decreasing the whole time.
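Both behaviours already occur in simple instances (again, my own illustrative examples). With $g(x)=x$ and $k(x)=1-e^{-x}$ as above, $$f(x)=x\,e^{-x},\qquad f'(x)=(1-x)\,e^{-x},$$ so $f$ has a single global maximum at $x=1$ and decreases afterwards; while $g(x)=1+x$ and $k(x)=1-e^{-2x}$ give $$f(x)=(1+x)\,e^{-2x},\qquad f'(x)=-(1+2x)\,e^{-2x}<0,$$ so $f$ is decreasing on all of $\Bbb R^+$ and attains no interior maximum.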
I also came up with another way of thinking about this problem. Based on the behaviour of $g(x)$ and $k(x)$, $f(x)$ is either decreasing or strictly concave in $x$. Therefore $f(x)$ is strictly quasi-concave, which means that the following statement must be true: $$g(\lambda x_1 +(1-\lambda)x_2 )\big(1-k(\lambda x_1 +(1-\lambda)x_2 )\big)>\min\!\big(g(x_1)(1-k(x_1)),\; g(x_2)(1-k(x_2))\big)$$ for all $x_1<x_2$ and $\lambda\in(0,1)$. We know that $g(x_1)<g(\lambda x_1+(1-\lambda)x_2)<g(x_2)$ because $g$ is strictly increasing in $x$. Similarly, $(1-k(x_1))>(1-k(\lambda x_1 +(1-\lambda)x_2 ))>(1-k(x_2))$ because $(1-k(x))$ is monotonically decreasing in $x$. Therefore the product $g(\lambda x_1 +(1-\lambda)x_2 )\big(1-k(\lambda x_1 +(1-\lambda)x_2 )\big)$ should lie between $g(x_1)(1-k(x_1))$ and $g(x_2)(1-k(x_2))$, which would make the above statement true.
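As a quick numerical sanity check of the quasi-concavity inequality (using my illustrative $f(x)=x\,e^{-x}$ from above, with $x_1=0.5$, $x_2=2$ and $\lambda=\tfrac12$, so the midpoint is $x=1.25$): $$f(0.5)\approx 0.3033,\qquad f(2)\approx 0.2707,\qquad f(1.25)\approx 0.3581>\min(0.3033,\,0.2707),$$ consistent with the inequality at this particular midpoint.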
Since $g''=0$, differentiating twice gives $$f'(x) = g'(x)(1-k(x)) - g(x)k'(x)$$ and $$f''(x) = -2g'(x)k'(x) - g(x)k''(x).$$ If $k''\ge 0$, both terms are negative, so $f''<0$; hence $f$ is strictly concave and allows at most one maximum, which is then global. If $k''<0$, however, the two terms have opposite signs and $f$ need not be concave, so this argument only settles the case $k''\ge 0$.
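To see that concavity can genuinely fail in the $k''<0$ branch (again with the illustrative choice $g(x)=x$, $k(x)=1-e^{-x}$): $$f(x)=x\,e^{-x},\qquad f''(x)=(x-2)\,e^{-x}>0 \text{ for } x>2,$$ so $f$ is not concave on $\Bbb R^+$, even though it still has a single global maximum at $x=1$. For that case one would need a quasi-concavity argument rather than concavity.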