Let $f(x)$ be an increasing function of $x$ and let $g(x)$ be a decreasing function of $x$.
If $x^\star$ maximizes $f(x)\left[g(x)-c\right]$, where $c>0$, is it true that $x^\star$ is decreasing in $c$?
Thanks!
This boils down to the following question. Let $F(x)$ be any function (say, $F(x)=f(x)g(x)-cf(x)$) and let $H(x)$ be an increasing function (say, $H(x)=(\Delta c)f(x)$ with $\Delta c>0$). Let $x$ be a global maximum of $F$ and $y$ a global maximum of $F-H$ (assuming both exist). Then $F(x)\ge F(y)$ and $F(x)-H(x)\le F(y)-H(y)$, whence $H(y)\le H(x)$; so, if $H$ is strictly increasing, $y\le x$. That is pretty much all you can say at the level of generality of the question. So, indeed, $x^\star$ is non-increasing in $c$ (though not necessarily strictly decreasing unless you impose some smoothness).
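A quick numeric sanity check of the non-increasing claim, using example functions of my own choosing (not from the question): $f(x)=x$, which is increasing, and $g(x)=e^{-x}$, which is decreasing. A grid search over $x$ for several values of $c$ should give a non-increasing sequence of maximizers.

```python
import numpy as np

# Grid of candidate x values on which to maximize F_c(x) = f(x) * (g(x) - c).
# f(x) = x and g(x) = exp(-x) are illustrative choices, not from the question.
xs = np.linspace(0.0, 2.0, 20001)

def argmax_x(c):
    """Grid maximizer of x * (exp(-x) - c) over xs."""
    return xs[np.argmax(xs * (np.exp(-xs) - c))]

# Increasing values of c; the resulting maximizers should not increase.
maximizers = [argmax_x(c) for c in (0.0, 0.05, 0.1, 0.2, 0.3)]
print(maximizers)
```

For these choices one can also verify the answer analytically: the first-order condition is $e^{-x}(1-x)=c$, whose solution moves left as $c$ grows, consistent with the grid search.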