Hump function maximization


I am reading a paper that mentions a hump-function maximization, and I am trying to prove where the maximum occurs:

$m = (1-x)^{1-\sigma}x^\sigma$, where $x, \sigma \in [0,1]$.

It is said that $m$ is a hump-shaped function of $x$ maximized at $x=\sigma$, where $\sigma$ is a parameter that is held fixed here.
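As a quick numerical sanity check of the claim (not from the paper; the value $\sigma = 0.3$ is an arbitrary choice for illustration), one can evaluate $m$ on a fine grid over $(0,1)$ and see where it peaks:

```python
# Sanity check: m(x) = (1-x)^(1-sigma) * x^sigma should peak at x = sigma.
# sigma = 0.3 is an arbitrary illustrative value.
sigma = 0.3
n = 9_999
# Grid over the open interval (0, 1); endpoints excluded since m vanishes there.
xs = [(i + 1) / (n + 1) for i in range(n)]
ms = [(1 - x) ** (1 - sigma) * x ** sigma for x in xs]
# Grid point with the largest value of m.
x_star = xs[ms.index(max(ms))]
print(x_star)  # prints 0.3
```

The argmax lands on the grid point closest to $\sigma$, consistent with the paper's claim.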

My attempt:

First derivative with respect to $x$: $(1-x)^{-\sigma}x^\sigma+(1-x)^{1-\sigma}x^{\sigma-1} =0$

Dividing through by $(1-x)^{-\sigma}x^{\sigma}$: $1+(1-x)^{1}x^{-1} =0$

Second derivative with respect to $x$: $-(1-x)^{-\sigma -1}x^\sigma+ (1-x)^{-\sigma}x^{\sigma-1}-(1-x)^{-\sigma}x^{\sigma-1}+(1-x)^{1-\sigma}x^{\sigma-2}$

I couldn't get to the given result based on the above. Any clarification would be appreciated!

Best answer:

If $x$ maximizes $m$ then it will also maximize

$$\log(m) = (1-\sigma)\log(1-x) + \sigma \log(x)$$

because $\log$ is strictly increasing.

So let us try to find the maximum by setting $\frac{\partial \log(m)}{\partial x} = 0$ and solving for $x$:

$$\begin{align} 0 &\overset{!}{=} \frac{\partial \log(m)}{\partial x} \\ &= (1-\sigma)\frac{1}{x-1} + \sigma \frac{1}{x}. \end{align}$$

Multiplying both sides by $x(x-1)$, which is nonzero for $x \in (0,1)$:

$$0 = (1-\sigma)x + \sigma(x-1) = x - \sigma \iff x = \sigma.$$

Now it is easy to see that $x=\sigma$ is indeed the maximum: for $\sigma \in (0,1)$, $\log(m) \to -\infty$ as $x \to 0^+$ and as $x \to 1^-$, so the unique interior critical point $x=\sigma$ must be the global maximum of $\log(m)$, and hence of $m$.
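For readers who want to double-check the algebra, the critical-point computation above can be reproduced symbolically, assuming SymPy is available:

```python
# Symbolic check of the log-derivative derivation with SymPy.
import sympy as sp

x, s = sp.symbols('x sigma', positive=True)
# log(m) = (1 - sigma)*log(1 - x) + sigma*log(x)
log_m = (1 - s) * sp.log(1 - x) + s * sp.log(x)
# Solve d/dx log(m) = 0 for x; the only critical point should be x = sigma.
critical = sp.solve(sp.Eq(sp.diff(log_m, x), 0), x)
print(critical)
```

SymPy returns $x = \sigma$ as the sole solution, matching the derivation.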