Under what conditions do $f(x)$ and $\sqrt{f(x)}$ have the same optimal points?


I want to show that for $f(x) \geq 0$,

if $x^\star =\arg \max f(x)$ then $x^\star= \arg \max \sqrt {f(x)}$.

My attempt is to use the first- and second-derivative tests, and it seems that the claim is true.

However, I do not have any convexity or monotonicity condition on $f(x)$.

Does this result hold for all $f(x)$, or are additional conditions needed?


Best answer:

For any monotonically increasing transformation, such as $\sqrt{\cdot}$, the two objectives are equivalent: the transformation preserves the ordering of function values, so it preserves the maximizers. This idea is used routinely in maximum likelihood estimation, where a $\log$ transformation lets one maximize the log-likelihood instead.
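As a quick numerical illustration of this (my own example, not from the answer): on a grid, the maximizer of a nonnegative $f$ coincides with the maximizers of $\sqrt{f}$ and $\log f$, since both transformations are strictly increasing on the positive reals.

```python
import numpy as np

# Example function (an assumption for illustration): f(x) = exp(-(x-1)^2),
# which is strictly positive, so sqrt(f) and log(f) are both defined.
x = np.linspace(-3, 3, 601)
f = np.exp(-(x - 1) ** 2)

i_f = np.argmax(f)                 # maximizer of f
i_sqrt = np.argmax(np.sqrt(f))     # maximizer of sqrt(f)
i_log = np.argmax(np.log(f))       # maximizer of log(f)

# All three indices agree, because sqrt and log preserve the ordering
# of the (positive) function values.
print(i_f == i_sqrt == i_log)  # True
```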

Another answer:

In fact, you don't even need differentiability of $f$. You only need $f(x) \geq 0$ so that $\sqrt{f(x)}$ makes sense.

To see this, let $f(x^*) = \max\{f(x): x \in X\}$, so that $f(x) \leq f(x^*)$ for all $x \in X$. If $\sqrt{f(x^*)} \neq \max\{ \sqrt{f(x)} : x \in X\}$, then there is some $x_0 \in X$ such that $\sqrt{f(x^*)} < \sqrt{f(x_0)}$. Squaring both sides gives $f(x^*) < f(x_0)$, which is a contradiction. In the same manner, you can also show that if $\sqrt{f(x')} = \max\{\sqrt{f(x)}: x \in X\}$ then $f(x') = \max\{f(x): x \in X\}$.
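Since the argument above uses no smoothness at all, it also applies to an arbitrary nonnegative function on a finite set. A small sketch of this (my own illustration, with $f$ just a table of random nonnegative values, neither convex nor differentiable in any meaningful sense):

```python
import numpy as np

# f is an arbitrary nonnegative function on the finite set X = {0, ..., 999},
# represented as a lookup table of random values in [0, 1).
rng = np.random.default_rng(0)
f_vals = rng.random(1000)

x_star = np.argmax(f_vals)           # a maximizer of f
x_sqrt = np.argmax(np.sqrt(f_vals))  # a maximizer of sqrt(f)

# sqrt is strictly increasing, so the maximizer is preserved.
print(x_star == x_sqrt)  # True
```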