I want to show that for $f(x) \geq 0$,
if $x^\star = \arg\max f(x)$ then $x^\star = \arg\max \sqrt{f(x)}$.
My attempt is to apply the first- and second-derivative tests, and it seems that the claim is true.
However, I do not have a convexity or monotonicity condition on $f(x)$.
Does this result hold for all $f(x)$, or are there additional conditions?
For any strictly increasing transformation, such as $\sqrt{\cdot}$ (strictly increasing on $[0,\infty)$, which is exactly why the assumption $f(x) \geq 0$ matters), the two objectives have the same maximizers; no convexity, monotonicity, or even differentiability of $f$ is required. This idea is used routinely in maximum likelihood estimation, where a $\log$ transformation lets one maximize the log-likelihood instead of the likelihood.
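A short argument makes this precise; here $g$ denotes a generic strictly increasing transformation on the range of $f$ (e.g. $g = \sqrt{\cdot}$ on $[0,\infty)$):

```latex
% Forward direction: a maximizer of $f$ is a maximizer of $g \circ f$.
f(x^\star) \geq f(x) \ \text{for all } x
\;\Longrightarrow\;
g\bigl(f(x^\star)\bigr) \geq g\bigl(f(x)\bigr) \ \text{for all } x.
% Converse: suppose $x^\star$ maximizes $g \circ f$ but not $f$, i.e.
% $f(x') > f(x^\star)$ for some $x'$. Strict monotonicity of $g$ gives
% $g(f(x')) > g(f(x^\star))$, contradicting that $x^\star$ maximizes
% $g \circ f$. Hence the two maximizer sets coincide.
```

Note that only monotonicity of the *transformation* is used, never any property of $f$ itself, which is why no convexity condition on $f$ is needed.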