Suppose that $0 < a, b < 1$ and $a + b = 1$.
Today I did some investigation of the expression $$a^{\frac{1}{b}}+b^{\frac{1}{a}},$$ and it seems that the maximum is attained at $a=b=\frac{1}{2}$, where the expression is equal to $\frac{1}{2}$.
I would like to conjecture that we have $$a^{\frac{1}{b}}+b^{\frac{1}{a}}\leq \frac {1}{2}$$
Is this true?
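For what it's worth, a brute-force grid scan (a sanity check only, not a proof; the grid size is an arbitrary choice) is consistent with the conjecture:

```python
def expr(a: float) -> float:
    """Evaluate a^(1/b) + b^(1/a) with b = 1 - a."""
    b = 1.0 - a
    return a ** (1.0 / b) + b ** (1.0 / a)

# Scan a over (0, 1) on a uniform grid and track the maximum.
n = 10_000
best_a, best_val = 0.5, expr(0.5)
for i in range(1, n):
    a = i / n
    v = expr(a)
    if v > best_val:
        best_a, best_val = a, v

print(best_a, best_val)  # → 0.5 0.5
```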
Consider the concave function $f(t) = \sqrt[t]{1-t}$ (shown below). Since $a + b = 1$, we have $f(a) = b^{\frac{1}{a}}$ and $f(b) = a^{\frac{1}{b}}$, so by Jensen’s inequality,
$$a^{\frac{1}{b}} + b^{\frac{1}{a}} = f(a) + f(b) \leqslant 2f(\tfrac12) = 2\left(\tfrac12\right)^2 = \tfrac12.$$
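Before proving concavity, a quick finite-difference check (illustrative only; the grid and step size below are arbitrary choices) suggests that $f'' \leqslant 0$ throughout $(0,1)$:

```python
def f(t: float) -> float:
    """f(t) = (1 - t)^(1/t) on (0, 1)."""
    return (1.0 - t) ** (1.0 / t)

# Concavity sanity check: second central differences of a concave
# function are <= 0, so count any sign violations on a grid.
h = 1e-4
violations = 0
for i in range(1, 1000):
    t = i / 1000
    second_diff = f(t - h) - 2.0 * f(t) + f(t + h)
    if second_diff > 0:
        violations += 1

print(violations)  # → 0
```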
---
To show $f$ is concave, write $f = e^g$ with $g = \log f = \frac{\log(1-t)}{t}$. Note that concavity of $g$ alone is not enough here — since $t \mapsto e^t$ is convex, composing it with a concave function need not give a concave function. Rather, from $f'' = f\left(g'' + (g')^2\right)$ and $f > 0$, it suffices to show $(g')^2 < -g''$ on $(0, 1)$. Perhaps the easiest way is the Taylor series, valid for $t \in (0, 1)$:
$$g(t) = \frac{\log(1-t)}{t} = -\sum_{n \geqslant 0} \frac{t^n}{n+1}.$$
Differentiating termwise, the coefficient of $t^m$ in $-g''$ is $\frac{(m+1)(m+2)}{m+3}$, while the coefficient of $t^m$ in $(g')^2$ is
$$\sum_{j=0}^{m} \frac{j+1}{j+2} \cdot \frac{m-j+1}{m-j+2} \leqslant (m+1)\left(\frac{m+1}{m+2}\right)^2 < \frac{(m+1)(m+2)}{m+3},$$
where the first inequality holds because $x \mapsto \frac{x+1}{x+2}$ is increasing and $j, m-j \leqslant m$, and the last is equivalent to $(m+1)^2(m+3) < (m+2)^3$. Comparing coefficients termwise gives $(g')^2 < -g''$ on $(0,1)$, hence $f'' < 0$ and $f$ is concave.
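The coefficient comparison can also be checked mechanically. A small sketch in exact rational arithmetic (the cutoff `N` is an arbitrary choice), comparing the series coefficients of $(g')^2$ and $-g''$:

```python
from fractions import Fraction

N = 200  # number of coefficients to compare (arbitrary cutoff)

# g(t) = -sum_{n>=0} t^n / (n+1), so termwise:
#   g'   has coefficient  -(m+1)/(m+2)       at t^m
#   -g'' has coefficient  (m+1)(m+2)/(m+3)   at t^m
gp = [Fraction(-(m + 1), m + 2) for m in range(N + 1)]

for m in range(N + 1):
    # Coefficient of t^m in (g')^2: Cauchy product of gp with itself.
    sq = sum(gp[j] * gp[m - j] for j in range(m + 1))
    neg_gpp = Fraction((m + 1) * (m + 2), m + 3)
    assert sq < neg_gpp, m

print("(g')^2 < -g'' coefficientwise up to degree", N)
```

Exact rationals (`fractions.Fraction`) are used so the comparison is free of floating-point rounding.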