Subhomogeneity of continuous subadditive functions


Consider a continuous, non-decreasing function $f:[0,\infty)\longrightarrow[0,\infty)$ with $f(0)=0$ that is subadditive, i.e., $$f(x+y)\leq f(x)+f(y)$$ for all $x$ and $y$ in $[0,\infty)$.

I am interested in showing that $f$ is $a$-subhomogeneous for every real $a>1$, i.e., $$f(ax)\leq af(x)$$ for all $x$ in $[0,\infty)$.

If $a<1$, the inequality $f(ax)\leq af(x)$ can fail. For example, taking $f(x)=\log(1+x)$, $a=0.5$, and $x=1.1$, we get $f(ax)=\log(1.55)=0.438254930931155$ while $af(x)=0.5\log(2.1)=0.370968672364689$, so $f(ax)>af(x)$.
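The failure for $a<1$ can be checked numerically; the value $x=1.1$ used below is my assumption, chosen because it reproduces the decimals quoted above:

```python
import math

# Numerical check of the a < 1 counterexample: f(x) = log(1 + x) is
# continuous, non-decreasing, subadditive with f(0) = 0, yet
# f(a*x) > a*f(x) for a = 0.5 and x = 1.1.
def f(x):
    return math.log(1 + x)

a, x = 0.5, 1.1
lhs = f(a * x)   # f(ax) = log(1.55) ~ 0.438255
rhs = a * f(x)   # a*f(x) = 0.5*log(2.1) ~ 0.370969
print(lhs, rhs, lhs > rhs)
```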

For $a>1$, numerical comparison with known subadditive functions is consistent with $a$-subhomogeneity.
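A minimal sketch of such a numerical comparison, using two standard subadditive, non-decreasing functions with $f(0)=0$ (my choice of test functions and grid, not the original experiment):

```python
import math

# Spot-check f(a*x) <= a*f(x) for a grid of a > 1 and x > 0, for two
# subadditive, non-decreasing functions with f(0) = 0. Both examples
# happen to be concave, for which the inequality is easy to prove
# directly, so this check is illustrative rather than conclusive.
funcs = {"log(1+x)": lambda x: math.log(1 + x),
         "sqrt(x)": math.sqrt}

ok = True
for name, f in funcs.items():
    for a in [1.1, 1.5, 2.0, 3.7, 10.0]:
        for k in range(1, 200):
            x = 0.05 * k
            if f(a * x) > a * f(x) + 1e-12:  # small tolerance for rounding
                ok = False
print(ok)
```

For a concave $f$ with $f(0)=0$, $f(x)/x$ is non-increasing, which gives $f(ax)\leq af(x)$ for $a\geq 1$ directly; the interesting question is what happens for subadditive $f$ that are not concave.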

Here is a partial answer to my problem by user Kavi Rama Murthy (https://stackexchange.com/users/4311758/kavi-rama-murthy):

Choose rational numbers $b,c\geq 1$ such that $b\leq a\leq c$. Since $f$ is non-decreasing, $f(ax)$ lies between $f(bx)$ and $f(cx)$. Since $f(bx)\leq bf(x)\to af(x)$ and $f(cx)\leq cf(x)\to af(x)$ as $b,c\to a$, we get $f(ax)\leq af(x)$. Hence the inequality is true.

But I have trouble showing that the inequality holds for rational numbers $a>1$, which this argument assumes.
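For what it's worth, the integer case follows by induction on subadditivity: for every positive integer $n$,

```latex
f(nx) = f\bigl((n-1)x + x\bigr) \leq f\bigl((n-1)x\bigr) + f(x)
       \leq \cdots \leq n\,f(x),
```

so the part I am missing is precisely the non-integer rationals $a=p/q>1$.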

Thanks in advance for your suggestions and help!