A function $f$ is called convex downwards if the following inequality holds for all $x_1, x_2$ in its domain:
$$ f(\lambda\cdot x_1 + (1-\lambda)\cdot x_2) \le \lambda\cdot f(x_1)+(1-\lambda)\cdot f(x_2), \; \text{where} \; \lambda \in [0,1] $$
Show that $f(x) = a^x$, where $a > 0$, is convex for all $x \in \mathbb R$.
I recently solved a similar problem for $f(x) = ax^2 + bx + c$ by plugging the arguments from the inequality into the function and expanding the terms. Some of them vanished, and I ended up with an easy-to-handle inequality.
But that didn't work for $f(x) = a^x$. I tried applying a logarithm to both sides, but that didn't simplify the inequality. I also tried dividing both sides by various powers of $a$ (since $a$ is always greater than $0$), again with no luck.
So I need to somehow show that $a^{\lambda\cdot x_1+(1-\lambda)\cdot x_2} \le \lambda \cdot a^{x_1} + (1-\lambda)\cdot a^{x_2}$. How can I do it?
It follows immediately by the generalized AM-GM inequality:
$$u^{\lambda}v^{1-\lambda}\le \lambda u+(1-\lambda)v$$
for any $u,v>0$ and $\lambda\in[0,1]$. Setting $u = a^{x_1}$ and $v = a^{x_2}$ (both positive since $a > 0$) gives
$$u^{\lambda}v^{1-\lambda} = a^{\lambda x_1}\,a^{(1-\lambda)x_2} = a^{\lambda x_1 + (1-\lambda)x_2},$$
so the AM-GM inequality above is exactly the convexity inequality you need.
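As a quick numerical sanity check (not a substitute for the proof), here is a small Python sketch that spot-checks the convexity inequality for a grid of sample values of $a$, $x_1$, $x_2$, and $\lambda$; the helper name `convex_check` is mine, not from the question:

```python
import math

def convex_check(a, x1, x2, lam):
    """Return True if f(lam*x1 + (1-lam)*x2) <= lam*f(x1) + (1-lam)*f(x2)
    holds for f(x) = a**x at the given sample point."""
    lhs = a ** (lam * x1 + (1 - lam) * x2)
    rhs = lam * a ** x1 + (1 - lam) * a ** x2
    return lhs <= rhs

# Spot-check a grid of values; every combination should satisfy the inequality.
assert all(
    convex_check(a, x1, x2, lam)
    for a in (0.5, 2.0, math.e)
    for x1 in (-3.0, 0.0, 2.5)
    for x2 in (-1.0, 1.0, 4.0)
    for lam in (0.0, 0.25, 0.5, 0.9, 1.0)
)
print("all checks passed")
```

This only probes finitely many points, of course, but it is a cheap way to catch a sign error before committing to the algebra.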