I was working through some past paper questions when I came across the following.
Let $f(x) = ax^2+bx+c$, where $a$, $b$, $c$ are positive and $a+b+c=1$. Prove that the inequality $f(x_1)f(x_2)\cdots f(x_n)\geq 1$ holds for all positive numbers $x_1, x_2, \ldots, x_n$ satisfying $x_1 x_2 \cdots x_n = 1$.
I tried proving that $f(x)\geq x$ for all positive $x$, which was easy to do for $x\geq 1$. However, I found it hard to prove that $f(x)\geq x$ for $0<x<1$. Can anyone share some thoughts on how to prove this?
(Compare A Quadratic With Positive Coefficients or Inequalities with polynomials on AoPS.) Any polynomial $f$ with positive real coefficients satisfies $$ f(x_1)\cdots f(x_n) \ge f(\sqrt[n]{x_1 \cdots x_n})^n $$ for all positive numbers $x_1, \ldots, x_n$. This is a consequence of the generalized Hölder inequality; alternatively, one can show that $y \mapsto \log(f(e^y))$ is convex and apply Jensen's inequality.
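To make the Hölder step explicit in the quadratic case (a sketch, using the generalized Hölder inequality in the form $\prod_{i=1}^n (u_i+v_i+w_i) \ge \left(\sqrt[n]{u_1\cdots u_n}+\sqrt[n]{v_1\cdots v_n}+\sqrt[n]{w_1\cdots w_n}\right)^n$ for nonnegative terms): taking $u_i = a x_i^2$, $v_i = b x_i$, $w_i = c$ gives $$ \prod_{i=1}^n f(x_i) = \prod_{i=1}^n \left(a x_i^2 + b x_i + c\right) \ge \left(a\,(x_1\cdots x_n)^{2/n} + b\,(x_1\cdots x_n)^{1/n} + c\right)^n = f\!\left(\sqrt[n]{x_1\cdots x_n}\right)^n. $$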
In our case, $x_1 \cdots x_n = 1$ and $f(1) = a+b+c = 1$, so that $f(x_1)\cdots f(x_n) \ge f(1)^n = 1$.
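For completeness, a sketch of the Jensen route under the same assumptions: write $y_i = \log x_i$, so that $y_1 + \cdots + y_n = 0$. If $g(y) = \log f(e^y)$ is convex, then Jensen's inequality gives $$ \frac{1}{n}\sum_{i=1}^n \log f(x_i) = \frac{1}{n}\sum_{i=1}^n g(y_i) \ge g\!\left(\frac{y_1+\cdots+y_n}{n}\right) = g(0) = \log f(1) = 0, $$ which again yields $f(x_1)\cdots f(x_n) \ge 1$.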