The question is as follows:
If the equation $ax^2 + bx + c = 0$ has non-real roots, prove that $1 + c/a + b/a > 0$.
Looking at the question, the first thing that came to my mind was to use the fact that the discriminant is less than $0$ for non-real roots, but that wasn't of any help. How do I proceed?
Let $f(x) = ax^2 + bx + c$. Since the roots are non-real, the discriminant is negative, so $f$ never vanishes: the parabola lies entirely on one side of the $x$-axis, the side determined by the sign of $a$.

If $a > 0$, the parabola lies above the $x$-axis, hence $f(1) = a + b + c > 0$; dividing by $a > 0$ gives $1 + b/a + c/a > 0$.
On the other hand, $a < 0$ implies $f(1) < 0$; dividing by $a < 0$ reverses the inequality, again giving $1 + b/a + c/a > 0$. In both cases
$$\frac{f(1)}{a} = 1 + \frac{b}{a} + \frac{c}{a} > 0,$$
which is the given inequality.
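As a quick sanity check (my own example, not part of the question): take $f(x) = x^2 - 2x + 5$, so $a = 1$, $b = -2$, $c = 5$. The discriminant is $(-2)^2 - 4 \cdot 1 \cdot 5 = -16 < 0$, so the roots $1 \pm 2i$ are non-real, and
$$1 + \frac{c}{a} + \frac{b}{a} = 1 + 5 - 2 = 4 > 0,$$
as the argument predicts.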