I was solving a physics problem and I got $$\frac{a^2+b^2}{2ab}$$
or equivalents
$$\frac{1+ \left( \frac{b}{a}\right)^2}{2\cdot\frac{b}{a}}.$$
If $x=\frac{b}{a}$, we obtain
$$\frac{1+x^2}{2x}$$ or $$\frac{1}{2x}+\frac{x}{2}.$$
I need to know whether these expressions are $>1$ when $a<b$ (or $x>1$). I know the answer is yes (I used GeoGebra to plot), but I'd like to know if there is an analytical way to get the answer. Thank you.
Note that $$0\leq (a-b)^2 = a^2 - 2ab + b^2,$$ and hence $$a^2+b^2 \geq 2ab.$$ Therefore, assuming $ab>0$, we have $$ \frac{a^2+b^2}{2ab} \geq 1.$$ Moreover, equality holds precisely when $(a-b)^2=0$, i.e. when $a=b$. So if $a\neq b$ (in particular when $a<b$ with $ab>0$), the inequality is strict: $$\frac{a^2+b^2}{2ab} > 1.$$
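As a quick numerical sanity check (not a substitute for the proof above), one can evaluate the ratio for a few pairs with $0<a<b$ and confirm it exceeds $1$, hitting exactly $1$ only when $a=b$; the helper name `ratio` here is just illustrative:

```python
def ratio(a, b):
    """Compute (a^2 + b^2) / (2ab), assuming ab != 0."""
    return (a**2 + b**2) / (2 * a * b)

# Sample pairs with 0 < a < b: the ratio should be strictly greater than 1.
for a, b in [(1, 2), (0.5, 3.0), (10, 10.5)]:
    assert ratio(a, b) > 1

# Equality case: a == b gives exactly 1.
print(ratio(3, 3))
```

This matches the GeoGebra observation: the curve $\frac{1+x^2}{2x}$ stays above $1$ for all $x>0$ except at $x=1$, where it touches $1$.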