Prove that if $a, b \in \mathbb{R}$ are positive, then $a \leq b$ implies $\sqrt a \leq \sqrt b$.
I have seen a proof of this in a couple of places. People use the identity $0 \leq b-a = (\sqrt b + \sqrt a)(\sqrt b - \sqrt a)$ and divide by $(\sqrt a + \sqrt b)$ to get the result. But I don't think this is a valid proof, because even though $a,b$ are positive, a square root of a positive number could be negative, so $(\sqrt a + \sqrt b)$ could be negative as well. Dividing by a negative $(\sqrt a + \sqrt b)$ would flip the inequality and give $\sqrt a \geq \sqrt b$. Am I missing something here?
You are right that one must check the sign of $\sqrt{b}+\sqrt{a}$ before dividing, but the concern disappears once you recall that, by convention, the symbol $\sqrt{x}$ denotes the *nonnegative* square root of $x$. By your hypothesis $a,b> 0$ and $a \leq b$, so $b-a \geq 0$, and $b-a=(\sqrt{b}-\sqrt{a})(\sqrt{b}+\sqrt{a})\geq 0$. Since the square root of a positive number is positive, $\sqrt{b}+\sqrt{a}> 0$; dividing by this strictly positive factor preserves the inequality, and hence $\sqrt{b}-\sqrt{a}\geq 0$, which completes the proof.
Notice that if $a=b=0$, the inequality holds trivially.
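As a side note, this convention is also what proof assistants use: in Lean's Mathlib, `Real.sqrt` is defined as the nonnegative square root, and the statement above is available directly. A minimal sketch (assuming the Mathlib lemma name `Real.sqrt_le_sqrt`):

```lean
import Mathlib

-- `Real.sqrt` in Mathlib is the *nonnegative* square root, so the
-- division step in the proof above is legitimate by definition.
example (a b : ℝ) (ha : 0 < a) (hab : a ≤ b) :
    Real.sqrt a ≤ Real.sqrt b :=
  Real.sqrt_le_sqrt hab
```

Note that the monotonicity lemma does not even need the positivity hypothesis, since `Real.sqrt` is defined to be `0` on negative inputs.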