Let $a, b \in \mathbb{R}$.
Prove that if $3 \lt a \lt 5$ and $b = 2 + \sqrt{a-2}$, then $3 \lt b \lt a$.
My approach was simply to start with the inequality on $a$, transform it into an inequality on $b$, and see what happens.
Subtracting 2 gives: $1 \lt a-2 \lt 3$
Taking the square root gives: $1 \lt \sqrt{a-2} \lt \sqrt{3}$
Adding two gives: $3 \lt 2 + \sqrt{a-2} \lt \sqrt{3}+2$
Thus $3 \lt b \lt \sqrt{3}+2$.
So the interval for $b$ is in fact smaller than the interval for $a$, but that alone doesn't seem like enough; it's not a very convincing argument. Isn't it plausible that $b$ could equal $3.11$ while $a$ equals $3.10$? Both of those numbers fall inside the allowed intervals.
Plus, this method didn't work on the other four questions in the problem set. For example, suppose $a \gt 2$ and $b = 1 + \sqrt{a-1}$; applying the same technique only yields $b \gt 2$.
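As a numeric sanity check (my own addition, not a proof), sampling values across the allowed intervals suggests the claimed bound $b \lt a$ does hold everywhere, so a proof should exist even though the interval comparison alone doesn't establish it:

```python
import math

# Sample a over (3, 5) and confirm 3 < b < a for b = 2 + sqrt(a - 2).
# This is only a spot-check, not a substitute for a proof.
for i in range(1, 2000):
    a = 3 + 2 * i / 2000              # a ranges over (3, 5) exclusive
    b = 2 + math.sqrt(a - 2)
    assert 3 < b < a, (a, b)

# Same kind of check for the second problem: a > 2, b = 1 + sqrt(a - 1).
for i in range(1, 2000):
    a = 2 + 10 * i / 2000             # sample a in (2, 12)
    b = 1 + math.sqrt(a - 1)
    assert b > 2, (a, b)

print("all sampled points satisfy the inequalities")
```

No sampled point comes close to violating $b \lt a$, which is consistent with the worry above ($b = 3.11$, $a = 3.10$) being impossible rather than merely unlikely.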
You are asked to prove that if $3<a<5$ and $b=2+\sqrt{a-2}$ then $3<b<a$.
You proved $3<b$ (and $b<\sqrt3+2$), but you didn't prove $b<a$.
$b<a$ means $\sqrt{a-2}+2<a$, i.e. $\sqrt{a-2}<a-2$.
This suggests letting $x=a-2$ and proving $\sqrt x<x$.
Well, if $x>1$ (which holds since $a>3$), then multiplying both sides of $x>1$ by $x>0$ gives $x^2>x>0$, and taking square roots of these positive quantities gives $x>\sqrt x$.
That means $a-2>\sqrt{a-2}$, i.e. $b=\sqrt{a-2}+2<a.\quad$ QED
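For completeness, the two halves of the argument can be assembled into a single chain of inequalities (a restatement of the steps above, with $x = a-2$ so that $1 < x < 3$):

```latex
% Lower bound: x > 1 implies sqrt(x) > 1, hence b = 2 + sqrt(x) > 3.
% Upper bound: x > 1 implies x^2 > x > 0, hence x > sqrt(x),
%              so b = 2 + sqrt(x) < 2 + x = a.
\[
  3 \;=\; 2 + 1 \;<\; 2 + \sqrt{a-2} \;=\; b \;<\; 2 + (a-2) \;=\; a.
\]
```

The same template handles the second problem: for $a > 2$, set $x = a-1 > 1$, and the identical lemma $\sqrt x > 1$ gives $b = 1 + \sqrt{a-1} > 2$.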