I can't seem to find where my algebra is wrong in my assumption that x + 1/x < 2 can be simplified to (x-1)^2 < 0.
I start with x in the set of real numbers such that x + 1/x < 2. I then subtract x from both sides, leading to
1/x < 2 - x
I then multiply both sides by x, which gives me
1 < 2x - x^2
and then subtract one
0 < -x^2 + 2x - 1
I then add x^2, subtract 2x, and add 1 on both sides to get
x^2 -2x + 1 < 0
and then factor the left-hand side to get
(x-1)^2 < 0
I am simply not seeing where I am breaking the rules. I understand how to solve this inequality correctly; I am more wondering why the steps above are invalid.
It has been a very long time since I have dealt with inequalities so that may be why I am not seeing what is probably an obvious mistake. Any help with this is greatly appreciated.
Edit: x is a real number, and I understand that (x-c)^2 >= 0 for all real x and c, but I am wondering where my algebra went wrong to lead me to this contradiction.
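For anyone who wants to see the hidden assumption numerically: here is a small Python check (purely illustrative, not part of the original derivation) of the step from 1/x < 2 - x to 1 < 2x - x^2. The direction of the inequality survives the multiplication by x only when x is positive.

```python
# Illustration: multiplying an inequality by x keeps its direction only when x > 0.
# The step in question: from 1/x < 2 - x, multiply both sides by x to get 1 < 2x - x^2.
for x in (0.5, -0.5):
    lhs, rhs = 1 / x, 2 - x           # the two sides of 1/x < 2 - x
    before = lhs < rhs                # truth of the inequality before multiplying
    after = lhs * x < rhs * x         # truth of 1 < 2x - x^2 after multiplying
    print(f"x = {x:+}: before={before}, after={after}")
# For x = -0.5 the inequality holds before but not after: multiplying by a
# negative number reversed its direction, which is where the algebra broke.
```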
Despite the existing answers, since the OP asks why the calculation is wrong, I am going to write a few lines.
The key fact is that an inequality is preserved when both sides are multiplied by a positive number, and reversed when they are multiplied by a negative number: $$a>0 \text{ and } b<c \implies ab<ac \\ a<0 \text{ and } b<c \implies ab>ac$$ When you multiplied both sides by $x$, you implicitly assumed $x>0$. You have to split into two cases ($x=0$ is excluded, since $1/x$ is undefined there): $$\begin{cases}1 < 2x-x^2 & x>0 \\ 1 > 2x-x^2 & x<0\end{cases}$$ Rewrite them as squares: $$\begin{cases}(x-1)^2 < 0 & x>0, \text{ rejected, since a square is never negative} \\ (x-1)^2 > 0 & x<0, \text{ always true, since } x<0 \text{ implies } x\neq 1 \end{cases}$$ This allows us to conclude that $x<0$ is the necessary and sufficient condition for the inequality $x+1/x<2$.