Could someone explain to me why the equality holds if and only if $x = 0$?
Wouldn't all values in the set of real numbers work?
As the comment above points out, I think you misunderstood the question. Every real number satisfies the inequality; the question is about when *equality* holds. So ask: for what values of $x$ is $x^2 = 0$? If $x = 0$, then $x^2 = 0$, no problem. Conversely, if $x^2 = 0$, then $x \cdot x = 0$. Is there any other real number which, multiplied by itself, equals zero? (Explain why not.)
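One way to make the "only if" direction precise is to use the fact that the reals have no zero divisors (a product is zero only when one of the factors is zero). A sketch of the argument:

$$
x^2 = 0 \;\implies\; x \cdot x = 0 \;\implies\; x = 0 \text{ or } x = 0 \;\implies\; x = 0.
$$

Equivalently, by the contrapositive: if $x \neq 0$, then $x^2 = x \cdot x$ is a product of two nonzero numbers of the same sign, hence $x^2 > 0$, so $x^2 \neq 0$.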