After a discussion of the subtleties of decimal expansions (such as $0.\bar{9}=1$), some of my students (middle school) decided to throw away the decimal expansions of some numbers! Namely, the numbers whose decimal expansion is not terminating.
I understand that their decision leads to some problems: the decimal expansions of many fractions are not terminating, and no irrational number can be represented by a terminating decimal expansion.
But what are the real disasters that occur if we limit ourselves to numbers with terminating decimal expansions? I am looking for answers that can be comprehended by middle school students.
Thanks.
If you restrict to finite decimal expansions, then every number lives in the ring $\mathbb{Z}[1/10]$, i.e., the fractions whose (reduced) denominator is a product of powers of $2$ and $5$. I'm not sure it's a disaster, but it's definitely not the real numbers: the set is closed under addition, subtraction, and multiplication, but not under division ($1 \div 3$ has no terminating expansion), and irrational numbers like $\sqrt{2}$ disappear entirely.
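The membership test for $\mathbb{Z}[1/10]$ can be made concrete: a reduced fraction has a terminating decimal expansion exactly when its denominator has no prime factors other than $2$ and $5$. A minimal sketch (the function name `terminates` is my own, not from the discussion above):

```python
from fractions import Fraction

def terminates(q: Fraction) -> bool:
    """True iff q has a terminating decimal expansion, i.e. the
    reduced denominator has no prime factors other than 2 and 5."""
    d = q.denominator  # Fraction is stored in lowest terms
    for p in (2, 5):
        while d % p == 0:
            d //= p
    return d == 1

# 1/4 = 0.25 terminates; 1/3 = 0.333... does not
print(terminates(Fraction(1, 4)))  # True
print(terminates(Fraction(1, 3)))  # False
```

This also shows why division is the sticking point: `Fraction(1, 2)` and `Fraction(3, 1)` both terminate, but their quotient `Fraction(1, 6)` does not.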