I was working on the coding problem presented at http://codingbat.com/prob/p184004 and don't fully understand their solution.
Problem: Given an int n, return true if it is within 10 of 100 or 200.
Solution: true if $(|100 - n| \leq 10)$ OR $(|200 - n| \leq 10)$
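In Java that math translates to something like the following (a sketch of my own, not necessarily codingbat's exact reference code):

```java
public class NearHundred {
    public static boolean nearHundred(int n) {
        // |100 - n| <= 10 means n is within 10 of 100; likewise for 200.
        return Math.abs(100 - n) <= 10 || Math.abs(200 - n) <= 10;
    }

    public static void main(String[] args) {
        System.out.println(nearHundred(93));   // within 10 of 100
        System.out.println(nearHundred(145));  // far from both
        System.out.println(nearHundred(205));  // within 10 of 200
    }
}
```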
I don't understand how taking the absolute value of the difference checks whether the number is in a range. Can someone provide an intuitive explanation so I can recognize when to use this fact in the future? I guess it's like saying "check whether n differs from 100 by at most 10", and then repeating the check for 200.
If curious, my solution was this, though it's further from a pure math approach.
public boolean nearHundred(int n) {
    // n must lie between 90 and 210, and additionally be within 10 of
    // either 100 (n <= 110) or 200 (n >= 190).
    if (n <= 210 && n >= 90) {
        if (n >= 190 || n <= 110) {
            return true;
        }
    }
    return false;
}
If you'd written your code slightly differently you'd probably have spotted how the absolute value works! If instead you'd written it as
"if(n>=90 && n<=110) || (n>=190 && n<=210)..."you'd be replicating what the absolute value is doing.If we look at the $|100-n|<10$ part, let's see what happens for various values of $n$:
The absolute value (here) has the effect of throwing away the sign from the subtraction, so it converts a range expression like $90<= n <=110$ into a single inequality: $|100-n|<10$ We can rewrite any similar expression the same way: $a <= n <=b$ becomes $|(a+b)/2 - n| < (b-a)/2$.
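The general rewrite above can be sketched as a pair of helpers (the names `inRange` and `inRangeAbs` are my own); checking them against each other shows the two forms are equivalent:

```java
public class RangeCheck {
    // Direct two-sided check: a <= n <= b.
    public static boolean inRange(int n, int a, int b) {
        return a <= n && n <= b;
    }

    // Same check via the midpoint/half-width form:
    // |(a+b)/2 - n| <= (b-a)/2.
    // Doubles avoid integer truncation when a+b or b-a is odd.
    public static boolean inRangeAbs(int n, int a, int b) {
        double mid = (a + b) / 2.0;
        double halfWidth = (b - a) / 2.0;
        return Math.abs(mid - n) <= halfWidth;
    }

    public static void main(String[] args) {
        // The two forms agree on every n for the range [90, 110].
        for (int n = 0; n <= 300; n++) {
            if (inRange(n, 90, 110) != inRangeAbs(n, 90, 110)) {
                throw new AssertionError("mismatch at n = " + n);
            }
        }
        System.out.println("both forms agree on [90, 110]");
    }
}
```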