Why does taking the absolute value of a difference check whether a number is in a range?


I was working on the coding problem presented at http://codingbat.com/prob/p184004 and don't fully understand their solution.

Problem: Given an int n, return true if it is within 10 of 100 or 200.

Solution: true if $(|100 - n| \leq 10)$ OR $(|200 - n| \leq 10)$
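In Java, that solution translates directly into a one-line return using `Math.abs` (a minimal sketch; the class wrapper and `main` are just for running it, the method body is the whole idea):

```java
public class NearHundred {
  // |100 - n| is the distance from n to 100; "within 10 of 100" means
  // that distance is at most 10. Same reasoning for 200.
  public static boolean nearHundred(int n) {
    return Math.abs(100 - n) <= 10 || Math.abs(200 - n) <= 10;
  }

  public static void main(String[] args) {
    System.out.println(nearHundred(93));  // true
    System.out.println(nearHundred(150)); // false
  }
}
```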

I don't understand how taking the absolute value of the difference can check whether the number is in a range. Can someone provide an intuitive explanation so I can recognize when to use this fact in the future? I guess it's like saying "check whether n differs from 100 by at most 10", and then doing the same for 200.

If curious, my solution was this, though it's further from a pure math approach.

public boolean nearHundred(int n) {
  // The original version was missing a return on the path where the outer
  // condition holds but the inner one doesn't, so it didn't compile.
  if (n <= 210 && n >= 90) {
    if (n >= 190 || n <= 110) {
      return true;
    }
  }
  return false;
}


BEST ANSWER

If you'd written your code slightly differently, you'd probably have spotted how the absolute value works. If instead you'd written it as `if ((n >= 90 && n <= 110) || (n >= 190 && n <= 210)) ...`, you'd be replicating exactly what the absolute value is doing.

If we look at the $|100-n| \leq 10$ part, let's see what happens for various values of $n$:

  • if $n=100$ then we get $|100-100| = 0 \leq 10$
  • if $n=105$ then we get $|100-105| = |-5| = 5 \leq 10$
  • if $n=85$ then we get $|100-85| = |15| = 15 > 10$
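The three cases above can be verified directly in code (a quick sketch; `Math.abs` computes the absolute value):

```java
public class AbsCheck {
  public static void main(String[] args) {
    // Evaluating |100 - n| for the sample values above.
    System.out.println(Math.abs(100 - 100)); // 0  -> within 10 of 100
    System.out.println(Math.abs(100 - 105)); // 5  -> within 10 of 100
    System.out.println(Math.abs(100 - 85));  // 15 -> not within 10 of 100
  }
}
```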

The absolute value (here) has the effect of throwing away the sign of the subtraction, so it converts a range expression like $90 \leq n \leq 110$ into a single inequality: $|100-n| \leq 10$. We can rewrite any similar expression the same way: $a \leq n \leq b$ becomes $\left|\frac{a+b}{2} - n\right| \leq \frac{b-a}{2}$, i.e. "the distance from $n$ to the midpoint of the range is at most half the range's width."
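The general midpoint/half-width rewrite can be sketched with a hypothetical helper `inRange` (my name, not from the original; note that with `int` division this only works cleanly when `a + b` and `b - a` are even, as they are for ranges like 90..110):

```java
public class RangeCheck {
  // Checks a <= n <= b via |midpoint - n| <= halfWidth.
  // Assumes a + b is even so the midpoint is exact in integer division;
  // for odd sums, use doubles or compare 2*|a + b - 2*n| <= 2*(b - a).
  static boolean inRange(int a, int b, int n) {
    int midpoint = (a + b) / 2;   // e.g. (90 + 110) / 2 = 100
    int halfWidth = (b - a) / 2;  // e.g. (110 - 90) / 2 = 10
    return Math.abs(midpoint - n) <= halfWidth;
  }

  public static void main(String[] args) {
    System.out.println(inRange(90, 110, 95));   // true:  |100 - 95| = 5 <= 10
    System.out.println(inRange(190, 210, 150)); // false: |200 - 150| = 50 > 10
  }
}
```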