If I were trapped on a desert island and needed to compute $\log(2)$, the natural logarithm of $2$, one thing I could do is use the equality
$$\log(2) = \int_1^2 \frac{1}{x} \ dx$$
and approximate the definite integral with Simpson's rule. Say I want to partition $[1,2]$ into $10$ subintervals to do this. But maybe I don't like working with fractions very much. Well, then I could instead use $$ \log(2) = \int_{10}^{20} \frac{1}{x} \ dx$$ since it is easier to divide $[10,20]$ into $10$ subintervals.
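(To see why the two integrals are equal, substitute $x = 10u$, so $dx = 10\,du$:
$$\int_{10}^{20} \frac{1}{x} \ dx = \int_1^2 \frac{1}{10u} \cdot 10 \ du = \int_1^2 \frac{1}{u} \ du = \log(2).$$)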
Now, it happens that Simpson's rule with $n=10$ gives the same approximation for both of these definite integrals. You can try this here; just punch in
f(x): 1/x From: 1 To: 2 Amount: 10
and then
f(x): 1/x From: 10 To: 20 Amount: 10
and note that the results are the same. My intuition for why this happens is a bit murky. I don't think it would be difficult to justify this with formulas, but I feel as though there is probably something about areas, scale, and the geometry of the situation which I am missing and which would make this clear. Maybe someone here would like to give a clear explanation? Thanks.
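For anyone without the calculator handy, here is a minimal Simpson's-rule sketch in Python (the function and variable names are my own) that reproduces the coincidence:

```python
def simpson(f, a, b, n):
    """Composite Simpson's rule on [a, b] with n subintervals (n even)."""
    h = (b - a) / n
    # Endpoint values get weight 1; interior values alternate 4, 2, 4, ...
    s = f(a) + f(b)
    for i in range(1, n):
        s += (4 if i % 2 else 2) * f(a + i * h)
    return s * h / 3

f = lambda x: 1 / x
s1 = simpson(f, 1, 2, 10)    # approximates log(2) on [1, 2]
s2 = simpson(f, 10, 20, 10)  # approximates log(2) on [10, 20]
print(s1, s2)  # the two values agree (up to floating-point rounding)
```

Both runs print roughly $0.69315$, matching $\log(2) \approx 0.693147$ to about five decimal places.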
This happens because the change of variables relating the two intervals, $x \mapsto 10x$, is linear, and $1/x$ scales inversely under it. A Riemann-style version of the same phenomenon: an approximation of the first integral could be $(1/1 + 1/1.1 + \dots + 1/2) \times 1/11$, where the corresponding approximation of the second integral would be $(1/10+1/11+\dots+1/20) \times 10/11$. Each function value is divided by $10$ while each width is multiplied by $10$, so the two sums are equal.
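The same cancellation works term by term in Simpson's rule itself. With nodes $x_i = a + ih$ and step $h = (b-a)/n$, the rule reads
$$S = \frac{h}{3}\bigl(f(x_0) + 4f(x_1) + 2f(x_2) + \dots + 4f(x_{n-1}) + f(x_n)\bigr).$$
Scaling the interval by $c$ (here $c = 10$) replaces each node $x_i$ by $cx_i$ and the step $h$ by $ch$. Since $f(x) = 1/x$ satisfies $f(cx_i) = f(x_i)/c$, the factor $c$ from the step cancels the factor $1/c$ from the values, leaving $S$ unchanged. The same invariance holds for any quadrature rule whose weights scale with the interval length, applied to $1/x$.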