When comparing two quantities, say $x_1, x_2$, I frequently use the phrase "they're within X%", but I use it rather loosely: mathematically, the percentage difference depends on whether you take $x_1$ or $x_2$ as the reference.
e.g., suppose $x_1=90$, $x_2=100$; the absolute percentage difference is either $\frac{10}{100}=10\%$ or $\frac{10}{90}\approx 11.1\%$, depending on the reference.
If I were asked to "check whether $x_1$ and $x_2$ are within $10\%$", is there some idiomatic interpretation of this? I would naturally choose the interpretation with the larger denominator (hence the smaller percentage difference).
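For concreteness, here is a minimal Python sketch of the larger-denominator interpretation; the helper name `within_percent` is made up for illustration. Note that Python's standard `math.isclose` uses exactly this convention: its `rel_tol` is measured against the larger of the two magnitudes.

```python
import math

def within_percent(x1, x2, pct):
    """Check whether x1 and x2 differ by at most pct percent,
    measured against the larger magnitude (the conservative
    interpretation: the smaller of the two possible ratios)."""
    ref = max(abs(x1), abs(x2))
    if ref == 0:
        return x1 == x2  # both zero: identical
    return abs(x1 - x2) / ref <= pct / 100

# 90 vs 100: |90-100|/100 = 10%, so within 10% under this convention
print(within_percent(90, 100, 10))            # True
# the other convention divides by 90, giving ~11.1%, which is not within 10%
print(abs(90 - 100) / 90 <= 0.10)             # False
# math.isclose agrees with the larger-denominator convention
print(math.isclose(90, 100, rel_tol=0.10))    # True
```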
It depends upon the direction of change. For example, suppose a stock price goes from $\$90$ to $\$100$. Then the change is $\frac{10}{90}\approx 11\%$.
If the stock price goes from $\$100$ to $\$90$ then the change is $-\frac{10}{100}=-10\%$.
In the first case we would say that the stock price increased by approximately $11\%$ and in the second, we would say the stock price decreased by $10\%$.
To say that the difference between two numbers is a certain percentage doesn't really make sense on its own. We start from a quantity and measure its change relative to that starting value.