Case 1:
Subtracting a fraction $p$ (a number between 0 and 1) of a value $X$ leaves $X(1-p)$.
Adding the same fraction $p$ back gives $X(1-p)(1+p)$.
Case 2:
Now, adding the fraction $p$ to $X$ first gives $X(1+p)$, and subtracting $p$ from this result gives $X(1+p)(1-p)$.
Of course, both of these are equal, and they are lower than $X$ because the product is $X(1-p^2)$, which is less than $X$ for any $p \neq 0$.
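The algebra above is easy to check numerically. A minimal sketch, using the illustrative values $X = 100$ and $p = 0.1$ (my own choice, not from the question):

```python
# Check that both orders give the same result, equal to X * (1 - p**2).
# X = 100 and p = 0.1 are illustrative values only.
X, p = 100.0, 0.10

decrease_then_increase = X * (1 - p) * (1 + p)
increase_then_decrease = X * (1 + p) * (1 - p)

# Both are roughly 99.0 (up to floating-point rounding), i.e. X * (1 - p**2).
print(decrease_then_increase)
print(increase_then_decrease)
print(X * (1 - p**2))
```

With a 10% swing in either order, you end up at 99 rather than back at 100.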
This makes total sense to me mathematically, but how do you explain it to a child in an intuitive way?
In whichever order you go (increase followed by decrease, or vice versa), you are always subtracting a percentage from a larger amount than the one you are adding a percentage to.
If you increase first, then the amount you are decreasing from is greater than the amount you increased from.
If you decrease first, then the amount you are increasing from is less than the amount you decreased from.
So in either case the amount you are decreasing from is greater than the amount you are increasing from; and since it's the same percentage both times, you end up losing more than you gain.
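The "losing more than you gain" point can be made concrete with numbers. A small sketch, using $X = 100$ and a 10% swing (illustrative values of my own, not from the question):

```python
# Illustrative numbers: start at 100 and use a 10% increase/decrease.
X, p = 100.0, 0.10

increased = X * (1 + p)   # increase first: about 110
gain = increased - X      # the increase adds about 10 (10% of 100)
loss = increased * p      # the decrease then removes about 11 (10% of 110)

# The loss exceeds the gain because the same 10% acts on the larger amount.
print(gain, loss)
```

A child can follow the same numbers: "10% of 100 is 10, but 10% of 110 is 11, so you give back more than you got."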