I am not even sure what I mean by "simplification" here. I have been taught it for many years, but I realize I actually have no idea.
For instance, consider:
$$ \tag1 \frac {\sqrt{x} + \sqrt{y}} {\sqrt{x} - \sqrt{y}} $$
I can "rationalize" this expression into something like this:
$$ \tag2 \frac {(\sqrt{x} + \sqrt{y})^2} {x - y} $$
Is it simplified now? Or complicatified?
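For what it is worth, a quick numerical sanity check (a Python sketch; the sample points are arbitrary, chosen with $x > y > 0$ so the denominators are nonzero) confirms that $(1)$ and $(2)$ agree:

```python
import math

def f1(x, y):
    """Form (1): (sqrt(x) + sqrt(y)) / (sqrt(x) - sqrt(y))."""
    return (math.sqrt(x) + math.sqrt(y)) / (math.sqrt(x) - math.sqrt(y))

def f2(x, y):
    """Form (2): (sqrt(x) + sqrt(y))**2 / (x - y)."""
    return (math.sqrt(x) + math.sqrt(y)) ** 2 / (x - y)

# arbitrary sample points with x > y > 0
for x, y in [(4.0, 1.0), (9.0, 2.0), (7.0, 3.0)]:
    assert math.isclose(f1(x, y), f2(x, y))
```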
Let me speak of a mathematical expression as a syntax tree, and of some collection of trivially sound transformations, such as addition of zero, as trivial symmetries of that tree. A moderately involved expression has a lot of independent trivial symmetries that may be combined arbitrarily, possibly opening up other, previously unavailable symmetries (so I do not immediately see a group here). But that is only half the trouble.
Notice that, to pass from $(1)$ to $(2)$, I need to complicate the expression first, when I multiply by:
$$ \frac {\sqrt{x} + \sqrt{y}} {\sqrt{x} + \sqrt{y}} = 1 $$
I also have to apply a formula that is not trivial, in the sense that it is not listed among the field axioms:
$$ (x + y)(x - y) = x^2 - y^2 $$
— And how I am supposed to maintain the list of such "handy" formulae is also not clear. So what we have here is a search problem: the space of expressions has several local minima (measuring an expression by the size of its syntax tree), and, to get from one minimum to another, I need to jump over walls of some height. The hillside to be explored grows exponentially as I deepen the search to clear higher walls, and polynomially as I add more "shortcut" formulae.
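The search picture above can be made concrete with a toy model (a Python sketch; the tuple encoding, the two rules, and the size measure are all invented for illustration — a real CAS has vastly more of each):

```python
def size(e):
    """Number of nodes in the expression tree."""
    if not isinstance(e, tuple):
        return 1
    return 1 + sum(size(c) for c in e[1:])

def rewrites(e):
    """Yield every expression reachable by one rule application."""
    if not isinstance(e, tuple):
        return
    op, *args = e
    if op == '+' and args[1] == 0:   # rule: e + 0 -> e
        yield args[0]
    if op == '*' and args[1] == 1:   # rule: e * 1 -> e
        yield args[0]
    for i, a in enumerate(args):     # apply rules inside subtrees
        for r in rewrites(a):
            yield (op,) + tuple(r if j == i else b
                                for j, b in enumerate(args))

def search_min(e, depth):
    """Bounded breadth-first search for the smallest reachable tree."""
    seen, frontier = {e}, [e]
    for _ in range(depth):
        frontier = [r for f in frontier for r in rewrites(f)
                    if r not in seen]
        seen.update(frontier)
    return min(seen, key=size)

# (x * 1) + 0 simplifies to x within depth 2
assert search_min(('+', ('*', 'x', 1), 0), depth=2) == 'x'
```

Note that only shrinking rules are included here; adding the complicating reverses as well, e.g. `e -> ('+', e, 0)`, is exactly what makes the frontier blow up exponentially with depth, as described above.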
Given all that, how can I be sure the expression $(1)$ does not simplify further, for example to $1$? How can I make any kind of such statement with respect to an arbitrary expression, or some class of expressions?
I can surely show that $(1)$ will not simplify to $1$, or to any other constant.
Let $f(x, y)$ denote expression $(1)$. Suppose that $f(x, y) = c$ for all $x$ and $y$, that is, $f$ is the constant function $c$.
Then we have an equation:
$$ \frac {r + s} {r - s} = c \quad \left(r = \sqrt{x}, s = \sqrt{y} \right) $$
— which can be transformed, by collecting terms, to $r + s = c(r - s)$, that is, $(c - 1)r = (c + 1)s$. Since $c - 1$ and $c + 1$ cannot both be $0$, we may divide by a nonzero one of them, obtaining at least one of the two relations:
$$ r = \frac {c + 1} {c - 1} \, s \quad \text{or} \quad s = \frac {c - 1} {c + 1} \, r $$
Either relation ties $r$ to $s$, so $f(x, y) = c$ holds only for a restricted choice of $x$ and $y$, and our assumption cannot be true.
$\square$
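Consistent with the proof, two concrete sample points already witness that $f$ is not constant (a minimal Python check; the points are chosen so the square roots are exact in floating point):

```python
import math

def f(x, y):
    """Expression (1)."""
    return (math.sqrt(x) + math.sqrt(y)) / (math.sqrt(x) - math.sqrt(y))

assert f(4, 1) == 3.0        # (2 + 1) / (2 - 1)
assert f(9, 1) == 2.0        # (3 + 1) / (3 - 1)
assert f(4, 1) != f(9, 1)    # hence f is not a constant
```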
From this example arises a method: equate the given expression with an expression of some class we suspect it may simplify to; if we can obtain an additional restriction on the variables, we know it does not so simplify. If we cannot obtain a restriction, though, we learn nothing.