When are algebraic expressions equivalent?


This question arose when I was determining the domain of $f \circ f(x)$.
Let $f(x) = \dfrac{1-x}{1+x}$.
Then $f \circ f(x) = x$, but the domain is not $\mathbb{R}$ because $f(x)$ is undefined for $x = -1$.

This made sense to me after some thinking: $f \circ f(x)$ is only equivalent to $x$ if $x \neq -1$, so one really needs to specify the domain.
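A quick numerical sketch in Python (the helper name `f_of_f` is my own, not from the problem) makes the difference concrete: the composition fails exactly where the simplified form $x$ does not.

```python
# Sketch: compare f(f(x)) evaluated step by step with its
# simplified form x; the two differ only at x = -1.

def f(x):
    return (1 - x) / (1 + x)

def f_of_f(x):
    return f(f(x))  # undefined wherever f is undefined, i.e. at x = -1

# Away from x = -1 the composition agrees with the identity function:
assert abs(f_of_f(2.0) - 2.0) < 1e-12

# The simplified expression x would happily return -1 here, but the
# composition fails because f(-1) divides by zero:
try:
    f_of_f(-1.0)
except ZeroDivisionError:
    print("f(f(-1)) is undefined")
```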

So far so good, but then I started questioning the way I usually simplify algebraic expressions and solve equations. If I were to solve the equation
$p_1(x) = q_1(x)$
I might start simplifying both expressions and come up with a simpler equation:
$p_1(x) = q_1(x) \iff p_2(x) = q_2(x) \iff \dots \iff p_n(x) = q_n(x),$
and here any equation $p_k(x) = q_k(x)$ is equivalent to any other $p_t(x) = q_t(x)$, right?

So I tried to come up with cases where I probably would not bother specifying the domain before simplifying an expression, but probably would if the expression appeared in an equation. As an example:

$\dfrac{1}{x} + x - \dfrac{1}{x} = 0$

Usually I have just regarded the left-hand side as equivalent to $x$ when asked to simplify it as an expression. But the equation confuses me. Obviously $\dfrac{1}{x}$ is not defined when $x = 0$, but then again $\dfrac{1}{x}$ and $-\dfrac{1}{x}$ cancel each other out, so does it really matter that they are not defined when $x = 0$?

I tried consulting Wolfram Alpha: if I plug in $x = x + \dfrac{1}{x} - \dfrac{1}{x}$ it evaluates to true, but if I plug in $x + \dfrac{1}{x} - \dfrac{1}{x} = 0$ it doesn't give an answer, although it does if $0$ is replaced by any other real number.
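Other computer algebra systems behave similarly. A sketch with SymPy (assuming it is installed) shows that a CAS treats the input as a formal algebraic expression and cancels the two $\frac1x$ terms without recording any domain restriction:

```python
# Sketch: a CAS treats x + 1/x - 1/x as a formal expression, not a
# function, so the 1/x terms cancel with no domain bookkeeping.
from sympy import symbols

x = symbols('x')
expr = x + 1/x - 1/x   # SymPy cancels the two 1/x terms on construction
print(expr)            # prints: x
```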

So now I feel I have failed to understand a very fundamental concept of mathematics: when can algebraic expressions really be regarded as equivalent, and does it matter whether they are used in equations or not?


BEST ANSWER

You've come across a good point that many students miss: when dealing with compositions of functions, the domains are very important, and they must be carried along to the final answer.

For the problem of $f\circ f$, since $f$ is not defined at $x=-1$, you must avoid both $x=-1$ and any $x$ where $f(x)=-1$. Since there are no solutions to $f(x)=-1$ (if there were, those $x$'s would also need to be removed), the domain need only exclude $x=-1$: it is all real numbers except $x=-1$. After simplification, one finds that $f\circ f(x)=x$, but simplifying hasn't changed the domain, so we would say that $f\circ f(x)=x$ for all real numbers except $x=-1$ (the domain is part of the answer).
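To spell out why $f(x)=-1$ has no solutions (a step only asserted above): assuming $x\neq-1$ so we may cross-multiply,
$$\frac{1-x}{1+x}=-1 \iff 1-x=-(1+x) \iff 1-x=-1-x \iff 1=-1,$$
which is false, so no $x$ satisfies $f(x)=-1$.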

In some situations, one would replace $f\circ f$ by $x$, thereby extending the domain; this is because the singularity at $x=-1$ is removable: there is another function that agrees with $f\circ f$ wherever $f\circ f$ is defined and has a larger domain. In analysis, this idea appears in several guises, for instance in equivalence classes of measurable functions (since $f\circ f$ and $x$ agree except on a set of measure zero) or in analytic continuation in complex analysis (which extends the domain of a function).

In your second example, $x=x+\frac{1}{x}-\frac{1}{x}$, we treat these as algebraic expressions (not as functions). The rules are different when the objects are just algebraic expressions, because we're not plugging in values for $x$. In general, we define two fractions $\frac{a}{b}$ and $\frac{c}{d}$ to be equal when $ad=bc$ (by cross multiplication). With algebraic expressions, we don't substitute for $x$ (as we would with a formula for a function), so we don't worry about the denominator vanishing.

In conclusion:

  • Algebraically: $x=x+\frac{1}{x}-\frac{1}{x}$.
  • In functions: $g(x)=x$ and $h(x)=x+\frac{1}{x}-\frac{1}{x}$, but $g\neq h$ because the domains are different.
  • In functions: let $f(x)=\frac{1-x}{1+x}$; then $f\circ f(x)=x$ when $x\neq-1$.
  • Algebraically: $f\circ f(x)=x$.
  • Analytically: $f\circ f$ can be extended to a larger domain by $x$.
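The algebraic point of view in these bullets can be checked mechanically. A sketch with SymPy (my own illustration, not part of the answer), where `cancel` performs exactly this kind of rational simplification:

```python
# Sketch: form f(f(x)) symbolically and cancel it as a rational
# expression; the result is x, with no domain information attached.
from sympy import symbols, cancel

x = symbols('x')
f = (1 - x) / (1 + x)
ff = f.subs(x, f)    # formal composition f(f(x))
print(cancel(ff))    # prints: x
```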


The key is that $\frac{1-x}{1+x}$ doesn't make sense when $x=-1$.

So $f(f(-1))=f(\mathrm{undefined})=\mathrm{undefined}$.

When simplifying, $$f(f(x))=\frac{1-\frac{1-x}{1+x}}{1+\frac{1-x}{1+x}}=\frac{1-\frac{1-x}{1+x}}{1+\frac{1-x}{1+x}}\cdot\frac{1+x}{1+x}=\frac{1+x-1+x}{1+x+1-x}=\frac{2x}{2}=x$$

But multiplying by $\frac{1+x}{1+x}$ is only valid when $x\neq-1$. So we can only simplify it this way if $x\neq-1$.


It is important to be aware that Wolfram Alpha is a machine, not a source of ultimate truth. You can usually trust its calculations, but for concepts and understanding of what you're doing, you're better off trusting your own understanding. In the particular case of $x+\frac1x-\frac1x$, Wolfram Alpha makes the guess that you're probably not interested in what happens at the isolated points where the original expression is not defined, so it presents a simplified expression that is equivalent to the original one wherever the original expression is defined. That doesn't represent any deeper mathematical truth; it is just what the programmers felt would probably be most useful for most users.

It is a fact that the left-hand side of $x+\frac1x-\frac1x=x$ cannot be evaluated when $x=0$, and so the equation is not true when $x=0$. The question is then, should you care? Only you can decide whether it's worth caring about, based on what use you're going to make of the equation and the result. In many cases the "removable singularity" at $x=0$ is completely irrelevant to what you're actually doing, and it is then reasonable to forget about it as quickly as possible. In other cases, it may be a sign that things are going to be hinky around $x=0$ -- perhaps some effects that you thought you could neglect when you came up with $x+\frac1x-\frac1x$ are actually not negligible for small $x$, and you need to go back to the drawing board and do a more careful analysis. It all depends on where the expression $x+\frac1x-\frac1x$ comes from.

If you're in a classroom setting and have been given the expression $x+\frac1x-\frac1x$ without any context and asked to simplify it, it is usually the case that you will be expected to pretend to care about the trouble at $x=0$, in the absence of any particular reason not to care. But note that this is a highly artificial situation -- when you actually use what you've learned, there will always be some context that you should keep in mind to determine whether the non-definedness at $x=0$ is a relevant fact for you or not.