Avoiding the Integration Constant

I sometimes find writing and keeping track of the constants of integration a somewhat messy job. Yes, sometimes it's necessary, but in many situations I come across at my level of mathematics, it is a waste of time and space.

An exaggerated example: $$f''(x) = g''(x)\\ \stackrel{\int\text{ing}}{\implies} f'(x) + C_1 = g'(x) + C_2\\ \stackrel{\int\text{ing}}{\implies} f(x) + C_3 + C_1x + C_4= g(x) + C_5 + C_2x + C_6\\ \implies f(x) = g(x) + C_ax + C_b $$ where $C_a = C_2 - C_1$ and $C_b = C_6 + C_5 - C_4 - C_3$
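As a quick sanity check on collapsing the constants, here is a small pure-Python sketch (the `diff_poly` helper is made up for this illustration, with a polynomial stored as a list of coefficients): differentiating the leftover degree-$1$ polynomial $C_ax + C_b$ twice gives zero, so any such $f$ and $g$ really do satisfy $f''(x) = g''(x)$.

```python
def diff_poly(coeffs):
    """Differentiate a polynomial given as a coefficient list,
    where coeffs[i] is the coefficient of x**i."""
    return [i * c for i, c in enumerate(coeffs)][1:] or [0]

# f - g = C_a*x + C_b for arbitrary constants C_a, C_b.
Ca, Cb = 3, -7
leftover = [Cb, Ca]                      # C_a*x + C_b
print(diff_poly(leftover))               # first derivative: [3] (i.e. C_a)
print(diff_poly(diff_poly(leftover)))    # second derivative: [0]
```

Whatever values $C_a$ and $C_b$ take, the second derivative of the difference is identically zero, which is exactly why only two combined constants survive.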

This just seems like a ridiculous amount of tracking in certain cases. In most problems I face, I combine as many constants as possible in one shot, without mercy, and never explain the sources of the constants (because it doesn't really matter).
$\times$ Some people I know avoid distinguishing the constants by just marking them all as $C$ and only distinguish between coefficients and constants as I've illustrated in the last step of my example.
$\times$ Some other students leave out the constants altogether and write a " + C " only in the last step, but they don't realize that they are often neglecting terms where the constants turn into coefficients.

$\Large\star$ It isn't a mystery why the constant tickles people's lazy bones. The lethargic attitude almost everyone shows toward it comes down to the fact that " + C " is just annoying. Sure, it may seem necessary, but what about in simplifications?

$$ \begin{align} \int f'(x)\ \mathrm dx &= f(x) + C \\ &= f_1(x) + C \\ &= f_2 (x) + C \\ &= f_3 (x) + C \\ &= \dots \\ &= f_n(x) + C\\ \end{align}$$

where $f_{k\in \mathbb N}(x)$ is a simplified form of $f(x)$.

Its usage is monotonous and seems absolutely unnecessary during simplification.

My question is hence this:

How can one safely hide the constant of integration during simplification of equations and what are the restrictions in such hiding?

Thank you in advance.

Edit: I am seeking a notation that would validly allow the constant to be neglected during simplification of differential equations. Imagine 100 steps of simplification: the " + C " would be annoying. Apologies if this point was not clear through my rant.

There are 8 answers below.

On BEST ANSWER

Use modular arithmetic:

$$\int f'(x) \, \mathrm dx \equiv f(x) \mod \Bbb R$$

But be warned, it has limitations during manipulation of the equation.

On

You could reduce the number of your constants by half by using only one constant each time you integrate. For example, when solving $$r'(x)=s'(x)$$ write the answer as $$r(x)=s(x)+C_1$$ rather than (as you were doing) $$r(x)+C_1=s(x)+C_2$$ Using only one constant when integrating both sides of an equation is perfectly rigorous and the proof is obvious.

ADDED:

1) Another possibility is to use constants like $c,d,e,\ldots$ (or $k,l,m,\ldots$ or $p,q,r,\ldots$ or $\alpha,\beta,\gamma,\ldots$) rather than $C_1,C_2,C_3,\ldots$. This does not reduce the number of constants but it does reduce the tiresome subscripts and make it look like there are fewer constants.

2) When you wrote "I often just combine as many constants in one shot without mercy..." I thought you meant things like changing $\pm C_1e^{x+C_2}$ to $C_1e^x$, which is less rigorous. Using only one constant per integration is a no-brainer.

On

I don't think that there is a standard way of doing it, but of course you can invent your own. In this case, use an equivalence relation, or rather a sequence of equivalence relations:

For functions $f, g$, let $f \equiv_d g$ if there exists a polynomial $p$ of degree at most $d$ such that $f = g + p$. Here, we use the convention $\deg 0 = -1$.

We have $f \equiv_{-1} g$ iff $f=g$ and integrating raises the degree $d$ by one.

Your example becomes $$\int f'(x) \operatorname{d}x \equiv_0 f(x) = f_1(x) = \dots = f_n(x)$$ and in a second step $$\int f'(x) \operatorname{d}x = f_n(x) + C.$$

Edit: In this answer, I defined the symbol $\equiv$ (if you don't like it, you can replace it by something else, say $\sim$). Another way to phrase the definition is the following: For functions $f, g$ and $d \in \{-1, 0, 1, 2, \dots \}$, let $f \equiv_d g$ if the difference $f - g$ is a polynomial of degree at most $d$. For every $d$, $\equiv_d$ is an equivalence relation – the proof is easy.

Slightly abusing notation (I will identify a function $f$ with its term $f(x)$ for brevity), we have for example

  • $f(x) \equiv_{-1} g(x)$ if and only if $f = g$, because the only degree-$(-1)$ polynomial is the zero polynomial, thus $f - g = 0$, i.e. $f = g$.
  • $\sin x \equiv_0 \sin x + 15$, because $15$ is a degree-$0$ polynomial
  • $f(x) \equiv_d f(x) + c_d x^d + c_{d-1} x^{d-1} + \dots + c_0$, because $c_d x^d + c_{d-1} x^{d-1} + \dots + c_0$ is a degree-$d$ polynomial
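
For the special case where $f$ and $g$ are themselves polynomials, the relation $\equiv_d$ is easy to mechanize. A minimal pure-Python sketch, with invented helper names and polynomials stored as coefficient lists (this only illustrates the definition above; it is not standard library code):

```python
def degree(coeffs):
    """Degree of a polynomial given as a coefficient list,
    using the convention deg 0 = -1."""
    for i in range(len(coeffs) - 1, -1, -1):
        if coeffs[i] != 0:
            return i
    return -1

def equiv_d(f, g, d):
    """f equiv_d g  iff  f - g is a polynomial of degree at most d."""
    n = max(len(f), len(g))
    f = f + [0] * (n - len(f))
    g = g + [0] * (n - len(g))
    return degree([a - b for a, b in zip(f, g)]) <= d

print(equiv_d([0, 0, 1], [15, 0, 1], 0))   # x^2 vs x^2 + 15: True
print(equiv_d([0, 0, 1], [0, 1, 1], 0))    # x^2 vs x^2 + x: False
print(equiv_d([1, 2], [1, 2], -1))         # equal polynomials: True
```

Note that `degree` returning $-1$ for the zero polynomial is exactly what makes $\equiv_{-1}$ coincide with equality.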

Now, whenever $f''(x) = g''(x)$, we know that $f'(x) + C = g'(x) + D$ with constants $C, D \in \mathbb{R}$. Thus, $f'(x) - g'(x) = D - C$ which is constant, i.e. a degree-$0$ polynomial. Using the notation defined above, we can write $f \equiv_0 g$, eliminating the explicit constant.

This also works for higher degrees. Say we know $f'(x) \equiv_d g'(x)$. Then there exists a degree-$d$ polynomial $p$ or more explicitly real numbers $c_0, \dots, c_d$, such that $f'(x) - g'(x) = p(x) = c_d x^d + \dots + c_0$. Integration: $$f(x) - g(x) = \frac{c_d}{d+1} x^{d+1} + \dots + c_0 x + C$$ which is a degree-$(d+1)$ polynomial. Thus $f \equiv_{d+1} g$.

The nice thing about an equivalence relation is that it behaves in many ways like equality. For example, you can use it in an equality chain like $$f_0(x) = f_1(x) \equiv_d f_2(x) = f_3(x)$$ if you keep in mind that you can only deduce $f_0(x) \equiv_d f_3(x)$ from that and not $f_0(x) = f_3(x)$.

Also note that this notation is not standard. If you want to use it and others to understand it, you will have to give the definition.

On

Here is one way around your problem:

$\int f'(x) \mathrm dx = f(x) + C$ for some $C$. So

$$\begin{align} \int f'(x)\ \mathrm dx - C &= f(x) \\ &= f_1(x) \\ &= f_2 (x) \\ &= f_3 (x) \\ &= \dots \\ &= f_n(x) \\ \end{align}$$

Therefore $\int f'(x) \mathrm dx = f_n(x) + C$ for some $C$.

I don't think this is exactly what you were looking for, but then I suspect that what you are looking for doesn't exist.

On

For the easy situation of one integration, you can write

$$ \int f(x)\,\mathrm dx \ni F(x) = F_1(x) = \dots = F_n(x)\\ \text{hence } \int f(x)\,\mathrm dx = F_n(x) + C $$

This works because you can say that the primitive (antiderivative) operator really gives you the set of all primitive functions, all of which differ by a constant. Then, to be consistent, you can say that the final equality is a set equality in which $C$ ranges over all constants.
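To make the set reading concrete: two expressions name the same set of primitives exactly when they differ by a constant. Here is a rough numerical sketch of that test (the helper `same_primitive_set` is invented here, and checking a few sample points is only a heuristic, not a proof):

```python
import math

def same_primitive_set(F, G, samples=(0.0, 0.5, 1.0, 2.0)):
    """Heuristic check that F and G differ by a constant,
    i.e. belong to the same set of primitives of some f."""
    diffs = [F(t) - G(t) for t in samples]
    return all(abs(d - diffs[0]) < 1e-9 for d in diffs)

# sin(x)**2 and -cos(2x)/2 are both primitives of sin(2x);
# they differ by the constant 1/2.
F = lambda x: math.sin(x) ** 2
G = lambda x: -0.5 * math.cos(2 * x)
print(same_primitive_set(F, G))  # True
```

Here $\sin^2 x = \tfrac12 - \tfrac12\cos 2x$, so the difference is the constant $\tfrac12$ and both functions denote the same element of the quotient set.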

On

Because a sum of arbitrary constants is just another arbitrary constant, we can substitute $\sum\limits_{i=0}^\infty C_i = C_1$.

Similarly, $x\left(\sum\limits_{i=0}^\infty C_i\right) = C_2$.

We then have $\int\int\int\int\int\int\int dx$ producing seven constants (the coefficients of an arbitrary polynomial of degree $6$), but that is unavoidable. We can forcefully combine constants with the same factor, because a constant plus a constant is just a constant.

On

In certain circumstances, Landau notation (aka Big-O notation) can be used for this. So, for example, your first example would look something like: $$ f''(x) = g''(x)\\ \stackrel{\int\text{ing}}{\implies} f'(x) + O(1) = g'(x) + O(1)\\ \stackrel{\int\text{ing}}{\implies} f(x) + O(x) = g(x) + O(x)\\ \implies f(x) = g(x) + O(x) $$

and your second example would look like (maybe not such a great improvement):

$$ \begin{align} \int f'(x)\ \mathrm dx &= f(x) + O(1) \\ &= f_1(x) + O(1) \\ &= f_2 (x) + O(1) \\ &= f_3 (x) + O(1) \\ &= \dots \\ &= f_n(x) + O(1)\\ \end{align} $$

The caveat is that this should probably only be used when you don't care too much about what is elided by the Big O (e.g.: $x + \frac{1}{\log(x)} = O(x)$). And, yes, there can be some slipperiness with the use of the equals sign with Big-O notation in these contexts (see the Wikipedia section on this), and if you're not careful you might end up tripping yourself up. But if your uses are mainly like your first example, some sparing and/or careful use of Big O's might help.

On

My solution to this problem (nuisance, rather), which I actually make extensive use of, is very similar to @EikeShulte's answer.

For an ordinary differential equation, suppose you have the following:

$$\int f'(x)dx=f(x)+C$$

This can be written in 'Don Juan' notation as

$$\int f'(x) dx = {}^0f(x)$$

which translates as "$f(x)$ has a polynomial of degree $0$ after it with unknown coefficients". If you were to integrate that 30 more times and obtain a function $g(x)$ with, as you may have guessed, a polynomial of degree $30$ after it with unknown coefficients, you could write it very simply in 'Don Juan' notation as

$${}^{30}g(x)$$

And for what other purposes does anybody really use that pre-superscript? I don't know of any, so I think this notation solves the problem pretty nicely.

But what if you know what $f(x)$ or $g(x)$ actually is, apart from the polynomial of unknown coefficients? No problem! Suppose $g(x)$, the same one mentioned above, has $\sin(3x) + 3x^2-e^x$ as the known part before the polynomial. Then it can be written in 'Don Juan' notation as

$${}^{30}(\sin(3x) + 3x^2-e^x)$$
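The bookkeeping this notation encodes is just a pair: a known part and the degree of the trailing unknown polynomial, with each integration bumping that degree by one. A toy pure-Python sketch of the idea (the class name and string handling are invented for illustration; the known part is treated as an opaque label):

```python
class DonJuan:
    """A known part plus an unknown polynomial of degree d.

    d = -1 means there is no unknown part at all (exact equality).
    """
    def __init__(self, known, d):
        self.known, self.d = known, d

    def integrate(self):
        # Integrating an unknown degree-d polynomial (and picking up a
        # new constant) yields an unknown polynomial of degree d + 1.
        return DonJuan(f"int({self.known}) dx", self.d + 1)

    def __repr__(self):
        return f"^{self.d}({self.known})"

expr = DonJuan("f(x)", 0)       # the ^0 f(x) from above
for _ in range(30):
    expr = expr.integrate()
print(expr.d)  # 30
```

After 30 integrations the tracked degree is 30, matching the ${}^{30}g(x)$ shorthand without ever writing out the 31 unknown coefficients.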