My question is somewhat related to this question. What I would like to know is why we ignore the fact that the integral of the derivative of $f(x)$ is equal to $f(x)$ only up to a constant:
$$\int \left(\frac{d}{dx}\left[f(x)\right]\right) dx = f(x) + c$$
This has been very nicely answered in this question.
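To see the lost constant concretely, here is a minimal SymPy sketch (the choice $f(x) = \sin(x) + 3$ is arbitrary; any $f$ with a constant term shows the same thing):

```python
import sympy as sp

x = sp.symbols('x')
f = sp.sin(x) + 3  # the "+ 3" plays the role of the constant c

# integrate() returns one particular antiderivative and omits "+ c" entirely:
print(sp.integrate(sp.diff(f, x), x))  # prints sin(x) -- the 3 is lost
```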
The derivation goes as follows in Paul's notes. Step 1. Start with the product rule.
$$(uv)' = uv' + vu'$$
Step 2. Integrate both sides.
$$\int (uv)'\, dx = \int uv'\, dx + \int vu'\, dx = \int u\, dv + \int v\, du$$
Step 3. Due to the Fundamental Theorem of Calculus, Part I, the left-hand side should simplify.
$$uv = \int u\, dv + \int v\, du$$
Step 4. Subtract $\int v\, du$ from both sides.
$$\int u\, dv = uv - \int v\, du\ ?$$
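For example (the standard textbook application), applying the formula to $\int xe^x\, dx$ with $u = x$ and $dv = e^x\, dx$ gives

$$\int xe^x\, dx = xe^x - \int e^x\, dx = xe^x - e^x + c,$$

where the constant only appears once the last indefinite integral is evaluated.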
Now, what makes me confused is Step 3. The left-hand side should simplify, but it should also produce a constant. We could argue that this constant will cancel with the constants produced on the right-hand side, simply because of the equivalence established in Step 1, but my issue is that it feels like we are ignoring this constant and not keeping track of it. This seems unimportant on paper or on a blackboard, but if we use a computer algebra system such as SymPy, it will not recognise the equation in Step 4 as true.
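For instance, take $u = 1/x$ and $v = x$ (just an illustrative choice where the two sides happen to differ by a constant); SymPy's `integrate` returns one particular antiderivative for each side, and the results do not match:

```python
import sympy as sp

x = sp.symbols('x', positive=True)
u, v = 1 / x, x

lhs = sp.integrate(u * sp.diff(v, x), x)          # the side  ∫ u dv
rhs = u * v - sp.integrate(v * sp.diff(u, x), x)  # the side  uv - ∫ v du

print(lhs)                     # log(x)
print(rhs)                     # log(x) + 1
print(sp.simplify(lhs - rhs))  # -1: the two sides differ by a constant
```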
Indeed, it is so that $\int (uv)'\, dx = uv + c$, where $c$ is an arbitrary constant.
So by Step 3 we would have: $uv = \left(\int u'v\, dx\right) + \left(\int uv'\, dx\right) - c$.
But wait! We intend that $\int u'v\, dx$ and $\int uv'\, dx$ will both eventually be evaluated as "something plus an arbitrary constant," and the sum of arbitrary constants is an arbitrary constant. So for convenience, we can absorb any arbitrary constants into them until we are ready. That is okay as long as we ensure an arbitrary constant 'pops out' once we have evaluated all the indefinite integrals.
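In SymPy terms, this means an identity between indefinite integrals should be verified only up to a constant, i.e. by differentiating the difference of the two sides. A minimal sketch, reusing the $u = 1/x$, $v = x$ example from the question:

```python
import sympy as sp

x = sp.symbols('x', positive=True)
u, v = 1 / x, x

lhs = sp.integrate(u * sp.diff(v, x), x)          # log(x)
rhs = u * v - sp.integrate(v * sp.diff(u, x), x)  # log(x) + 1

# The two sides agree up to a constant, which is exactly what
# an identity between indefinite integrals asserts:
print(sp.simplify(sp.diff(lhs - rhs, x)))  # prints 0
```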