Let's say I have an arbitrary elementary function $f : \mathbb{R} \to \mathbb{R}$. Is there an upper bound on the number of functions that satisfy the following? $$\frac{d}{dx}\left(\int f(x) \ dx\right) = f(x) \tag{1}$$
For example if I have $\int e^{x}\ dx$, there seems to be only one function that satisfies $(1)$, that being $e^{x}$. However if I have $\int \frac{1}{\sqrt{4-x^2}} \ dx$, there are at least two functions that satisfy this, those being $\sin^{-1}(\frac{x}{2})$ and $-\cos^{-1}(\frac{x}{2})$.
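(For reference, both claims can be checked by direct differentiation using the chain rule:)

$$\frac{d}{dx}\sin^{-1}\!\left(\tfrac{x}{2}\right) = \frac{1}{\sqrt{1-(x/2)^2}}\cdot\frac{1}{2} = \frac{1}{\sqrt{4-x^2}}, \qquad \frac{d}{dx}\left(-\cos^{-1}\!\left(\tfrac{x}{2}\right)\right) = -\left(-\frac{1}{\sqrt{1-(x/2)^2}}\cdot\frac{1}{2}\right) = \frac{1}{\sqrt{4-x^2}}.$$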
So let $S$ be the set of all functions satisfying $(1)$ above:
$$S = \left\{f : \frac{d}{dx}\left(\int f(x) \ dx\right) = f(x)\right\}$$
What is $|S|$?
- Is $|S|$ finite or infinite?
- Is there any way to determine whether $|S| = 1$ (i.e. there is only one possible anti-derivative for a function)?
- Does $|S|$ vary for different classes of elementary functions (e.g. polynomial, rational, trigonometric, exponential, or more generally, transcendental or algebraic)?
I ask this particular question because for some functions, such as $\int x^2 \ dx$, it seems there can only ever be one function satisfying $(1)$. I've had the same feeling about other functions, yet applying a different technique of integration to them often yields more than one valid function (anti-derivative) satisfying $(1)$, and I have no way of telling the difference.
Is there a rigorous answer/definition in line with what I'm asking? If it goes into topics in analysis, I'm all ears; currently it all seems a bit hand-wavy to me, as if someone were saying: 'Sometimes you find more than one anti-derivative, sometimes you don't, eh what you gonna do about it?'
It's unclear what question you're asking - I believe you are over-using "$f$." If you are asking how many antiderivatives a given function has, consider the following: if $F' = f$, then for every constant $c$ we also have $(F + c)' = F' = f$.
This shows that any function which has an antiderivative has infinitely many (in fact, continuum many). In particular, $e^x$ is not the only antiderivative of $e^x$ - there are also functions like $e^x+17$.
In the other direction, suppose $G'=H'=f$. Then what can you say about $(G-H)'$? What does that tell you about the function $G-H$ (under reasonable niceness assumptions on $G$ and $H$ - in particular, you want to assume here that the domain of $G$ and $H$ is all of $\mathbb{R}$, or at least is connected)?
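(Spelling out the hint, this is the standard mean value theorem argument:)

$$(G-H)' = G' - H' = f - f = 0,$$

and a function with zero derivative on a connected domain is constant, so $G$ and $H$ can differ only by a constant.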
Note that this isn't always obvious - for example, $\sin^{-1}({x\over 2})$ and $-\cos^{-1}({x\over 2})$ appear to not differ by a constant, but in fact they do.
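One way to see this is via the identity $\sin^{-1}(t)+\cos^{-1}(t)=\frac{\pi}{2}$ for $t \in [-1,1]$:

$$\sin^{-1}\!\left(\tfrac{x}{2}\right) - \left(-\cos^{-1}\!\left(\tfrac{x}{2}\right)\right) = \sin^{-1}\!\left(\tfrac{x}{2}\right) + \cos^{-1}\!\left(\tfrac{x}{2}\right) = \frac{\pi}{2},$$

so the two antiderivatives differ by the constant $\frac{\pi}{2}$.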