Suppose that there exist functions $f$ and $g$ defined on the real numbers and differentiable everywhere. If their derivatives $f'$ and $g'$ are linearly independent on some interval of positive length, are $f$ and $g$ linearly independent on that interval? Conversely, if $f$ and $g$ are linearly independent, are $f'$ and $g'$ linearly independent?
I am trying to determine whether it might be more efficient to prove this implication once, rather than proving that $f$ and $g$ have linearly independent derivatives for every pair $f$ and $g$ I can think of.
Edit: I did heavily alter this question; however, the questions are $100\%$ equivalent. It is just that the term "linearly independent" is easier to read and much clearer than "there do not exist real numbers $c$ and $d$...".
No. For instance, $f=0$ and $g(x)=x$ are linearly dependent (because $1\cdot f+0\cdot g=0$), yet since $f'=0$ and $g'=1$, any relation $cf'(x)+dg'(x)=0$ forces $d=0$.
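The counterexample above can be sanity-checked numerically; this is a minimal sketch (plain Python, sample points chosen arbitrarily), not a proof:

```python
# Numeric check of the counterexample f = 0, g(x) = x.
def f(x):
    return 0.0

def g(x):
    return x

def fprime(x):
    return 0.0  # derivative of the zero function

def gprime(x):
    return 1.0  # derivative of g(x) = x

samples = [-2.0, -0.5, 0.0, 1.0, 3.0]

# f and g are linearly dependent: the nontrivial combination
# 1*f + 0*g vanishes at every sample point.
assert all(1.0 * f(x) + 0.0 * g(x) == 0.0 for x in samples)

# For the derivatives, c*f'(x) + d*g'(x) equals d at every x,
# so any combination that vanishes somewhere must have d = 0.
c, d = 5.0, 0.3  # arbitrary test coefficients
assert all(c * fprime(x) + d * gprime(x) == d for x in samples)
```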
On the other hand, if you meant "there do not exist $c,d$ that are not both zero" rather than "there do not exist $c,d$ that are both non-zero", then the implication is true (on the interval of interest): if $cf'+dg'=0$ only for $c=d=0$, then no linear combination of the functions with non-trivial coefficients can be constant, and in particular none can be identically zero. Of course, said technique wouldn't manage to prove that, say, $\arctan$ and $\operatorname{arccot}$ are linearly independent, since their derivatives are linearly dependent.
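The $\arctan$/$\operatorname{arccot}$ caveat can also be illustrated numerically. A minimal sketch, assuming the principal branch $\operatorname{arccot}(x)=\pi/2-\arctan(x)$ and a few arbitrary sample points:

```python
import math

def arctan(x):
    return math.atan(x)

def arccot(x):
    return math.pi / 2 - math.atan(x)  # principal branch

def d_arctan(x):
    return 1.0 / (1.0 + x * x)

def d_arccot(x):
    return -1.0 / (1.0 + x * x)

samples = [-3.0, -1.0, 0.0, 0.5, 2.0]

# 1*arctan' + 1*arccot' = 0 everywhere: the derivatives are linearly
# dependent, so the derivative criterion gives no information here.
assert all(abs(d_arctan(x) + d_arccot(x)) < 1e-12 for x in samples)

# Yet the functions themselves satisfy only the affine relation
# arctan(x) + arccot(x) = pi/2, which involves a nonzero constant
# and hence is not a linear dependence.
assert all(abs(arctan(x) + arccot(x) - math.pi / 2) < 1e-12
           for x in samples)
```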