Let $R$ be the ring generated by $\{\sin, \cos, \sec\}$, closed under scalar multiplication by $\mathbb{R}$ and equipped with a derivative operator $D$.
The following hold:
- $D(\sin) = \cos$
- $D(\cos) = -\sin$
- $D(\sec) = \sin \sec^2$
- $\sin^2 + \cos^2 = 1$
- $\cos \sec = 1$
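As a sanity check, the listed derivative rules and identities can be verified symbolically, e.g. with the sympy library:

```python
import sympy as sp

z = sp.symbols('z')

# D(sin) = cos and D(cos) = -sin
assert sp.diff(sp.sin(z), z) == sp.cos(z)
assert sp.diff(sp.cos(z), z) == -sp.sin(z)

# D(sec) = sin * sec^2  (sympy returns tan*sec, which is the same thing)
assert sp.simplify(sp.diff(sp.sec(z), z) - sp.sin(z) * sp.sec(z)**2) == 0

# sin^2 + cos^2 = 1 and cos * sec = 1
assert sp.simplify(sp.sin(z)**2 + sp.cos(z)**2 - 1) == 0
assert sp.simplify(sp.cos(z) * sp.sec(z) - 1) == 0
```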
Consider the differential equation $Dx = x$.
Let $S$ be the ring of analytic real functions defined almost everywhere. The almost-everywhere condition accounts for the fact that $\sec$ is not a total function on the reals.
Within $S$, the solution set of $Dx=x$ is $\{ r\exp : r \in \mathbb{R} \}$.
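This solution set can be recovered symbolically; a quick check with sympy's ODE solver (assuming sympy is available):

```python
import sympy as sp

z = sp.symbols('z')
f = sp.Function('f')

# solve Dx = x, i.e. f'(z) = f(z)
sol = sp.dsolve(sp.Eq(f(z).diff(z), f(z)), f(z))

# the general solution is C1 * exp(z): a scalar multiple of exp
C1 = sp.Symbol('C1')
assert sp.simplify(sol.rhs - C1 * sp.exp(z)) == 0
```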
$0$ is in $R$ and is a solution to $Dx=x$.
I claim that it is the only solution in $R$.
We can prove this by noting that every unbounded function in $R$ tends to positive or negative infinity as its argument approaches $\frac{\pi}{2}$. Whenever $r \neq 0$, $r \exp$ is unbounded, yet it stays finite as its argument approaches $\frac{\pi}{2}$; therefore $r\exp$ lies outside of $R$.
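The contrast in behavior at $\frac{\pi}{2}$ can be checked numerically-symbolically; a small sketch with sympy:

```python
import sympy as sp

z = sp.symbols('z')

# sec blows up approaching pi/2 from the left ...
assert sp.limit(sp.sec(z), z, sp.pi / 2, dir='-') == sp.oo

# ... while exp stays finite at pi/2 (so does any multiple r*exp) ...
assert sp.limit(sp.exp(z), z, sp.pi / 2) == sp.exp(sp.pi / 2)

# ... even though exp is unbounded on the real line
assert sp.limit(sp.exp(z), z, sp.oo) == sp.oo
```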
Alternatively, we could prove this by noting that every function in $R$ is periodic, and the only periodic multiple of $\exp$ is $0$, since $\exp$ is not periodic as a real function.
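The periodicity argument can also be spot-checked: the three generators of $R$ are $2\pi$-periodic, while $\exp(z+T)=\exp(z)$ would force $\exp(T)=1$, i.e. $T=0$ over the reals. A sympy sketch:

```python
import sympy as sp

z = sp.symbols('z')

# each generator of R is 2*pi-periodic, hence so is every element of R
for f in (sp.sin(z), sp.cos(z), sp.sec(z)):
    assert sp.simplify(f.subs(z, z + 2 * sp.pi) - f) == 0

# exp has no nonzero real period: exp(T) = 1 forces T = 0
T = sp.symbols('T', real=True)
assert sp.solve(sp.Eq(sp.exp(T), 1), T) == [0]
```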
However, solving this problem in this way required pulling in a lot of information about how real functions behave, rather than treating the functions abstractly.
Is there a way to prove this algebraically, by appealing to the ring structure of $R$, the linearity of $D$, and the facts that we know about $D$?
The usual way of solving $f'=f$ with an integrating factor from analysis still works algebraically. Put slightly differently, whenever $Dx=x$, $$D\left(\frac{x}{\exp}\right)=\frac{D(x)\cdot\exp - x\,D(\exp)}{\exp^2}=0,$$ so $\frac{x}{\exp}$ is a constant. Here we only used three facts: the quotient rule for $D$, the identity $D(\exp)=\exp$, and the fact that the only elements with derivative $0$ are the constants.
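The computation above is purely formal; a sympy sketch confirms that the derivative of the quotient collapses to $0$ once the relation $Dx = x$ is imposed:

```python
import sympy as sp

z = sp.symbols('z')
f = sp.Function('f')

# D(f/exp) computed by the quotient rule
d = sp.diff(f(z) / sp.exp(z), z)   # = (f'(z) - f(z)) * exp(-z)

# impose Dx = x, i.e. substitute f' = f: the derivative vanishes
assert sp.simplify(d.subs(f(z).diff(z), f(z))) == 0
```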
While it's not needed in this problem, it is often easier and more elegant to consider the differential field $\mathbb C(z, e^z, e^{iz})$, which can be regarded as a subfield of the meromorphic functions over $\mathbb C$ (and in your case, we may consider only $\Bbb C(e^{iz}, e^z)$ without $z$). Note that $\sin, \cos, \sec$ can all be defined with $e^{iz}$ alone. Most ways to solve ODEs can be carried out symbolically in $\mathbb C(z,e^z, e^{iz})$.
Edit. Upon reading the formulation of the question again, it seems the OP is assuming $X:=\{r\exp \mid r\in\mathbb R\}$ is the full set of solutions in $S$, and the problem is whether $X\cap R=\{0\}$, which is not addressed above. Well, here is a straightforward approach: we show there is no nonzero solution in $\mathbb C(x)$, where $x=e^{iz}$; since $R\subset \mathbb C(x)$, we're done.
Indeed, given $\frac{p(x)}{q(x)}\in \mathbb C(x)\setminus\{0\}$, we have (using $Dx=ix$ and the chain rule) $$ \frac{p(x)}{q(x)}=D\left[\frac{p(x)}{q(x)}\right] = \frac{p'q-pq'}{q^2}\,Dx = \frac{p'q-pq'}{q^2}\, ix,$$ hence $$pq = (p'q-pq')ix.$$
If the leading terms of $p$ and $q$ are $a_mx^m\not=0$ and $b_nx^n\not=0$ respectively, then comparing the leading terms of the two sides above gives $$a_mb_nx^{m+n}=(m-n)a_mb_nix^{m+n},$$ so $$(m-n)i=1,$$ which is impossible since $m-n\in\mathbb Z$.
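The leading-term comparison can be illustrated on monomials $p = x^m$, $q = x^n$, where the mismatch between the coefficients $1$ and $(m-n)i$ is explicit; a small sympy check:

```python
import sympy as sp

x = sp.symbols('x')

for m, n in [(3, 1), (2, 2), (5, 0)]:
    p, q = x**m, x**n
    lhs = sp.expand(p * q)
    rhs = sp.expand((sp.diff(p, x) * q - p * sp.diff(q, x)) * sp.I * x)
    assert lhs == x**(m + n)                      # leading coefficient 1
    assert rhs == (m - n) * sp.I * x**(m + n)     # leading coefficient (m-n)*i
    # equality would require (m-n)*i == 1, impossible for integers m, n
    assert lhs != rhs
```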
In this approach, we have used the quotient rule and the chain rule $D\,r(x)=r'(x)\,Dx$, which make sense in a field when $r$ is a ratio of two polynomials and can be established from the Leibniz rule.
Further, we can actually show that $x=e^{iz}$ and $y=e^z$ are algebraically independent over $\mathbb C$. Indeed, if $$\sum_{m,n}a_{mn}x^ny^m=0$$ is a minimal algebraic relation between $x$ and $y$, then taking the derivative on both sides gives $$\sum_{m,n}a_{mn}(in+m)x^ny^m=0.$$
Now we can cancel the leading term in $\sum_{m,n}a_{mn}x^ny^m$, contradicting the minimality, unless all the values $in+m$ are equal; but $in+m$ determines the pair $(m,n)$, so the relation would consist of a single term, which is impossible as $x^ny^m\not=0$.
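The key computation here is that each monomial $x^ny^m = e^{inz}e^{mz}$ is an eigenvector of $D$ with eigenvalue $in+m$; this is easy to verify with sympy:

```python
import sympy as sp

z = sp.symbols('z')

# D(x^n * y^m) = (i*n + m) * x^n * y^m, with x = e^{iz}, y = e^z
for n, m in [(1, 0), (0, 1), (2, 3), (4, 1)]:
    term = sp.exp(sp.I * z)**n * sp.exp(z)**m
    derived = sp.diff(term, z)
    expected = (sp.I * n + m) * term
    assert sp.simplify(derived - expected) == 0
```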