We have a function $f(t)$ that is defined for all $t \in [0,T]$. From our data, we can estimate two important parameters, $\theta \in \mathbb{Z}\setminus\{0\}$ and $n \in \mathbb{N}^{+}$. Given $n$, we can also estimate a function $g(n)$. Our estimates are fairly accurate.
We also found from our data sets that at $t = T$ we have $g(n) = (\partial_t + \theta)^n f(t)$, where $\partial_t$ is the time-derivative operator. That is, given $n$, $g(n)$ is a combination of the derivatives of $f$ at $t = T$.
Now, knowing $g(n)$ for every $n$, how can we determine $f(t)$ explicitly, if that is even possible?
Unfortunately, we don't have enough initial conditions to solve the above differential equation directly. All we know is that $f(T) = 0$.
Can anyone give some hints please? Can we write $f(t)$ in terms of its Taylor series?
You are on the right track with the Taylor series. Let $g(0):=f(T)$; then the equations $$ g(n)=(\partial_t+\theta)^nf(t)\big|_{t=T}=\sum_{i=0}^n{n\choose i}f^{(i)}(T)\,\theta^{n-i} $$ for $n\geq0$ can be solved iteratively (this is binomial inversion), which gives $$ f^{(i)}(T)=\sum_{j=0}^i{i\choose j}g(j)(-\theta)^{i-j}. $$
This can now be plugged into the Taylor series about $t=T$, which yields $$ f(t)=\sum_{i=0}^\infty f^{(i)}(T)\frac{(t-T)^i}{i!}=\sum_{i=0}^\infty \sum_{j=0}^i{i\choose j}g(j)(-\theta)^{i-j}\frac{(t-T)^i}{i!}. $$
edit:
This solution is more general and works for an arbitrary $f(T)$. In your special case $f(T)=0$, it reduces to $$ f(t)=\sum_{i=1}^\infty \sum_{j=1}^i{i\choose j}g(j)(-\theta)^{i-j}\frac{(t-T)^i}{i!}, $$ which looks similar, but the indices start at $1$ because $g(0)=f(T)=0$ kills the $i=0$ term and all $j=0$ terms.