Ok so I have a function like this: $$f(t + 1, z_t + x_t)$$ where $x_t$ equals $z_{t+1} - z_t$.
I want a first-order Taylor expansion of this function around $x_t = 0$. How do I do this?
$z_t$ and $x_t$ are random variables, known at time $t$ and $t+1$ respectively, but just treat them as regular variables.
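My guess, just applying the usual first-order Taylor formula in the second argument (writing $\partial_z f$ for the partial derivative of $f$ with respect to its second argument — my notation), is something like:

$$f(t + 1, z_t + x_t) \approx f(t + 1, z_t) + \partial_z f(t + 1, z_t)\, x_t$$

Since $x_t = z_{t+1} - z_t$, this would read as

$$f(t + 1, z_{t+1}) \approx f(t + 1, z_t) + \partial_z f(t + 1, z_t)\,(z_{t+1} - z_t).$$

Is that the right way to do it, or am I missing something?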