Say we have a random variable $z(t) \sim \Pr(z(t)\mid\lambda(t))$ where $\lambda(t)$ are the parameters of the distribution.
Is there a way we can analytically compute $\Pr(\dot{z}(t))$ where $\dot{z}(t) = \frac{dz}{dt}$ using the parameters $\lambda$? (Any reading on the subject would also be appreciated -- I feel I'm missing the necessary vocabulary to search around properly!)
Here is a counter-example showing that the marginal CDFs of $Z(t)$ at each time $t$, namely $$F_t(z) = P[Z(t)\leq z] \quad \forall z \in \mathbb{R} , \forall t \in \mathbb{R},$$ do not determine the time dependence of the process and hence give no information about $Z'(t)$.
Counter-example: Fix $m \in \mathbb{R}$ and define $U \sim Uniform[0,1]$ and $$ Z(t) = (U + mt) \mod 1 \quad \forall t \in \mathbb{R}$$ Then $Z(t)$ is uniformly distributed over $[0,1]$ for each $t \in \mathbb{R}$, but the slope satisfies $Z'(t)=m$ for almost all times $t$ (except for a set of times that is at most countably infinite). You can make $m$ anything you like. Thus, knowing the marginal CDFs at each time $t$ does not tell you information about the derivative.
For $m \neq 0$, this counter-example also shows an example where for each $t \in \mathbb{R}$: $$ \underbrace{\frac{d}{dt} E[Z(t)]}_{0} \neq \underbrace{E\left[\frac{d}{dt}Z(t)\right]}_{m}$$ even though passing derivatives through expectations can be justified under "suitable assumptions" (which are not met in this counter-example). In the right-hand-side expectation above, we define $\frac{d}{dt} Z(t)$ to be 0 in the (probability 0) event when it does not exist.
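This counter-example is easy to check numerically. The sketch below is my own illustration (the values of `m`, `n`, the sample times, and the step sizes are arbitrary choices): it draws paths of $Z(t) = (U+mt) \bmod 1$, confirms $E[Z(t)] \approx 1/2$ at every time, and contrasts a macroscopic difference quotient of the mean (which captures the wrap-around jumps and is $\approx 0$) with tiny per-path difference quotients (which almost never straddle a jump and so report slope $m$):

```python
import numpy as np

rng = np.random.default_rng(0)
m = 3.0                  # slope of the counter-example; any value works
n = 200_000              # number of sampled paths
U = rng.uniform(0.0, 1.0, size=n)

def Z(t):
    """Z(t) = (U + m*t) mod 1, vectorized over the n sampled paths."""
    return (U + m * t) % 1.0

# Marginal at each fixed t is Uniform[0,1], so E[Z(t)] ~ 1/2 for every t.
mean_a, mean_b = Z(0.2).mean(), Z(1.7).mean()

# d/dt E[Z(t)] ~ 0: a macroscopic difference quotient of the mean
# captures the wrap-around jumps, which cancel the linear growth.
h_big = 0.1
ddt_mean = (Z(0.2 + h_big).mean() - Z(0.2).mean()) / h_big

# E[Z'(t)] ~ m: a tiny per-path difference quotient almost never
# straddles a jump, so essentially every path reports slope m.
h_small = 1e-9
path_slopes = (Z(0.2 + h_small) - Z(0.2)) / h_small
frac_slope_m = np.mean(np.abs(path_slopes - m) < 1e-3)
```

So the marginals are static while essentially every path moves at speed $m$: exactly the failure of the interchange described above.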
A sufficient condition for passing derivatives through expectations:
Claim: Suppose $Z(t)$ is a random process with the following Lipschitz property: There is a random variable $M\geq 0$ such that $E[M]<\infty$ and $$ |Z(t+h)-Z(t)|\leq M|h| \quad \forall t, h \in \mathbb{R}$$ Further assume $E[|Z(0)|]<\infty$. For each $t \in \mathbb{R}$ define the event $$A_t = \{Z'(t) \mbox{ exists and is finite}\}$$ Suppose that $P[A_t]=1$ for all $t \in \mathbb{R}$. Then for all $t \in \mathbb{R}$ we have $E[|Z(t)|]<\infty$, $E[|Z'(t)||A_t]\leq E[M]$, and
\begin{align} \frac{d}{dt}E[Z(t)] = E[Z'(t)|A_t] \end{align}
Proof: Fix $t \in \mathbb{R}$. We have \begin{align} E[|Z(t)|] &\leq E[|Z(t)-Z(0)| + |Z(0)|]\\ &\leq E[M|t| + |Z(0)|]\\ &= |t|E[M] + E[|Z(0)|]\\ &<\infty \end{align} Further, the Lipschitz property implies $$ |Z'(t)| \leq M \quad \mbox{whenever $A_t$ holds} $$ Taking conditional expectations and using $P[A_t]=1$ gives $$ E[|Z'(t)| \mid A_t] \leq E[M \mid A_t] = E[M]$$ Finally: \begin{align} \lim_{h\rightarrow 0}\frac{E[Z(t+h)]-E[Z(t)]}{h} &= \lim_{h\rightarrow 0}E\left[\frac{Z(t+h)-Z(t)}{h}\right]\\ &\overset{(a)}{=} \lim_{h\rightarrow 0}E\left[\frac{Z(t+h)-Z(t)}{h} | A_t\right]\\ &\overset{(b)}{=} E\left[\lim_{h\rightarrow 0} \frac{Z(t+h)-Z(t)}{h} | A_t\right]\\ &= E[Z'(t) | A_t] \end{align} where (a) holds because $P[A_t]=1$; (b) holds by the Lebesgue dominated convergence theorem, since $M$ acts as a dominating random variable: $$ \left|\frac{Z(t+h)-Z(t)}{h}\right| \leq M \quad \forall h\neq 0$$ $\Box$
Notes:
The counter-example $Z(t) = (U+mt) \mod 1$ does not satisfy the Lipschitz property required for this claim: the wrap-around jumps from $1$ to $0$ mean $|Z(t+h)-Z(t)|$ can be arbitrarily close to $1$ for arbitrarily small $|h|$, so no integrable $M$ works.
An interesting example where the conditions of the claim do hold is the following "ping-pong" example: $Z(0) = U \sim Uniform[0,1]$ and $Z(t)$ grows linearly with either slope $m$ or $-m$, bouncing around the unit interval and changing slope only when it hits the boundaries $0$ and $1$. Here $Z(t)$ is not differentiable at the times when it hits the boundary, but for each fixed $t \in \mathbb{R}$ the event that a bounce occurs exactly at time $t$ has probability $0$, so $Z'(t)$ exists with probability 1.
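A quick Monte Carlo sketch of this ping-pong process (my own illustration; the triangle-wave construction and the parameters are arbitrary choices): each path is a reflected triangle-wave version of linear motion starting at $U$ with a random initial slope $\pm m$, so it is Lipschitz with the deterministic constant $M=m$ and the claim applies. A macroscopic difference quotient of $E[Z(t)]$ then agrees with the mean of the pathwise slopes:

```python
import numpy as np

rng = np.random.default_rng(1)
m = 2.0                              # bounce speed; any value works
n = 200_000
U = rng.uniform(0.0, 1.0, size=n)    # Z(0) = U ~ Uniform[0,1]
S = rng.choice([-1.0, 1.0], size=n)  # initial slope sign: slope is S*m

def tri(x):
    """2-periodic triangle wave: tri(x) = 1 - |(x mod 2) - 1|.
    It equals x on [0,1], so it reflects linear motion off 0 and 1."""
    return 1.0 - np.abs((x % 2.0) - 1.0)

def Z(t):
    """Ping-pong process: starts at U, moves at speed m, bouncing at 0 and 1."""
    return tri(U + S * m * t)

t = 0.3
# Left side of the claim: macroscopic difference quotient of E[Z(t)].
h_big = 0.05
lhs = (Z(t + h_big).mean() - Z(t).mean()) / h_big
# Right side: mean of the pathwise slopes (each is +m or -m, except for the
# negligible fraction of paths that bounce inside the tiny window).
h_small = 1e-8
path_slopes = (Z(t + h_small) - Z(t)) / h_small
rhs = path_slopes.mean()
# By the +/-m symmetry both sides are ~ 0 here, and they agree as claimed.
```

Unlike the mod-1 counter-example, a bounce only changes the slope's sign and never creates a jump, so the difference quotients stay bounded by $m$ and dominated convergence goes through.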