How does a probability density function evolve with time?


Let $f(x,t)$ be the probability density function of a stochastic process $X_t$, where $X_t$ evolves deterministically over time according to the following ODE

$$\dot x = g(x,t)$$

Then, how does $f(x,t)$ change with time? How do we find $\frac{\partial f}{\partial t}$?

For example, say $X_t$ is the height of a tree at time $t$. Tree heights are distributed according to $f(x,t)$ at a given time. Each tree grows according to $\frac{dX_t}{dt}=g(X_t,t)$. Then what is $\frac{\partial f}{\partial t}$?
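To make the tree example concrete, here is a small Monte Carlo sketch. The growth law $g(x) = a x$ and the initial distribution $N(10, 1)$ are assumptions chosen for illustration, not part of the question; for this $g$, the flow is $X_t = X_0 e^{at}$ and the density is pushed forward to $f(x,t) = f_0(x e^{-at})\,e^{-at}$, which the histogram of evolved samples should match.

```python
import numpy as np

# Illustrative assumptions (not from the question): growth law g(x) = a*x
# and initial heights X_0 ~ N(10, 1).
rng = np.random.default_rng(0)
a, t = 0.5, 1.0
x0 = rng.normal(10.0, 1.0, 200_000)
xt = x0 * np.exp(a * t)                 # exact flow of dX/dt = a*X

def f(x, s):
    # pushforward density: f(x, s) = f0(x e^{-a s}) e^{-a s}
    y = x * np.exp(-a * s)
    return np.exp(-(y - 10.0)**2 / 2) / np.sqrt(2 * np.pi) * np.exp(-a * s)

# histogram of evolved samples vs. the pushforward density
hist, edges = np.histogram(xt, bins=60, density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
err = np.max(np.abs(hist - f(centers, t)))
```

With 200,000 samples the empirical density agrees with $f(\cdot, 1)$ to a few parts in a thousand per bin.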


There are 2 answers below.

Answer 1:

By definition, the probability mass in the interval $[x, x + \mathrm d x]$ at time $t$ is equal to $f (x,t) \,\mathrm d x$. Under the influence of the differential equation $\dot x = g (x,t)$, the ends of the interval flow to

$$\begin{aligned} x &\mapsto x + g (x,t) \,\mathrm d t\\ x + \mathrm d x &\mapsto x + \mathrm d x + g (x + \mathrm d x,t) \,\mathrm d t = x + \mathrm d x + g (x,t) \,\mathrm d t + \partial_x g (x,t) \,\mathrm d x \,\mathrm d t\end{aligned}$$

Hence, the interval $[x, x + \mathrm d x]$ is mapped to an interval of width

$$\mathrm d x + \partial_x g (x,t) \,\mathrm d x \,\mathrm d t = \left( 1 + \partial_x g (x,t) \,\mathrm d t \right) \mathrm d x$$

Since probability mass is conserved,

$$f (x,t) \,\mathrm d x = f (x + g (x,t) \,\mathrm d t, t + \mathrm d t) \, \left( 1 + \partial_x g (x,t) \,\mathrm d t \right) \mathrm d x$$
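This mass-balance identity can be checked numerically. A minimal sketch, assuming the illustrative case $g(x) = a x$ with initial density $N(10,1)$, for which $f(x,t) = f_0(x e^{-at})\,e^{-at}$ is available in closed form: omitting the stretching factor $\left(1 + \partial_x g\,\mathrm d t\right)$ leaves an $O(\mathrm d t)$ error, while the full identity holds to $O(\mathrm d t^2)$.

```python
import numpy as np

a, t = 0.5, 1.0

def f(x, s):
    # closed-form density for the assumed case g(x) = a*x, f(., 0) = N(10, 1):
    # f(x, s) = f0(x e^{-a s}) e^{-a s}
    y = x * np.exp(-a * s)
    return np.exp(-(y - 10.0)**2 / 2) / np.sqrt(2 * np.pi) * np.exp(-a * s)

x, dt = 16.0, 1e-3
g, dg = a * x, a                 # g(x) = a*x and its derivative, d/dx g = a

# without the stretching factor: residual is O(dt)
naive = f(x, t) - f(x + g * dt, t + dt)
# full mass-balance identity: residual is O(dt^2)
full = f(x, t) - f(x + g * dt, t + dt) * (1 + dg * dt)
```

The naive residual is roughly a thousand times larger than the full one at $\mathrm d t = 10^{-3}$, which is exactly the Jacobian factor doing its job.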

Dividing both sides by $\mathrm d x$, we obtain

$$\begin{aligned} f (x,t) &= f (x + g (x,t) \,\mathrm d t, t + \mathrm d t) \, \left( 1 + \partial_x g (x,t) \,\mathrm d t \right)\\ &= \left( f (x,t) + \partial_x f (x,t) \, g (x,t) \,\mathrm d t + \partial_t f (x,t) \,\mathrm d t \right) \, \left( 1 + \partial_x g (x,t) \,\mathrm d t \right)\\ &= f (x,t) + \partial_x f (x,t) \, g (x,t) \,\mathrm d t + \partial_t f (x,t) \,\mathrm d t + f (x,t) \, \partial_x g (x,t) \,\mathrm d t \end{aligned}$$

where the two terms of order $\left( \mathrm d t \right)^2$ were discarded. Thus,

$$g (x,t) \, \partial_x f (x,t) + \partial_t f (x,t) + f (x,t) \, \partial_x g (x,t) = 0$$

or, recombining the first and third terms with the product rule, we obtain the following continuity equation (a PDE):

$$\color{blue}{\partial_t \, f + \partial_x \left( f \cdot g\right) = 0}$$
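This continuity equation can also be integrated directly. A minimal sketch using a first-order upwind finite-volume scheme, again assuming the illustrative case $g(x) = a x$ with initial density $N(10,1)$; the numerical solution is compared against the exact pushforward $f(x,t) = f_0(x e^{-at})\,e^{-at}$, and the conservative update preserves total mass by construction.

```python
import numpy as np

# Illustrative assumptions: g(x) = a*x, initial density N(10, 1).
a, T = 0.5, 1.0
x = np.linspace(0.0, 30.0, 1501)
dx = x[1] - x[0]                         # 0.02
dt = 1e-3                                # CFL: max|g|*dt/dx = 15*0.05 = 0.75 < 1
f = np.exp(-(x - 10.0)**2 / 2) / np.sqrt(2 * np.pi)   # f(x, 0)

for _ in range(int(T / dt)):
    F = a * x * f                        # flux f*g (g >= 0 on this grid)
    f[1:] = f[1:] - dt / dx * (F[1:] - F[:-1])   # conservative upwind update
    # f[0] needs no update: the inflow flux at x = 0 is g(0)*f = 0

# exact pushforward for comparison: f(x, T) = f0(x e^{-a T}) e^{-a T}
y = x * np.exp(-a * T)
f_exact = np.exp(-(y - 10.0)**2 / 2) / np.sqrt(2 * np.pi) * np.exp(-a * T)

mass = f.sum() * dx                      # total probability mass
err = np.max(np.abs(f - f_exact))
```

The first-order scheme adds a little numerical diffusion, so the computed peak is slightly lower than the exact one, but mass stays at 1 and the pointwise error is small on this grid.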

This is an extremely dirty derivation. It would be nice to have a rigorous one.

Answer 2:

> This is an extremely dirty derivation. It would be nice to have a rigorous one.

Here is one. For every $t$ and every sufficiently regular function $u$ vanishing at infinity, by definition of the PDF of $X_t$,

$$E(u(X_t)) = \int_\mathbb R u(x)f(x,t)dx$$

Differentiating under the expectation sign and using the chain rule together with the dynamics, $\frac{d}{dt}u(X_t) = u'(X_t)\,\dot X_t = u'(X_t)\,g(X_t,t)$, this yields

$$\int_\mathbb R u(x)\partial_tf(x,t)dx=\frac{d}{dt}E(u(X_t)) = E(u'(X_t)g(X_t,t)) = \int_\mathbb R u'(x)g(x,t)f(x,t)dx$$

Now, integrating by parts and using the fact that $u$ is zero at infinity, the rightmost term is

$$\int_\mathbb R u'(x)g(x,t)f(x,t)dx = -\int_\mathbb R u(x)\partial_x(g(x,t)f(x,t))dx$$

To sum up, the identity

$$\int_\mathbb R u(x) \partial_t f(x,t) dx = -\int_\mathbb R u(x)\partial_x(g(x,t)f(x,t))dx$$

should hold for every admissible test function $u$; hence, by identification, $$\partial_t f(x,t) = -\partial_x(g(x,t)f(x,t))$$
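The key step $\frac{d}{dt}E(u(X_t)) = E(u'(X_t)\,g(X_t,t))$ above can be verified by simulation. A sketch assuming $g(x) = a x$ (so the flow $X_t = X_0 e^{at}$ is available in closed form; this choice is only for illustration) and a Gaussian bump as test function: a centered difference in $t$, evaluated on the same samples, matches the expectation on the right.

```python
import numpy as np

# Illustrative assumptions: g(x) = a*x (flow known in closed form) and a
# smooth Gaussian-bump test function u.
a, t, h = 0.5, 1.0, 1e-4
rng = np.random.default_rng(1)
x0 = rng.normal(10.0, 1.0, 100_000)      # X_0 ~ N(10, 1)

def flow(s):
    return x0 * np.exp(a * s)            # exact solution of dx/ds = a*x

def u(x):
    return np.exp(-(x - 16.0)**2 / 8)    # test function, vanishing at infinity

def du(x):
    return -(x - 16.0) / 4 * u(x)        # u'(x)

# d/dt E[u(X_t)] via a centered difference on the SAME samples ...
lhs = (u(flow(t + h)).mean() - u(flow(t - h)).mean()) / (2 * h)
# ... matches E[u'(X_t) g(X_t)]
rhs = (du(flow(t)) * a * flow(t)).mean()
```

Using common random numbers makes the Monte Carlo noise cancel between the two evaluations, so the two sides agree to within the $O(h^2)$ discretization error.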