"Conditional distribution" of Brownian sample paths

I would like to consider the "conditional distribution" of Brownian sample paths conditional on certain sample-path functionals, in a similar way to how one obtains the Brownian bridge. For example, consider the functional $\phi: C[0,1] \rightarrow \mathbb{R}$ defined by, say,

$$ \phi( W(\cdot) ) = \left( \int_0^1 W \, dW \right)^2 \quad \text{or} \quad \phi( W(\cdot) ) = \int_0^1 W^2_t \, dt. $$

What is the process obtained by conditioning $W$ on $\phi$? Are there any results of this type?
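For concreteness, both functionals are easy to evaluate on discretized sample paths. Here is a quick Monte Carlo sketch (step and path counts are arbitrary choices of mine) which also checks the standard identities $\Bbb E[\int_0^1 W\,dW] = 0$ and $\Bbb E[\int_0^1 W_t^2\,dt] = 1/2$:

```python
import numpy as np

# Monte Carlo sketch: evaluate the two functionals on discretized Brownian
# paths. Step and path counts are arbitrary choices for illustration.
rng = np.random.default_rng(0)
n_steps, n_paths = 500, 4000
dt = 1.0 / n_steps

dW = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps))
W = np.cumsum(dW, axis=1)                                # W at times dt, 2dt, ..., 1
W_left = np.hstack([np.zeros((n_paths, 1)), W[:, :-1]])  # left endpoints for the Ito sum

ito = np.sum(W_left * dW, axis=1)   # Ito integral  ∫_0^1 W dW, path by path
phi1 = ito ** 2                     # first functional:  ( ∫ W dW )^2
phi2 = np.sum(W ** 2, axis=1) * dt  # second functional: ∫_0^1 W_t^2 dt

# Sanity checks: E[∫ W dW] = 0 and E[∫ W_t^2 dt] = 1/2.
print(ito.mean(), phi2.mean())
```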


I don't know of any literature on conditioning Brownian motion in this way; however, I believe that when the functional $\phi$ is nice enough (in the sense of Malliavin calculus), it is possible to write the conditioned process as the solution of an SDE $dX_t = dB_t+G(t,X_{\cdot})\,dt$. Here $G$ is some predictable function from $\Bbb R_+ \times C_0[0,1] \to \Bbb R$ which depends on $\phi$ and also on the value we condition on.

I do not know the precise/rigorous conditions on $\phi$ which make this possible, and I do not intend to pursue that here. Instead, I will give an intuitive, "physics-level" derivation of how to compute the drift $G$ from the functional $\phi$.

So, suppose we want to condition on $\phi(W)=C$. Assume that the law of $\phi(W)$ has a continuous density $p_{\phi}(x)$ with respect to Lebesgue measure which is strictly positive near $C$ (there are conditions ensuring this via Malliavin calculus: e.g. $C$ should be a regular value of $\phi$, and the Malliavin matrix of $\phi$ should be invertible and satisfy some regularity conditions in a neighborhood of $\phi^{-1}(\{C\})$). Then define $$F(\phi,C) := \frac1{p_{\phi}(C)}\partial_t|_{t=0} \big(\partial_x|_{x=C}\Bbb E[W_t\cdot 1_{\{\phi(W) < x\}}]\big),\tag{1}$$ which (as may be verified using regular conditioning) is just the drift felt by the conditioned process at $t=0$.

Furthermore, if $t \in [0,1)$ and $f,g \in C[0,1]$, define $\psi(t,f,g) := \phi\big(f(t^{-1}\cdot)*g((1-t)^{-1}\cdot)\big)$, where $f(t^{-1}\cdot)\in C[0,t]$ is the map $u \mapsto f(t^{-1}u)$ (and similarly for $g$), and "$*$" denotes concatenation of paths: if $\gamma \in C_0[0,t]$ and $\mu \in C_0[0,1-t]$, then $(\gamma * \mu)(u) = \gamma(u)$ for $u \le t$ and $(\gamma * \mu)(u) = \gamma(t) + \mu(u-t)$ for $u \in [t,1]$.

Finally, we are ready to define the drift function $$G(t,\omega) := \frac{F\big(\;\psi(t,\omega(t\cdot),(1-t)^{1/2}\cdot)\;, C\big)}{1-t}.\tag{2}$$ The way to interpret this last expression is to note that for fixed $\omega \in C_0[0,1]$ and $t \in [0,1)$, the map $g \mapsto \psi\big(t\;,\,u\mapsto \omega(tu)\;,\, v\mapsto (1-t)^{1/2}g(v)\big)$ is a functional on $C[0,1]$, hence we can plug it into $F(\cdot,C)$.
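To make the time-rescaling and concatenation in the definition of $\psi$ concrete, here is a small numerical sketch (the helper names, grid size, and sample paths are my own illustrative choices, not part of the construction). With $\phi(\omega)=\omega(1)$, the identity $\psi(t,f,g)=f(1)+g(1)$ provides a quick sanity check:

```python
import numpy as np

def concat_scaled(f, g, t, n=1000):
    """Build the path f(t^{-1}·) * g((1-t)^{-1}·) on a grid of [0, 1].

    f, g are callables on [0, 1] with f(0) = g(0) = 0, and t is in (0, 1).
    This is a sketch of the concatenation "*" described in the text.
    """
    u = np.linspace(0.0, 1.0, n + 1)
    left = u <= t
    path = np.empty_like(u)
    path[left] = f(u[left] / t)                            # first leg: time-rescaled f
    path[~left] = f(1.0) + g((u[~left] - t) / (1.0 - t))   # second leg, shifted to match
    return u, path

def psi(phi, t, f, g):
    """psi(t, f, g) = phi( f(t^{-1}·) * g((1-t)^{-1}·) )."""
    u, path = concat_scaled(f, g, t)
    return phi(u, path)

# Sanity check with phi(omega) = omega(1): psi should equal f(1) + g(1).
phi_end = lambda u, path: path[-1]
f = lambda s: s ** 2        # an arbitrary path with f(0) = 0
g = lambda s: np.sin(s)     # an arbitrary path with g(0) = 0
print(psi(phi_end, 0.3, f, g))   # f(1) + g(1) = 1 + sin(1)
```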

Now, with $G$ defined this way, we claim that the solution of $dX_t = dB_t + G(t,X_{\cdot})\,dt$ is essentially Brownian motion conditioned on $\phi=C$. The intuitive reason (which is easy to visualize) is that at each infinitesimal step, $X_t$ feels a drift in the direction which allows the remainder of the path to be distributed as a Brownian motion, subject to the constraint that concatenating this Brownian path with the history of $X$ so far yields a path on which $\phi$ takes the value $C$. One may then check (perhaps tediously) that formulating this whole procedure mathematically gives the above SDE. Unfortunately, it remains to be shown that the solution $X$ actually exists and that its law agrees with that obtained by other conditioning procedures, such as regular conditioning / disintegration. However, I did some computations which suggest that this is actually true, and in fact the resulting law even appears to be weakly continuous in $C$ when $\phi$ is nice.

It is interesting to note that applying these heuristics to $\phi(\omega) = \omega(1)$ recovers the Brownian bridge SDE $dX_t = dB_t + \frac{C-X_t}{1-t}\,dt$. For another example, let's say that $\phi(\omega) = p(\omega(1))$ where $p$ is a quadratic polynomial with two real roots $r,s$, and set $f(a,b) = \frac{ae^{-a^2/2}+be^{-b^2/2}}{e^{-a^2/2}+e^{-b^2/2}}$. One way to understand the conditioned process (with $\phi(W)=0$) is in terms of a Brownian bridge (see my comment), but another is as the solution of the SDE $dX_t = dB_t +\frac{f(r-X_t,s-X_t)}{1-t}\,dt$. With some modification, this SDE formulation seems to make sense even for problems such as conditioning $W$ to stay positive (e.g., take $\phi = \inf$ and let $C=0$).
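As a quick numerical sanity check of the bridge case, one can run an Euler-Maruyama discretization of $dX_t = dB_t + \frac{C-X_t}{1-t}\,dt$ (the step count and the target $C$ below are arbitrary choices) and verify that $X_1$ lands near $C$:

```python
import numpy as np

# Euler-Maruyama sketch of the bridge SDE  dX_t = dB_t + (C - X_t)/(1 - t) dt.
# If the heuristic is right, X_1 should land at C up to discretization error.
rng = np.random.default_rng(1)
C, n = 1.5, 2000
dt = 1.0 / n

x = 0.0
for i in range(n):
    t = i * dt                        # current time; 1 - t >= dt, so no blow-up
    drift = (C - x) / (1.0 - t)
    x += drift * dt + np.sqrt(dt) * rng.normal()

print(x)   # should be close to C = 1.5
```

Note that the very last step has drift $(C - X_{1-dt})/dt$, which pins $X_1$ to within $O(\sqrt{dt})$ of $C$.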

Another remark: from this description it may seem that the conditioned process is necessarily absolutely continuous with respect to Brownian motion, but this is not actually true, because the drift term $G$ can have small blow-ups along the way which make $\int_0^t G(s,X_{\cdot})^2\,ds = +\infty$ (think of $\phi(\omega) = \omega(1/2)$, which gives a Brownian bridge up to time $1/2$ and then just evolves as Brownian motion).

A final remark is that in practice it may be difficult to compute $F$ and $G$ (the quantities appearing in $(1)$ and $(2)$). To compute $F$, in principle one only needs $p_{\phi}$ and $\Bbb E[W_t\mid\phi(W)]$; in certain cases (like the Brownian bridge) these are quite explicit, but in other cases they may be very hard to obtain. Computing $G$ can be even harder, though in certain cases it can be computed easily from $F$ even when $F$ itself is not explicit (this is the case for your second example $\phi(f) = \int f^2$, because that functional satisfies a nice additivity property with respect to concatenation of paths).
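To spell out the additivity property for $\phi(f)=\int_0^1 f^2$: with the concatenation defined above, $\phi(\gamma*\mu)=\int_0^t\gamma(u)^2\,du+\int_0^{1-t}\big(\gamma(t)+\mu(v)\big)^2\,dv$, so $\phi$ of the whole path splits into a contribution from each leg (the second leg only seeing the history through the endpoint $\gamma(t)$). A small numerical check, with arbitrary test paths and grids of my choosing:

```python
import numpy as np

def integrate(y, x):
    """Trapezoid rule on a (possibly non-uniform) grid."""
    return float(np.sum((y[1:] + y[:-1]) * np.diff(x)) / 2.0)

t = 0.4
u1 = np.linspace(0.0, t, 2001)        # grid for gamma on [0, t]
u2 = np.linspace(0.0, 1.0 - t, 2001)  # grid for mu on [0, 1 - t]
gamma = np.sin(5 * u1)                # arbitrary path with gamma(0) = 0
mu = u2 ** 2                          # arbitrary path with mu(0) = 0

# phi(gamma * mu): integrate the squared concatenated path over [0, 1].
u = np.concatenate([u1, t + u2[1:]])
path = np.concatenate([gamma, gamma[-1] + mu[1:]])
lhs = integrate(path ** 2, u)

# The same quantity computed leg by leg via the additivity property.
rhs = integrate(gamma ** 2, u1) + integrate((gamma[-1] + mu) ** 2, u2)
print(lhs, rhs)   # the two agree up to floating-point error
```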