Weak uniqueness of solutions of SDEs


The following is an exercise in which we prove Lemma 5.3.1 of Øksendal, *Stochastic Differential Equations*.

Suppose that $b(t,x)$ and $\sigma(t,x)$ satisfy the hypotheses of the existence/(strong) uniqueness theorem. Suppose that $X_t,\hat{X}_t$ are adapted processes, both of which are continuous in mean square. Define

$$ \begin{aligned} Y_t&=\int_0^tb(s,X_s)ds+\int_0^t\sigma(s,X_s)dB_s\\ \hat{Y}_t&=\int_0^tb\left(s,\hat{X}_s\right)ds+\int_0^t\sigma\left(s,\hat{X}_s\right)d\hat{B}_s \end{aligned} $$

where $B_t$ and $\hat{B}_t$ are each Brownian motions.

  • a) Prove that the processes $Y_t$ and $\hat{Y}_t$ have the same distribution.

  • b) Prove that $Y_t$ is continuous in mean square.

  • c) Now consider the Ito diffusions

$$ \begin{aligned} dX_t&=b(t,X_t)dt+\sigma(t,X_t)dB_t\\ d\hat{X}_t&=b\left(t,\hat{X}_t\right)dt+\sigma\left(t,\hat{X}_t\right)d\hat{B}_t \end{aligned} $$

with initial conditions $X_0$ and $\hat{X}_0$ where $X_0$ and $\hat{X}_0$ have the same distribution. Let $X_t^k$ and $\hat{X}_t^k$ be the sequence of approximations to $X_t$ and $\hat{X}_t$ that come from Picard iteration. Use a) and b) to prove that the processes $X_t^k$ and $\hat{X}_t^k$ have the same distribution for all $k$ and that they are continuous in mean square for all $k$. This is easy given a) and b).

  • d) Conclude that the processes $X_t$ and $\hat{X}_t$ have the same distribution, i.e., we have weak uniqueness. Can we conclude that the solutions $X_t$ and $\hat{X}_t$ are continuous in mean square?

Here is the sketch of the proof:

*(image of the proof sketch omitted)*

How do I solve parts a) and b)?


We will add a few more details; let me know if you need more.

> Let $X_t$ and $Y_t$ be the strong solutions constructed from $\tilde{B}_t$ and $\hat{B}_t$, respectively, as above.

Here they simply mean that they obtain strong solutions of the following SDEs $$ \begin{aligned} X_t&=\int_0^tb(s,X_s)ds+\int_0^t\sigma(s,X_s)d\tilde{B}_s\\ Y_t&=\int_0^tb\left(s,Y_s\right)ds+\int_0^t\sigma\left(s,Y_s\right)d\hat{B}_s, \end{aligned} $$ which is possible by Theorem 5.2.1, since the coefficients are assumed to be Lipschitz and of linear growth.
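As an aside, the content of this step can be illustrated numerically (this is not part of Øksendal's proof): construct the two strong solutions by Euler–Maruyama from independent driving Brownian motions and compare their marginal laws. The coefficients $b(t,x)=-x$, $\sigma(t,x)=1$ below are a hypothetical Lipschitz example (an Ornstein–Uhlenbeck process), chosen only because they satisfy the hypotheses of Theorem 5.2.1.

```python
import numpy as np

def euler_maruyama(b, sigma, x0, T, n_steps, n_paths, rng):
    """Euler-Maruyama approximation of dX = b(t,X) dt + sigma(t,X) dB; returns samples of X_T."""
    dt = T / n_steps
    x = np.full(n_paths, x0, dtype=float)
    for k in range(n_steps):
        t = k * dt
        dB = rng.normal(0.0, np.sqrt(dt), size=n_paths)  # Brownian increments
        x = x + b(t, x) * dt + sigma(t, x) * dB
    return x

# Hypothetical Lipschitz coefficients (Ornstein-Uhlenbeck), for illustration only.
b = lambda t, x: -x
sigma = lambda t, x: np.ones_like(x)

rng_B = np.random.default_rng(1)      # drives B_t
rng_B_hat = np.random.default_rng(2)  # drives an independent Brownian motion

X_T = euler_maruyama(b, sigma, 1.0, 1.0, 500, 100_000, rng_B)
X_hat_T = euler_maruyama(b, sigma, 1.0, 1.0, 500, 100_000, rng_B_hat)

# Weak uniqueness: the laws at time T agree (for OU the mean is near e^{-1}).
print(X_T.mean(), X_hat_T.mean())
print(X_T.var(), X_hat_T.var())
```

Matching sample moments of course only illustrate, rather than prove, equality of the laws.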

Then the same uniqueness argument as above applies to show that $X_t = \tilde{X}_t$ and $Y_t = \hat{X}_t$ for all $t$, a.s.

By studying the mean-square difference as in the proof of Theorem 5.2.1,

$$E\left[|X_{t}-\tilde{X}_{t}|^{2}\right],$$

they can again show that this is zero, since both processes are driven by the same Brownian motion $\tilde{B}_s$, i.e. they both solve $$ \begin{aligned} X_t&=\int_0^tb(s,X_s)ds+\int_0^t\sigma(s,X_s)d\tilde{B}_s\\ \tilde{X}_t&=\int_0^tb(s,\tilde{X}_s)ds+\int_0^t\sigma(s,\tilde{X}_s)d\tilde{B}_s. \end{aligned} $$ Indeed, by the Lipschitz condition (with constant $C$), the inequality $(x+y)^{2}\leq 2x^{2}+2y^{2}$, Cauchy–Schwarz and the Itô isometry,

$$E\left[|X_{t}-\tilde{X}_{t}|^{2}\right]\leq 2(1+t)C^{2}\int_0^t E\left[|X_{s}-\tilde{X}_{s}|^{2}\right]ds,$$

so Grönwall's inequality forces $E[|X_{t}-\tilde{X}_{t}|^{2}]=0$ for all $t$. The same argument is applied to $Y,\hat{X}$.

Therefore it suffices to show that $X_t$ and $Y_t$ must be identical in law. We show this by proving by induction that if $X^{(k)}_t , Y^{(k)}_t$ are the processes in the Picard iteration...

The equality in distribution indeed follows by induction. We start from the base case $X^{(0)}_t=X_{0} \stackrel{d}{=} Y_{0} =Y^{(0)}_t$. Then we define

$$ \begin{aligned} X_t^{(k+1)}&:=X_0+\int_0^tb(s,X_s^{(k)})ds+\int_0^t\sigma(s,X_s^{(k)})d\tilde{B}_s\\ Y_t^{(k+1)}&:=Y_0+\int_0^tb\left(s,Y_s^{(k)}\right)ds+\int_0^t\sigma\left(s,Y_s^{(k)}\right)d\hat{B}_s. \end{aligned} $$ Since $(X_s^{(k)})\stackrel{d}{=}(Y_s^{(k)})$ and, by part b), both iterates are continuous in mean square, part a) yields $(X_s^{(k+1)})\stackrel{d}{=}(Y_s^{(k+1)})$.
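For intuition only (again not part of the proof), this induction can be mimicked numerically: running the discretized Picard iteration with two independent driving Brownian motions produces iterates with matching empirical laws at every stage $k$. The coefficients below ($b=-x$, $\sigma=1$, the same hypothetical Ornstein–Uhlenbeck example) are chosen only because they satisfy the Lipschitz/linear-growth hypotheses.

```python
import numpy as np

def picard_iterates(b, sigma, x0, T, n_iters, dB):
    """Discretized Picard iteration
    X^{(k+1)}_t = X_0 + int_0^t b(s, X^{(k)}_s) ds + int_0^t sigma(s, X^{(k)}_s) dB_s,
    starting from X^{(0)}_t = X_0; dB has shape (n_paths, n_steps)."""
    n_paths, n_steps = dB.shape
    dt = T / n_steps
    t = np.linspace(0.0, T, n_steps + 1)
    X = np.full((n_paths, n_steps + 1), x0, dtype=float)  # the iterate X^{(0)}
    for _ in range(n_iters):
        drift = b(t[:-1], X[:, :-1]) * dt       # b(s, X^{(k)}_s) ds on each cell
        noise = sigma(t[:-1], X[:, :-1]) * dB   # sigma(s, X^{(k)}_s) dB_s
        X = x0 + np.concatenate(
            [np.zeros((n_paths, 1)), np.cumsum(drift + noise, axis=1)], axis=1)
    return X

b = lambda t, x: -x                   # hypothetical OU drift
sigma = lambda t, x: np.ones_like(x)  # constant diffusion

rng = np.random.default_rng(0)
n_paths, n_steps, T = 20_000, 200, 1.0
dt = T / n_steps
dB1 = rng.normal(0.0, np.sqrt(dt), (n_paths, n_steps))  # increments of one Brownian motion
dB2 = rng.normal(0.0, np.sqrt(dt), (n_paths, n_steps))  # increments of an independent one

Xk = picard_iterates(b, sigma, 1.0, T, 8, dB1)
Yk = picard_iterates(b, sigma, 1.0, T, 8, dB2)

# Same iteration stage k = 8, independent drivers: the laws of X^{(8)}_T and Y^{(8)}_T agree.
print(Xk[:, -1].mean(), Yk[:, -1].mean())
```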

Finally, we discuss proving continuity in mean square, using ideas from the proof of Theorem 5.2.1. For $s<t$,

$$E[(Y_{t}-Y_{s})^{2}]=E\left[\left(\int_{s}^{t}b\,dr+\int_{s}^{t}\sigma\, dB_{r}\right)^{2}\right].$$

By the elementary inequality $(x+y)^{2}\leq 2x^{2}+2y^{2}$ and the Itô isometry applied to the stochastic integral, this is

$$\leq 2E\left[\left(\int_{s}^{t}b\,dr\right)^{2}\right]+2\int_{s}^{t}E\left[\sigma^{2}\right]dr.$$

Using Cauchy–Schwarz on the first term,

$$\leq 2(t-s)\int_{s}^{t}E[b^{2}]\,dr+2\int_{s}^{t}E\left[\sigma^{2}\right]dr.$$

Using the linear growth condition together with $(x+y)^{2}\leq 2x^{2}+2y^{2}$, so that $b^{2},\sigma^{2}\leq 2C^{2}(1+Y_{r}^{2})$, we bound this by

$$\leq c_{1}(t-s)\int_{s}^{t}E[1+Y_{r}^{2}]\,dr+c_{2}\int_{s}^{t}E\left[1+Y_{r}^{2}\right]dr$$

$$\leq b_{1}(t-s)+b_{2}\int_{s}^{t}E[Y_{r}^{2}]\,dr$$

for $0\leq s<t\leq T$.

Finally, we use the bound (5.2.14) for the $L^{2}$-moments:

$$E[Y_{r}^{2}]\leq \sum_{k=0}^{\infty}\frac{(AT)^{k}}{k!}+E[Y_{0}^{2}]=e^{AT}+E[Y_{0}^{2}]=:K<\infty.$$

Hence $E[(Y_{t}-Y_{s})^{2}]\leq (b_{1}+b_{2}K)(t-s)$, which tends to $0$ as $t-s\to 0$; this is precisely continuity of $Y_{t}$ in mean square.
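The linear bound $E[(Y_t-Y_s)^2]\le C(t-s)$ can also be observed in simulation. The sketch below (again using the hypothetical Ornstein–Uhlenbeck example $dY=-Y\,dt+dB$, $Y_0=1$, which satisfies the hypotheses) estimates the mean-square increment over gaps of several sizes and checks that the ratio to $t-s$ stays bounded.

```python
import numpy as np

def step(Y, n, rng, dt):
    """Advance all paths n Euler-Maruyama steps for dY = -Y dt + dB."""
    for _ in range(n):
        Y = Y - Y * dt + rng.normal(0.0, np.sqrt(dt), Y.shape[0])
    return Y

rng = np.random.default_rng(42)
n_paths, dt = 100_000, 0.001

Y = np.full(n_paths, 1.0)
Y = step(Y, 500, rng, dt)  # evolve to s = 0.5
Y_s = Y.copy()             # snapshot of Y_s

ratios = []
prev = 0
for gap in (10, 100, 400):           # increments t - s = gap * dt
    Y = step(Y, gap - prev, rng, dt)  # continue the same paths to time s + gap*dt
    prev = gap
    h = gap * dt
    ratios.append(np.mean((Y - Y_s) ** 2) / h)

# Each ratio E[(Y_t - Y_s)^2] / (t - s) stays bounded, consistent with the bound above.
print(ratios)
```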