Dealing with an SDE involving $\max\{X_t,0\}\,\mathrm{d}W_t$


I am considering the SDE $\mathrm{d}X_t=\ln(1+X_t^2)\mathrm{d}t+\max\{X_t,0\}\mathrm{d}W_t$, with the initial condition $X_0=a$. I am posed with the following daunting tasks:

  1. prove $\exists$ a unique strong solution $\forall$ $a$;

  2. show that $X_t>0$ (respectively $X_t<0$) almost surely $\forall$ $t\geq0$ if $a>0$ (respectively $a<0$), and that the solutions are deterministic in these cases as well;

  3. solve the SDE explicitly for $a=0$.

I am bamboozled by this SDE, particularly because I am unsure how to deal with the $\max\{X_t,0\}$ term, since it is not differentiable (all I know is that it is convex). For the first part I know I should be looking at linear growth and the Lipschitz condition, but I haven't been able to verify them. On top of that, $\ln(1+x^2)$ is not bounded. This leaves me unable to approach any of the above tasks. Any guidance and help is greatly appreciated!

Answer:
  1. We can use Theorem 5.2.1 in Oksendal (http://th.if.uj.edu.pl/~gudowska/dydaktyka/Oksendal.pdf), which gives conditions for existence and uniqueness of a strong solution. In your case $b(t,x)= \ln(1+x^2)$ and $\sigma(t,x) = \max\{x,0\}$. Both $b$ and $\sigma$ are measurable, since they are continuous. Further, $$\Big|\frac{\partial}{\partial x}b(t,x)\Big| = \Big|\frac{2x}{1+x^2}\Big|\leq 1$$ (since $1+x^2\geq 2|x|$), so $|b(t,x)-b(t,y)|\leq |x-y|$ by the Mean Value Theorem. You can also check directly that $|\sigma(t,x)-\sigma(t,y)|\leq |x-y|$ by considering the possible cases (both $x,y$ nonnegative, both negative, or one of each). This shows that the Lipschitz condition is satisfied. At the same time, $$|\sigma(t,x)|\leq |x| \leq 1+|x|$$ and $$|b(t,x)|=|\ln(1+x^2)|=|\ln(1+x^2) - \ln(1+0)|=|b(t,x)-b(t,0)|\leq |x-0|=|x|\leq 1+|x|, $$ so the linear-growth condition holds as well. Since the initial condition is a constant, it is independent of $\mathcal{F}_\infty= \sigma(\{W_t:t\geq0\})$. Thus Theorem 5.2.1 applies, and the SDE you have given has a unique strong solution $X_t$ for any constant $a$. Moreover, $t\mapsto X_t$ is a.s. continuous, by the same theorem.
  2. I need to think about this one; I will put some more work into it tomorrow, but in the meantime I will continue under the assumption that it is true. One possible route for the $a<0$ case: let $x(t)$ solve the ODE $\dot x=\ln(1+x^2)$, $x(0)=a$. Since $\int_a^0 \frac{\mathrm{d}x}{\ln(1+x^2)}$ diverges (the integrand behaves like $1/x^2$ near $0$), $x(t)<0$ for all $t\geq0$; and since $\max\{x(t),0\}=0$, this deterministic $x(t)$ also solves the SDE, so by the uniqueness from 1) we must have $X_t=x(t)$.
  3. To solve this, we will use Lemma 8.1.4 from Oksendal. The lemma states that if $g$ is bounded and continuous, then $u(a)=\mathbb{E}^a[g(X_t)]$ is continuous in $a$ for any fixed $t$. Now in 1) we showed that $t\mapsto X_t$ is a.s. continuous, and in 2) we showed that $X_t$ is deterministic. Thus on any compact interval $[0,T]$, the function $t\mapsto X_t$ is bounded. Suppose that $X_t\in [-M,M]$ for all $t\in[0,T]$; then we may find a bounded continuous function $g_M$ such that $g_M(y)=y$ for $y\in[-M,M]$. Thus $u(a)=\mathbb{E}^a[g_M(X_t)] =\mathbb{E}^a[X_t]=X_t^a $ is a continuous function of $a$ for all $a$. Here $X_t^a$ just indicates that $X_0=a$, and we again used that the process is deterministic. Now in 2) we showed that if $a < 0$, then $X_t < 0$ for all $t$, so $\max\{X_t,0\}=0$ and $X_t^a$ must satisfy $$\mathrm{d}X_t = \ln(1+X_t^2)\,\mathrm{d}t. $$ This is equivalent to the ODE $$\frac{\mathrm{d}x}{\mathrm{d}t} = \ln(1+x^2) $$ with initial condition $x(0)=a$. If we solve this ODE and then take the limit as $a\to 0^-$, we should get the solution at $a=0$. In fact, $x\equiv 0$ is an equilibrium of this ODE, and $X_t\equiv 0$ solves the original SDE with $a=0$ directly (both coefficients vanish at $0$), so by the uniqueness from 1) the explicit solution for $a=0$ is $X_t=0$ for all $t\geq0$.
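The Lipschitz and linear-growth bounds used in 1) are easy to sanity-check numerically. Here is a small sketch (an illustration, not part of the proof) that samples random pairs and verifies the inequalities for $b$ and $\sigma$:

```python
import numpy as np

# Numerical sanity check of the bounds claimed in 1):
#   |b(x) - b(y)| <= |x - y|,  |sigma(x) - sigma(y)| <= |x - y|,
#   |b(x)| <= 1 + |x|,         |sigma(x)| <= 1 + |x|.
rng = np.random.default_rng(0)

def b(x):
    return np.log(1.0 + x**2)

def sigma(x):
    return np.maximum(x, 0.0)

x = rng.uniform(-100.0, 100.0, size=100_000)
y = rng.uniform(-100.0, 100.0, size=100_000)

# Small tolerance guards against floating-point rounding.
lipschitz_b = np.abs(b(x) - b(y)) <= np.abs(x - y) + 1e-12
lipschitz_sigma = np.abs(sigma(x) - sigma(y)) <= np.abs(x - y) + 1e-12
growth = (np.abs(b(x)) <= 1.0 + np.abs(x)) & (np.abs(sigma(x)) <= 1.0 + np.abs(x))

print(lipschitz_b.all(), lipschitz_sigma.all(), growth.all())
```

Of course, random sampling cannot replace the Mean Value Theorem argument above; it only confirms that no obvious counterexample exists.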
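Similarly, the conclusions of 2) and 3) can be illustrated with a naive Euler-Maruyama simulation (a numerical sketch assuming NumPy, not a proof): started at $a<0$ the diffusion term $\max\{X_t,0\}$ vanishes along the whole path, so two independent Brownian paths produce the same trajectory, and started at $a=0$ the scheme stays identically zero.

```python
import numpy as np

# Euler-Maruyama illustration of 2) and 3): for a < 0 the path never depends
# on the Brownian increments, and for a = 0 the path is identically zero.
rng = np.random.default_rng(1)
T, n = 1.0, 10_000
dt = T / n

def euler_maruyama(a, dW):
    """One Euler-Maruyama path of dX = ln(1+X^2) dt + max(X,0) dW; returns X_T."""
    x = a
    for dw in dW:
        x = x + np.log(1.0 + x**2) * dt + max(x, 0.0) * dw
    return x

# Two independent sets of Brownian increments.
dW1 = rng.normal(0.0, np.sqrt(dt), n)
dW2 = rng.normal(0.0, np.sqrt(dt), n)

# Started below zero, the drift pushes x toward 0 but never across it,
# so max(x, 0) = 0 throughout and both paths coincide exactly.
x1 = euler_maruyama(-1.0, dW1)
x2 = euler_maruyama(-1.0, dW2)
print(abs(x1 - x2) < 1e-12)

# Started at zero, both coefficients vanish and the path stays at 0.
x0 = euler_maruyama(0.0, dW1)
print(x0 == 0.0)
```

The discretized path only approximates the true solution, but since $\max\{x,0\}=0$ along a negative path, the determinism observed here is exact rather than approximate.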