Optimal Stopping problem: Maximize the function $E^x[\int_0^\tau\theta e^{-pt} X_t dt + e^{-p\tau}X_\tau]$


This is Example 10.3.1 in Oksendal's book "Stochastic Differential Equations", 5th edition. The problem is to find the supremum $g^*(s,x)$ (if it exists) and the optimal stopping time $\tau$ such that: $$g^*(s,x) = \sup_\tau E^x[\int_0^\tau\theta e^{-pt} X_t dt + e^{-p\tau}X_\tau]$$

$X_t$ is a geometric brownian motion and $B_t$ is a 1D brownian motion: $$dX_t = \alpha X_t dt + \beta X_t dB_t$$ $$X_0 = x > 0$$
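As a quick sanity check (mine, not from the book), the GBM above has the exact solution $X_t = x\exp((\alpha - \beta^2/2)t + \beta B_t)$, so $E^x[X_t] = xe^{\alpha t}$ — a fact the derivation below relies on. The parameter values here are purely illustrative:

```python
import numpy as np

# Sanity check: for dX_t = alpha X_t dt + beta X_t dB_t, X_0 = x,
# the exact solution is X_t = x * exp((alpha - beta^2/2) t + beta B_t),
# hence E^x[X_t] = x * exp(alpha t). Illustrative parameter values.
rng = np.random.default_rng(0)
alpha, beta, x, t = 0.05, 0.3, 1.0, 2.0
n = 1_000_000

B_t = np.sqrt(t) * rng.standard_normal(n)          # B_t ~ N(0, t)
X_t = x * np.exp((alpha - 0.5 * beta**2) * t + beta * B_t)

print(X_t.mean())              # Monte Carlo estimate of E^x[X_t]
print(x * np.exp(alpha * t))   # closed form x e^{alpha t}
```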

$E^x$ is the expectation for processes started at the value $x>0$ (any other starting value has probability zero); $\alpha, \beta, p > 0$ and $\theta > 0$ are constants.

Here I'm re-doing the example:

Let $$f(s,x) = \theta e^{-ps} x$$ $$g(s,x) = e^{-ps} x$$ $$G(s,x,u) = e^{-ps} x + u$$

and the new process (with the drift $f$ evaluated along the process): $dZ_t = \begin{bmatrix} 1 \\ \alpha X_t \\ f(s+t,X_t) \\ \end{bmatrix} dt + \begin{bmatrix} 0 \\ \beta X_t \\ 0 \\ \end{bmatrix} dB_t$

The infinitesimal generator applied to $G$ is: $$(L_Z G)(s,x,u) = f(s,x)\frac{\partial G}{\partial u} + \frac{\partial G}{\partial s} + \alpha x \frac{\partial G}{\partial x} + \frac{1}{2}\beta^2 x^2\frac{\partial^2 G}{\partial x^2} = (-p + \alpha + \theta)e^{-ps}x$$
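The generator computation can be verified symbolically. The following is my own check (not from the book) that applying the generator of $Z_t$ to $G(s,x,u) = e^{-ps}x + u$ gives $(-p+\alpha+\theta)e^{-ps}x$:

```python
import sympy as sp

# Symbolic check that the generator of Z_t applied to
# G(s,x,u) = e^{-ps} x + u equals (-p + alpha + theta) e^{-ps} x.
s, x, u, p, alpha, beta, theta = sp.symbols('s x u p alpha beta theta', positive=True)

G = sp.exp(-p * s) * x + u
f = theta * sp.exp(-p * s) * x          # running reward f(s,x)

LG = (f * sp.diff(G, u)                 # f * dG/du
      + sp.diff(G, s)                   # dG/ds
      + alpha * x * sp.diff(G, x)       # alpha x dG/dx
      + sp.Rational(1, 2) * beta**2 * x**2 * sp.diff(G, x, 2))

print(sp.simplify(LG - (-p + alpha + theta) * sp.exp(-p * s) * x))  # 0
```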

It is known that while $(L_Z G)(s,x,u) > 0$, it cannot be optimal to stop. Define the sets: $$U = \{(s,x); (L_Z G)(s,x,u) > 0\}$$ $$D = \{(s,x); G(s,x,u)<G^*(s,x,u)\}$$

It is known that $U \subset D$, and the first exit time $\tau$ from $D$ is an optimal stopping time.

Because of the form of $G(s,x,u)$:

If $p \geq \alpha + \theta$ then $(L_Z G)(s,x,u) \leq 0$, so $U = \emptyset$, $\tau = 0$ and $G^*(s,x,u) = G(s,x,u) = e^{-ps} x + u$.

From here, I do not understand why the author says:

If $p \leq \alpha$ then $(L_Z G)(s,x,u) > 0$, $U = \mathbb{R}$, $\tau = \infty$ and $G^*(s,x,u) = \infty$.

If $\alpha < p \leq \alpha + \theta$ then $(L_Z G)(s,x,u) > 0$, $U = \mathbb{R}$, $\tau = \infty$ and $G^*(s,x,u) = \frac{\theta}{p-\alpha}e^{-ps} x + u$.

From what I see, if $p \leq \alpha + \theta$ then $(L_Z G)(s,x,u) > 0$, $U = \mathbb{R}$, $\tau = \infty$, $G^*(s,x,u) = \infty$, and that's all.

Why is the case $\alpha < p \leq \alpha + \theta$ different from $p \leq \alpha$? Why isn't the single case $p \leq \alpha + \theta$ enough?

Where does the function $G^*(s,x,u) = \frac{\theta}{p-\alpha}e^{-ps} x + u$ come from?

Edit: There's a mistake in Oksendal's book (both 5th and 6th edition): it says that the process $Z_t$ is: $$dZ_t = \begin{bmatrix} 1 \\ \alpha X_t \\ e^{-pt} X_t \\ \end{bmatrix} dt + \begin{bmatrix} 0 \\ \beta X_t \\ 0 \\ \end{bmatrix} dB_t$$ it should say that the process $Z_t$ is: $$dZ_t = \begin{bmatrix} 1 \\ \alpha X_t \\ \theta e^{-pt} X_t \\ \end{bmatrix} dt + \begin{bmatrix} 0 \\ \beta X_t \\ 0 \\ \end{bmatrix} dB_t$$



Best answer:

Thanks to my teacher, I finally got it. I have to use Corollary 10.1.8 in Oksendal's book:

[Image: statement of Corollary 10.1.8 from Oksendal's book.] This corollary gives a way to construct the optimal function $g^*(s,x)$.

In my case: $$Z_t = (s+t,\,X_t,\,u+\int^t_0\theta e^{-p(s+r)} X_r dr)$$ $$h_0 = G(s,x,u) = e^{-ps} x + u$$ $$h_1 = \sup_{t\geq0}E^{s,x,u}[G(Z^{s,x,u}_t)] = \sup_{t\geq0}E^x[\int^t_0f(r+s,X_r) dr + g(t+s,X_t)] = \sup_{t\geq0}E^x[\int^t_0\theta e^{-p(r+s)} X_r dr + e^{-p(t+s)}X_t+u]$$ $$=\sup_{t\geq0}[\int^t_0\theta e^{-p(r+s)} E^x[X_r] dr + e^{-p(t+s)}E^x[X_t]+u]$$ $$=\sup_{t\geq0}[\int^t_0\theta e^{-p(r+s)} xe^{\alpha r} dr + e^{-p(t+s)}xe^{\alpha t}+u]$$ $$=\sup_{t\geq0}[x\theta e^{-p s}\int^t_0 e^{(\alpha-p)r} dr + xe^{-p s}e^{(\alpha-p)t}+u]$$ $$=\sup_{t\geq0}[x\theta e^{-p s}(\frac{e^{(\alpha-p)t}}{\alpha-p}- \frac{1}{\alpha-p})+ xe^{-p s}e^{(\alpha-p)t}+u]$$ (for $\alpha \neq p$). Now we let $t\rightarrow \infty$ to check the supremum of the expression inside the brackets: $$g^*(t,x) = x\theta e^{-p s} (\frac{e^{(\alpha-p)t}}{\alpha-p}- \frac{1}{\alpha-p})+ xe^{-p s}e^{(\alpha-p)t}+u$$
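The integration step above can be checked symbolically. This is my own verification (not from the book), writing the exponent as $k = \alpha - p$ and assuming $k \neq 0$, i.e. $p \neq \alpha$:

```python
import sympy as sp

# Symbolic check of the integration step:
#   int_0^t theta e^{-p(r+s)} x e^{alpha r} dr
#     = x theta e^{-ps} int_0^t e^{k r} dr        (with k = alpha - p)
#     = x theta e^{-ps} (e^{k t} - 1) / k.
s, x, t, r, theta, p = sp.symbols('s x t r theta p', positive=True)
k = sp.symbols('k', nonzero=True)        # k = alpha - p, assumed nonzero

lhs = x * theta * sp.exp(-p * s) * sp.integrate(sp.exp(k * r), (r, 0, t))
rhs = x * theta * sp.exp(-p * s) * (sp.exp(k * t) - 1) / k

print(sp.simplify(lhs - rhs))  # 0
```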

Then we have:

if $p\leq\alpha$:

The exponent $\alpha - p$ is nonnegative, so $\lim_{t\rightarrow \infty}g^*(t,x) = \infty$ (when $p = \alpha$ the exponent is zero and the integral term itself grows like $\theta x e^{-ps}\,t$).

I already checked the case when $p\geq\alpha+\theta$.

The remaining case is $\alpha < p \leq \alpha+\theta$. Here the exponent $\alpha - p$ is negative, so the exponentials go to zero and: $$\lim_{t\rightarrow \infty}g^*(t,x) = \frac{x\theta e^{-p s}}{p-\alpha}+u$$ which is exactly the book's $G^*(s,x,u) = \frac{\theta}{p-\alpha}e^{-ps} x + u$.
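The limit in this case can also be checked numerically. Below is a small sketch with illustrative parameter values satisfying $\alpha < p \leq \alpha + \theta$, evaluating the bracketed expression at a large $t$ and comparing against the claimed limit:

```python
import numpy as np

# Numeric check (illustrative parameters, alpha < p <= alpha + theta) that
#   g*(t) = x theta e^{-ps} ((e^{(alpha-p)t} - 1)/(alpha - p))
#           + x e^{-ps} e^{(alpha-p)t} + u
# converges to x theta e^{-ps} / (p - alpha) + u as t -> infinity.
alpha, theta, p = 0.03, 0.05, 0.06      # alpha < p <= alpha + theta
x, s, u = 1.0, 0.0, 0.0

def g_star(t):
    e = np.exp((alpha - p) * t)
    return (x * theta * np.exp(-p * s) * (e - 1) / (alpha - p)
            + x * np.exp(-p * s) * e + u)

limit = x * theta * np.exp(-p * s) / (p - alpha) + u
print(g_star(1000.0), limit)   # both close to theta/(p - alpha) = 5/3
```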