Finite Difference for Hamilton-Jacobi-Bellman without boundary conditions


Let $t\in\mathbb{R}_+$ denote time, $x \in X$ the state and $u \in U$ the control. The objective function is $F:X \times U \to\mathbb{R}$ and $f:X \times U \to\mathbb{R}$ is the law of motion for the state, i.e. \begin{align} \dot{x}(t) = f(x(t),u(t)) \end{align} Define the value function $v:X\to\mathbb{R}$ by \begin{align} v(x_0):=\max_{\{u(t)\}_{t \geq 0}}\left[\int^\infty_0{e^{-\rho t}F(x(t),u(t))dt}\right] \end{align} where $x_0:=x(0)$ is a given initial condition and $\rho\in\mathbb{R}_{++}$ is a parameter (the rate of time preference). The Hamilton-Jacobi-Bellman equation in current-value form reads \begin{align} \rho v(x) = \max_u[F(x,u) + v'(x)f(x,u)] \end{align} which is an ODE. If one can solve it for $v(\cdot)$, one can recover the optimal path
\begin{align} \{u^*(t):t\in\mathbb{R}_+\} = \arg\max_{\{u(t)\}_{t \geq 0}}\left[\int^\infty_0{e^{-\rho t}F(x(t),u(t))dt}\right] \end{align} I try to solve the HJB equation by value function iteration: guess an initial value function, then iterate until it converges to the true value. I update via an implicit method à la \begin{align} \frac{v_{j+1}(x_n)-v_j(x_n)}{\Delta} + \rho v_{j+1}(x_n) = F(x_n,u^*_n) + v'_{j+1}(x_n)f(x_n,u^*_n) \end{align} where $\Delta$ is a step-size parameter, $j=0,1,2,\ldots$ indexes the iterations and $x_n$ is an element of the grid on the state space \begin{align} \mathbf{x} := [x_n]_{n=1}^N\in X^N \end{align} and $u^*_n$ solves \begin{align} u^*_n = \arg\max_u [F(x_n,u) + v'_j(x_n)f(x_n,u)] \end{align}
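To make the iteration concrete, here is a minimal sketch in Python/NumPy for a hypothetical linear-quadratic test problem (not from the question): $F(x,u) = -(x^2+u^2)$ and $f(x,u)=u$, chosen because the HJB then has the closed-form solution $v(x) = -ax^2$ with $a = (\sqrt{\rho^2+4}-\rho)/2$, so the discretization can be checked. The grid, the value of $\Delta$, and the upwind choice of stencil are all my assumptions:

```python
import numpy as np

# Hypothetical test problem: F(x,u) = -(x^2 + u^2), f(x,u) = u.
# HJB: rho*v = max_u[F + v'*u]  =>  u* = v'(x)/2, and the exact
# solution is v(x) = -a*x^2 with a = (sqrt(rho^2 + 4) - rho)/2.
rho, Delta, tol = 0.05, 100.0, 1e-6
N = 400
x = np.linspace(-1.0, 1.0, N)
dx = x[1] - x[0]

v = -x**2                                   # initial guess v_0
for j in range(2000):
    # forward and backward differences; one-sided at the endpoints
    dvf, dvb = np.empty(N), np.empty(N)
    dvf[:-1] = (v[1:] - v[:-1]) / dx
    dvb[1:]  = (v[1:] - v[:-1]) / dx
    dvf[-1], dvb[0] = dvb[-1], dvf[0]

    # upwind: forward stencil if its implied drift u = dv/2 is
    # positive, backward if negative, u = 0 at the steady state
    uf, ub = dvf / 2.0, dvb / 2.0
    dv = np.where(uf > 0, dvf, np.where(ub < 0, dvb, 0.0))
    u = dv / 2.0
    F = -(x**2 + u**2)

    # tridiagonal A with (A @ v)_n ~ v'(x_n)*u_n, upwinded; here the
    # drift points inward, so no row references a point off the grid
    up, um = np.maximum(u, 0.0), np.minimum(u, 0.0)
    A = (np.diag((um - up) / dx)
         + np.diag(up[:-1] / dx, 1)
         + np.diag(-um[1:] / dx, -1))

    # implicit update: ((1/Delta + rho) I - A) v_{j+1} = F + v_j/Delta
    B = (1.0 / Delta + rho) * np.eye(N) - A
    v_new = np.linalg.solve(B, F + v / Delta)
    if np.max(np.abs(v_new - v)) < tol:
        v = v_new
        break
    v = v_new

a = (np.sqrt(rho**2 + 4.0) - rho) / 2.0
err = np.max(np.abs(v - (-a * x**2)))       # discretization error
```

In this particular example the optimal drift points toward the interior, so each endpoint only ever needs its inward one-sided difference, which is why the scheme gets away without an explicit boundary condition.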

To get an expression for $v'(\cdot)$, approximate the derivative by finite differences: \begin{align} v'(x_n) \approx \begin{cases} \displaystyle\frac{v(x_{n+1}) - v(x_n)}{\Delta x}\quad&\text{forward difference}\\[2mm] \displaystyle\frac{v(x_{n+1}) - v(x_{n-1})}{2\Delta x}\quad&\text{central difference}\\[2mm] \displaystyle\frac{v(x_n) - v(x_{n-1})}{\Delta x}\quad&\text{backward difference}\\ \end{cases} \end{align}

where $\Delta x$ denotes the uniform spacing of the grid $\mathbf{x}$, i.e. \begin{align} \Delta x:=x_{n+1} - x_n = x_n - x_{n-1}~\forall n \end{align}
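As a quick sanity check of the three stencils, one can verify the usual accuracy orders numerically on a hypothetical smooth test function with a known derivative ($v(x)=\sin x$; this example is mine, not part of the question):

```python
import numpy as np

# Test function with known derivative: v(x) = sin(x), v'(x) = cos(x).
N = 101
x = np.linspace(0.0, 1.0, N)
dx = x[1] - x[0]
v = np.sin(x)

# the three stencils, evaluated on the interior grid points
forward  = (v[2:]   - v[1:-1]) / dx
central  = (v[2:]   - v[:-2])  / (2 * dx)
backward = (v[1:-1] - v[:-2])  / dx

exact = np.cos(x[1:-1])
err_f = np.max(np.abs(forward  - exact))   # O(dx):   ~ dx/2 * |v''|
err_c = np.max(np.abs(central  - exact))   # O(dx^2): ~ dx^2/6 * |v'''|
err_b = np.max(np.abs(backward - exact))   # O(dx)
```

Halving `dx` should roughly halve the one-sided errors but quarter the central one.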

Problem

It's known that the forward difference is unavailable at the upper endpoint $x_N$, the backward difference at the lower endpoint $x_1$, and the central difference at both, because each would reference a grid point outside $\mathbf{x}$. One usually deals with this by imposing a boundary condition on either side. However, suppose no boundary condition is given. I can still approximate $v'(x)$ by a combination of forward, backward and central differences. For instance (case 1) \begin{align} v'(x_n) \approx \begin{cases} \displaystyle\frac{v(x_{n+1}) - v(x_n)}{\Delta x}\quad&n=1\\[2mm] \displaystyle\frac{v(x_{n+1}) - v(x_{n-1})}{2\Delta x}\quad&1 < n <N\\[2mm] \displaystyle\frac{v(x_n) - v(x_{n-1})}{\Delta x}\quad&n=N \end{cases} \end{align}

or (case 2)
\begin{align} v'(x_n) \approx \begin{cases} \displaystyle\frac{v(x_{n+1}) - v(x_n)}{\Delta x}\quad&1\leq n <N\\[2mm] \displaystyle\frac{v(x_n) - v(x_{n-1})}{\Delta x}\quad&n=N \end{cases} \end{align} Note that in the second case $v'(x_{N-1})=v'(x_N)$.
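The two cases can be compared directly on a function with a known derivative (again a hypothetical test with $v(x)=\sin x$; none of this is from the question):

```python
import numpy as np

def dv_case1(v, dx):
    """Case 1: central in the interior, one-sided at both endpoints."""
    d = np.empty_like(v)
    d[0]    = (v[1]  - v[0])   / dx        # forward at n = 1
    d[1:-1] = (v[2:] - v[:-2]) / (2 * dx)  # central for 1 < n < N
    d[-1]   = (v[-1] - v[-2])  / dx        # backward at n = N
    return d

def dv_case2(v, dx):
    """Case 2: forward everywhere except the last point, backward there."""
    d = np.empty_like(v)
    d[:-1] = (v[1:] - v[:-1]) / dx
    d[-1]  = (v[-1] - v[-2])  / dx         # equals d[-2] by construction
    return d

x = np.linspace(0.0, 1.0, 51)
dx = x[1] - x[0]
v = np.sin(x)
d1, d2 = dv_case1(v, dx), dv_case2(v, dx)
e1 = np.max(np.abs(d1 - np.cos(x)))        # O(dx) only near the ends
e2 = np.max(np.abs(d2 - np.cos(x)))        # O(dx) everywhere
```

Both are consistent approximations of $v'$ on a fixed, smooth $v$; the differences the question observes arise when the slopes are fed back into the fixed-point iteration, where the stencil choice changes the discrete dynamics, especially near the endpoints.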

  • I was wondering what's common practice in the math profession.

I'm asking because I get significantly different results in the two cases (irrespective of $N$).

PS: Note that I'm neither a mathematician nor a native English speaker, so I apologize for any misunderstandings.

Answer

If you have no further hypotheses, there is no good numerical method to approximate the derivative.

Take for example $$f_n(x)=\frac1n\sin\left(n^2x\right)$$ so $$f'_n(x)=n\cos\left(n^2x\right)$$

Note that if you approximate $f_n$ by $0$ the error is $O(1/n)$, but if you approximate $f'_n$ by $0'=0$ the error is of order $n$, which is unbounded as $n\to\infty$.
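This counterexample can be checked numerically (my illustration, with an arbitrarily chosen $n$ and grid): $f_n$ is uniformly small, yet its true derivative is huge, and a central difference on a fixed grid cannot recover it once the oscillation is finer than $\Delta x$.

```python
import numpy as np

# f_n(x) = sin(n^2 x)/n is uniformly within 1/n of 0, but
# f_n'(x) = n*cos(n^2 x) is of size n.  With dx = 0.01 and n = 100
# the oscillation period 2*pi/n^2 is far below dx, so the central
# difference bears no relation to the true derivative.
n = 100
x = np.linspace(0.0, 1.0, 101)
dx = x[1] - x[0]
f = np.sin(n**2 * x) / n
df_true = n * np.cos(n**2 * x)
df_fd = (f[2:] - f[:-2]) / (2 * dx)          # central, interior points

sup_f   = np.max(np.abs(f))                  # at most 1/n = 0.01
sup_err = np.max(np.abs(df_fd - df_true[1:-1]))
```

Here `df_fd` is bounded by $1/(n\,\Delta x)$ in magnitude, while the true derivative reaches size $n$, so the finite-difference error is of order $n$ no matter which stencil is used.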