Hamilton-Jacobi-Bellman, first order condition and derivative of value function

Let $I = \{1, \ldots, n\}$ index the controls. Let $x \in \mathbb{R}$ denote the state and let $u := (u_1, \ldots, u_n) \in \mathbb{R}^n$ denote the control vector. The state equation reads \begin{align} \dot x(t) = f(x(t), u(t)). \end{align}

The value function is given by \begin{align} V(x_\tau) := \max_{u}\int^\infty_\tau{\sum_{i \in I}e^{-r(t-\tau)} F_i(x(t), u(t))\,dt}, \end{align} where each $F_i$ is an objective function and $r>0$ is the discount rate (discounting back to $\tau$ so that the problem is stationary). The Hamilton-Jacobi-Bellman (HJB) equation then reads \begin{align} rV(x) = \max_u\left\{\sum_{i \in I}{F_i(x, u)} + V'(x) f(x, u)\right\}. \label{eq:hjb_co} \end{align}

The first-order conditions (FOC) read \begin{align} \sum_{i \in I}{\frac{\partial F_i(x, u)}{\partial u_j}} + V'(x)\frac{\partial f(x, u)}{\partial u_j} = 0 \quad \text{for } j = 1, \ldots, n. \label{eq:foc_co} \end{align}

I would like to solve the FOC for $V'(\cdot)$. Assuming $\partial f(x,u)/\partial u_j \neq 0$ for every $j$, is the following statement true? \begin{align} V'(x) = -\sum_{i \in I}{\frac{\partial F_i(x, u)}{\partial u_1}}\left[\frac{\partial f(x, u)}{\partial u_1}\right]^{-1} = \ldots = -\sum_{i \in I}{\frac{\partial F_i(x, u)}{\partial u_n}}\left[\frac{\partial f(x, u)}{\partial u_n}\right]^{-1} \end{align}
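For example, in the single-control case $n = 1$ the FOC is a single scalar equation, and (assuming $\partial f/\partial u \neq 0$ at the maximizer) plain algebraic division gives \begin{align} V'(x) = -\left[\sum_{i \in I}\frac{\partial F_i(x, u)}{\partial u}\right]\left[\frac{\partial f(x, u)}{\partial u}\right]^{-1}, \end{align} which is just rearranging the scalar FOC, not cancelling differentials. My question is whether the same rearrangement is valid term by term for each $u_j$ when $n > 1$.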

  • Could you point me to some literature on this?
  • Also, I am presumably not allowed to simply cancel the $\partial u_j$ terms, which would otherwise do the trick, right?