optimal control problem - value of Hamiltonian for Mayer or Lagrange formulation


I am reviewing the application of Pontryagin's principle (in its minimum formulation) to minimum-time problems. However, I am confused about the constant value of the Hamiltonian for this class of problems. Imagine we have a system governed by

$$\dot{x} = ax+bu$$

and that we want to achieve $x(t_F) = 0$ in minimum time.

The Bolza problem reduced to its Lagrange form gives

$$ J = \Phi(t,x) + \int_{t_0}^{t_F}\Psi(x,u) dt$$

with $$\Phi(t,x) = 0,$$ $$\Psi(x,u) = 1$$

so the Hamiltonian to be minimized would be

$$ H = 1+ \lambda (ax+bu)$$

By applying the condition $H(t_F) + \frac{d\Phi}{dt_F} = 0$ we get $$H(t_F) = 0$$

Since the problem is autonomous, we know that $\frac{dH}{dt} = \frac{\partial H}{\partial t} = 0$, so $H$ must be constant and equal to $0$ along the entire trajectory.
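As a numerical sanity check of the Lagrange-form claim $H \equiv 0$: the scalar problem above has a closed-form optimal solution once we assume a control bound (the question leaves it implicit; $|u| \le 1$ and the values $a = -0.5$, $b = 1$, $x_0 = 1$ below are illustrative, for which $u \equiv -1$ is optimal).

```python
import math

# Illustrative scalar minimum-time problem (parameter values assumed for the
# demo): xdot = a*x + b*u with assumed bound |u| <= 1; drive x0 > 0 to zero.
# For a < 0, b > 0, x0 > 0 the optimal control is u = -1 throughout.
a, b, x0 = -0.5, 1.0, 1.0
u = -1.0

def x(t):
    # Closed-form state: x(t) = (x0 + b*u/a)*exp(a*t) - b*u/a
    return (x0 + b * u / a) * math.exp(a * t) - b * u / a

# Final time T solves x(T) = 0.
T = math.log((b * u / a) / (x0 + b * u / a)) / a

# Costate: lamdot = -dH/dx = -a*lam, so lam(t) = lam(T)*exp(-a*(t - T)).
# Transversality H(t_F) = 0 with x(T) = 0 gives 1 + lam(T)*b*u = 0.
lam_T = -1.0 / (b * u)

def lam(t):
    return lam_T * math.exp(-a * (t - T))

# Lagrange-form Hamiltonian H = 1 + lam*(a*x + b*u) is zero along the path.
for t in [0.0, 0.25 * T, 0.5 * T, 0.75 * T, T]:
    H = 1.0 + lam(t) * (a * x(t) + b * u)
    assert abs(H) < 1e-9
assert abs(x(T)) < 1e-9
```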

If we use the Mayer formulation instead, we have $\Phi(t,x) = t_F$ and $\Psi(x,u) = 0$.

By repeating the procedure we get $H(t_F) + \frac{d\Phi}{dt_F} = H(t_F) +1 = 0$

from which it follows that $H(t_F) = -1$. In this case too, $H$ must be constant along the entire trajectory.
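The same closed-form example (again with the assumed bound $|u| \le 1$ and illustrative values $a = -0.5$, $b = 1$, $x_0 = 1$) can be checked in Mayer form, where the Hamiltonian $H = \lambda(ax + bu)$ comes out identically $-1$:

```python
import math

# Same illustrative problem (assumed |u| <= 1; a, b, x0 chosen for the demo),
# now in Mayer form: Phi = t_F, Psi = 0, so H = lam*(a*x + b*u).
a, b, x0 = -0.5, 1.0, 1.0
u = -1.0

def x(t):
    return (x0 + b * u / a) * math.exp(a * t) - b * u / a

T = math.log((b * u / a) / (x0 + b * u / a)) / a  # solves x(T) = 0

# Transversality H(t_F) + 1 = 0 with x(T) = 0 gives lam(T)*b*u = -1,
# i.e. the same costate as in the Lagrange form.
lam_T = -1.0 / (b * u)

def lam(t):
    return lam_T * math.exp(-a * (t - T))  # lamdot = -a*lam

# Mayer-form Hamiltonian is constant and equal to -1 along the trajectory.
for t in [0.0, 0.5 * T, T]:
    H_mayer = lam(t) * (a * x(t) + b * u)
    assert abs(H_mayer + 1.0) < 1e-9
```

Note that, pointwise, the Lagrange Hamiltonian $1 + \lambda(ax+bu)$ equals the Mayer Hamiltonian plus the running cost $\Psi = 1$, which is consistent with the two constants coming out as $0$ and $-1$.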

Can someone explain this (apparent?) contradiction to me? Did I misunderstand something? Or is the Hamiltonian constant, but the constant arbitrary and dependent on the formulation?

Thanks!!


1 Answer

Pontryagin's maximum principle states: let $u(t)$, $0 \le t \le T$, be an admissible control which transfers the system from the state $x_0$ to the state $x_1$, and let $x(t)$ be the corresponding trajectory, so that $x(0) = x_0$ and $x(T) = x_1$. In order that $u(t)$ and $x(t)$ be time-optimal, it is necessary that there exist a nonzero, continuous vector function $p(t)$, corresponding to $u(t)$ and $x(t)$, such that:

1. For all $t$, $0 \le t \le T$, the function $H(p(t), x(t), u)$ of the variable $u$ attains its maximum at the point $u = u(t)$, i.e.

$$H[p(t), x(t), u(t)] = M[p(t), x(t)],$$

where $M[p, x] = \max_{u} H[p, x, u]$.

2. At the terminal time $T$ the relation

$$M[p(T), x(T)] \ge 0$$

is satisfied.
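Condition 1 can be illustrated numerically for the scalar system from the question. Assuming the admissible set is $|u| \le 1$ (not stated explicitly above), the maximizer of $H(p, x, u) = p(ax + bu)$ is the bang-bang law $u = \operatorname{sign}(bp)$; a brute-force grid search over candidate controls confirms this. The sample values of $a$, $b$, $p$, and $x$ are illustrative.

```python
# Brute-force check that u = sign(b*p) maximizes H(p, x, u) = p*(a*x + b*u)
# over the (assumed) admissible set |u| <= 1, for a few sample (p, x) pairs.
a, b = -0.5, 1.0

def H(p, x, u):
    return p * (a * x + b * u)

grid = [i / 100.0 for i in range(-100, 101)]  # candidate controls in [-1, 1]
for p, x in [(1.0, 0.3), (-2.0, 1.5), (0.7, -0.4)]:
    u_star = 1.0 if b * p > 0 else -1.0       # bang-bang maximizer
    best = max(grid, key=lambda u: H(p, x, u))
    assert H(p, x, u_star) >= H(p, x, best) - 1e-12
```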