I have the following math problem:
A plane flies in two dimensions.
$x(t)$ is the movement relative to the ground, $v_x = \frac{dx}{dt}$.
$z(t)$ is the altitude, $v_z = \frac{dz}{dt}$.
$\phi(t)$ is the pitch angle, $\omega = \frac{d\phi}{dt}$.
The initial state is some known $x_0, z_0, v_{x0}, v_{z0}, \phi_0, \omega_0$. At any time the pilot can apply a thrust $T(t)$ along the body axis and a pitch torque $P(t)$, which are limited to the intervals
$0<T<T_{max}$
$-P_{max} < P < P_{max}$
There is a differential equation with known $f$ such that
$\frac{d (x, z, v_x, v_z, \phi, \omega)}{dt} = f(x, z, v_x, v_z, \phi, \omega, T, P)$.
Find $T(t)$ and $P(t)$ (or alternatively $T(x)$, $P(x)$) so that the plane reaches a point $(x_1, z_1)$ as fast as possible.
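To make the setup concrete, here is a minimal rollout sketch with a *hypothetical* $f$ (point mass with thrust along the body axis, gravity, linear drag; the constants `G`, `MASS`, `INERTIA`, `DRAG` are placeholder values, not part of the problem statement). The real, known $f$ would simply replace the function below:

```python
import numpy as np

# Placeholder constants for an illustrative f -- the actual f is problem-specific.
G, MASS, INERTIA, DRAG = 9.81, 1.0, 1.0, 0.05

def f(s, T, P):
    """Hypothetical right-hand side: thrust T along the pitch direction, torque P."""
    x, z, vx, vz, phi, omega = s
    return np.array([
        vx,
        vz,
        (T / MASS) * np.cos(phi) - DRAG * vx,
        (T / MASS) * np.sin(phi) - G - DRAG * vz,
        omega,
        P / INERTIA,
    ])

def rk4_step(s, T, P, dt):
    k1 = f(s, T, P)
    k2 = f(s + 0.5 * dt * k1, T, P)
    k3 = f(s + 0.5 * dt * k2, T, P)
    k4 = f(s + dt * k3, T, P)
    return s + (dt / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)

def simulate(s0, T_seq, P_seq, dt):
    """Roll the state forward under piecewise-constant controls."""
    s = np.asarray(s0, dtype=float)
    traj = [s]
    for T, P in zip(T_seq, P_seq):
        s = rk4_step(s, T, P, dt)
        traj.append(s)
    return np.array(traj)
```

With the controls discretized like this, "find $T(t)$, $P(t)$" becomes a finite-dimensional optimization over `T_seq` and `P_seq` subject to the box bounds above, which is the single-shooting formulation the question is about.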
I'm mostly looking for some inspiration regarding possible approaches to a numerical solution. The main problem I ran into is that derivatives such as $\frac{\partial x_e}{\partial T_0}$ of late-time positions $x_e, z_e$ with respect to early-time thrust $T_0$ and torque $P_0$ decisions grow exponentially with the distance traveled, to the point where any gradient-based minimization rapidly becomes unstable.
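The failure mode I mean can be reproduced with a toy scalar system: for $\dot{s} = A s + u$ with an unstable mode $A > 0$ (forward Euler, step `DT`; all names and values here are illustrative, not from the aircraft model), the finite-difference sensitivity of the final state with respect to the first control grows exponentially with the horizon length:

```python
import numpy as np

A, DT = 0.5, 0.1  # toy values: unstable mode strength and step size

def rollout(u, s0=0.0):
    """Forward-Euler rollout of the scalar toy system ds/dt = A*s + u."""
    s = s0
    for uk in u:
        s = s + DT * (A * s + uk)
    return s

def sensitivity_to_first_control(n_steps, eps=1e-6):
    """Central finite difference of the final state w.r.t. u[0]."""
    up = np.zeros(n_steps)
    um = np.zeros(n_steps)
    up[0] += eps
    um[0] -= eps
    return (rollout(up) - rollout(um)) / (2 * eps)
```

For this toy model the sensitivity is exactly $\Delta t\,(1 + A\,\Delta t)^{N-1}$, i.e. exponential in the number of steps $N$, which is the same blow-up I observe in the single-shooting gradients for the plane.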