The general formulation of a stochastic control problem with finite horizon is
$$V(x)=\sup_{u\in\mathbb{U}}E\left(G(X_{T}^{u}) + \int_{0}^{T}F(s,X_{s}^{u}, u_{s})\,ds\,\Big|\,X_{0}^{u}=x\right)$$
where $X_{s}^{u}$ is the solution of the controlled SDE $dX_{t}^{u}=\mu(t,X_{t}^{u},u_{t})dt + \sigma(t,X_{t}^{u},u_{t})dW_{t}$, with $W$ a Brownian motion.
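For concreteness (this is just my own illustrative special case, not from a reference): taking $F\equiv 0$ reduces the problem to maximising a terminal payoff,
$$V(x)=\sup_{u\in\mathbb{U}}E\left(G(X_{T}^{u})\,\Big|\,X_{0}^{u}=x\right),$$
while taking $G\equiv 0$ leaves only the running-reward term.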
I am trying to understand an optimal control problem whose aim is to maximise the survival probability $\delta^{b}(x)=1-P(\tau^{b}<\infty\mid X_{0}^{b}=x)$, where $\tau^{b}=\inf \lbrace t\geq 0 : X_{t}^{b}<0\rbrace$ is the ruin time. In this case the value function is $\delta(x)=\sup_{b\in \mathbb{U}}\delta^{b}(x)$.
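My only idea so far (which may well be wrong) is to rewrite the survival probability as the expectation of an indicator,
$$\delta^{b}(x)=P(\tau^{b}=\infty\mid X_{0}^{b}=x)=E\left(\mathbf{1}_{\lbrace\tau^{b}=\infty\rbrace}\,\Big|\,X_{0}^{b}=x\right),$$
which looks like a terminal payoff $G=\mathbf{1}_{\lbrace\tau^{b}=\infty\rbrace}$ with $F\equiv 0$, except that the horizon here is infinite rather than a fixed $T$.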
How can I write $\delta(x)$ in terms of the general formulation above? What are the corresponding choices of $F$ and $G$?