HJB
Suppose you have a stochastic process $\left( X^u_t \right)_{t \in[0,T]}$ controlled by $u(t, x)$.
We can define the optimal value function $$ V(t, x) = \min_u \mathbb{E} \left[ \int_t^T C(s, X^u_s, u(s, X^u_s)) \, \mathrm{d} s + D(X^u_T) \,\Big|\, X^u_t = x \right] $$
The optimal value function satisfies the HJB equation: $$\min_u \left[ \mathcal{A}^u V(t, x) + C(t, x, u) \right] = 0, \qquad V(T, x) = D(x),$$ where $\mathcal{A}^u$ is the infinitesimal generator of the controlled process $X^u_t$, acting on functions of $(t, x)$ (so it includes the $\partial_t$ term).
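To make this concrete, here is a standard worked example (scalar linear-quadratic control; the specific dynamics and costs are my choice for illustration, not from the text above). Take $\mathrm{d}X_t = u\,\mathrm{d}t + \sigma\,\mathrm{d}W_t$ with $C(t, x, u) = x^2 + u^2$ and $D(x) = x^2$, so $\mathcal{A}^u V = \partial_t V + u\,\partial_x V + \tfrac{\sigma^2}{2}\partial_{xx} V$ and the HJB equation reads $$ \min_u \left[ \partial_t V + u\,\partial_x V + \tfrac{\sigma^2}{2}\partial_{xx} V + x^2 + u^2 \right] = 0, \qquad V(T, x) = x^2. $$ The inner minimum is attained at $u^* = -\tfrac{1}{2}\partial_x V$, leaving the nonlinear PDE $$ \partial_t V + \tfrac{\sigma^2}{2}\partial_{xx} V + x^2 - \tfrac{1}{4}\left( \partial_x V \right)^2 = 0. $$ The quadratic ansatz $V(t, x) = a(t)x^2 + b(t)$ reduces this to $a' = a^2 - 1$ and $b' = -\sigma^2 a$ with $a(T) = 1$, $b(T) = 0$, hence $a \equiv 1$ and $$ V(t, x) = x^2 + \sigma^2 (T - t), \qquad u^*(t, x) = -x. $$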
Feynman-Kac
Consider a fixed control $u$.
We can define the value function $$ V(t, x) = \mathbb{E} \left[ \int_t^T C(s, X^u_s, u(s, X^u_s)) \, \mathrm{d} s + D(X^u_T) \,\Big|\, X^u_t = x \right] $$
It satisfies the (linear) Feynman-Kac equation $$ \mathcal{A}^u V(t, x) + C(t, x, u(t, x)) = 0, \qquad V(T, x) = D(x).$$
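As a sanity check (a toy example of my choosing, not from the note): take $X_t = \sigma W_t$ with no control and zero running cost, $C \equiv 0$, $D(x) = x^2$. Feynman-Kac then reduces to the backward heat equation $\partial_t V + \tfrac{\sigma^2}{2}\partial_{xx} V = 0$ with $V(T, x) = x^2$, whose solution is $V(t, x) = x^2 + \sigma^2 (T - t)$. A quick Monte Carlo estimate of the conditional expectation agrees:

```python
import math
import random

def mc_value(t, x, T=1.0, sigma=1.0, n_paths=200_000, seed=0):
    """Monte Carlo estimate of V(t, x) = E[D(X_T) | X_t = x] for
    dX = sigma dW (no control, zero running cost) and D(x) = x**2.
    Since X_T | X_t = x is Gaussian, we sample it in one step."""
    rng = random.Random(seed)
    tau = T - t
    total = 0.0
    for _ in range(n_paths):
        x_T = x + sigma * math.sqrt(tau) * rng.gauss(0.0, 1.0)
        total += x_T ** 2
    return total / n_paths

def pde_value(t, x, T=1.0, sigma=1.0):
    """Closed-form solution of the Feynman-Kac PDE
    d/dt V + (sigma**2 / 2) d2/dx2 V = 0,  V(T, x) = x**2."""
    return x ** 2 + sigma ** 2 * (T - t)

print(mc_value(0.25, 0.5))   # Monte Carlo estimate
print(pde_value(0.25, 0.5))  # exact PDE solution: 1.0
```

The two values should match up to Monte Carlo error (roughly $O(n^{-1/2})$).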
Viewed this way, Feynman-Kac is the special case of HJB in which there is nothing to optimize: the minimum is taken over the single fixed policy $u$.
Is it possible to go the other way, i.e. give a derivation of HJB from Feynman-Kac?
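One standard route (a heuristic sketch via the dynamic programming principle, not a rigorous proof): apply Feynman-Kac, in the form of Dynkin's formula, over a short interval $[t, t+h]$. Let $u^*$ be an optimal feedback policy, so $V = V^{u^*}$ and Feynman-Kac for the fixed control $u^*$ gives the equality $$ \mathcal{A}^{u^*} V(t, x) + C(t, x, u^*(t, x)) = 0. $$ For any other control value $u$, consider the suboptimal policy that applies the constant control $u$ on $[t, t+h]$ and then switches to $u^*$. Its value dominates $V$, so $$ V(t, x) \le \mathbb{E} \left[ \int_t^{t+h} C(s, X^u_s, u) \, \mathrm{d} s + V(t+h, X^u_{t+h}) \,\Big|\, X^u_t = x \right]. $$ Subtracting $V(t, x)$, dividing by $h$, and letting $h \to 0$ yields the inequality $$ \mathcal{A}^u V(t, x) + C(t, x, u) \ge 0. $$ Equality at $u^*$ together with the inequality for every $u$ is exactly $\min_u \left[ \mathcal{A}^u V + C \right] = 0$, i.e. the HJB equation. Making this rigorous requires regularity of $V$ (or viscosity solutions), which is where the real work lies.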