I am looking for a modern reference on the separation principle in stochastic optimal control, i.e. the statement that, under certain conditions, the optimal control can be obtained by deterministic optimization, solving the problem in which the noise terms in the dynamics are set to zero (this form is often called certainty equivalence).
(There is also a separation principle concerning state observation, but I'm not interested in that here: I assume the exact state is known.)
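To make concrete the kind of statement I mean: in the standard finite-horizon linear-quadratic problem with additive noise, the optimal feedback gains come from a backward Riccati recursion that never involves the noise covariance, so they coincide with the gains of the noise-free deterministic problem. A minimal sketch (the function name `lq_gains` and the specific matrices are my own illustration, not from any particular book):

```python
import numpy as np

def lq_gains(A, B, Q, R, Qf, N):
    """Backward Riccati recursion for the finite-horizon LQ problem
    x_{k+1} = A x_k + B u_k (+ w_k), cost E[sum x'Qx + u'Ru + x_N'Qf x_N].

    Note that the noise covariance of w_k appears nowhere: the returned
    gains are those of the deterministic (w_k = 0) problem, which is the
    certainty-equivalence statement in this special case.
    """
    P = Qf
    gains = []
    for _ in range(N):
        # K_k = (R + B' P B)^{-1} B' P A
        K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)
        # Riccati update: P_k = Q + A' P (A - B K)
        P = Q + A.T @ P @ (A - B @ K)
        gains.append(K)
    return gains[::-1]  # gains ordered K_0, ..., K_{N-1}
```

In this special case the noise only shifts the optimal cost by a constant, not the policy; the question is precisely about references that treat when (and why) this kind of reduction holds in general, and what to do when it fails.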
The original research articles referenced at https://en.wikipedia.org/wiki/Separation_principle are mostly outdated, and some are difficult to read or written from an engineering rather than a mathematical perspective. (I once read a review from the 1980s, which unfortunately I can no longer find, that pointed out many mistakes in the claims and proofs of the early works.)
Does anyone know of mathematical books on stochastic optimal control theory (preferably written by mathematicians for mathematicians) that cover separation principles, but also discuss exact (up to numerical error) methods for situations in which no separation principle holds?
I'm not at all claiming that mathematicians make the better optimal controllers; indeed, I found quite a few mathematical books on stochastic optimal control written from a PDE perspective that don't mention any separation principle at all.