This is a slide from a lecture on Markov models. I am studying probability in the context of computer science, specifically hidden Markov models for classical AI.
I got stuck on the conditional probability equations in the linked slide when trying to work out the simplifications myself.
I tried to factor the joint of the three inner events by conditioning with $P(x_t \mid x_{t-1}, e_{1:t})$, but I couldn't reduce it to the form shown.
$x_t$ is the Hidden Markov model state and $e_t$ is the observed effect.
Q: How is this conditional probability simplification step carried out? (See the red lines I marked in the slide image.)
Thanks!
P.S. My rep isn't high enough to post the image directly, sorry.
Work on the expression under the sum, applying first the chain rule of probability and then the Markov assumptions:
$$\begin{aligned}
P(x_{t-1}, x_t, e_{1:t}) &= P(x_{t-1}, x_t, e_1, e_2, \dots, e_t) \\
&= P(x_{t-1}, e_{1:t-1})\, P(x_t, e_t \mid x_{t-1}, e_{1:t-1}) \\
&= P(x_{t-1}, e_{1:t-1})\, P(x_t \mid x_{t-1}, e_{1:t-1})\, P(e_t \mid x_t, x_{t-1}, e_{1:t-1}) \\
&= P(x_{t-1}, e_{1:t-1})\, P(x_t \mid x_{t-1})\, P(e_t \mid x_t)
\end{aligned}$$
The second and third lines are just the chain rule (note the middle factor conditions on $e_{1:t-1}$, not $e_{1:t}$ — $e_t$ has already been pulled out as part of the pair being conditioned). The last line uses the Markov property (given $x_{t-1}$, the state $x_t$ is independent of earlier evidence) and the sensor Markov assumption (given $x_t$, the observation $e_t$ is independent of everything else).
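To convince yourself the factorization is right, you can check it numerically. Below is a minimal sketch (the 2-state HMM, its matrices `T` and `O`, and the helper names are my own illustrative assumptions, not from the slide) that computes $P(x_t, e_{1:t})$ two ways: once with the filtering recursion the slide derives, and once by brute-force summation over all state paths.

```python
import numpy as np

# Hypothetical 2-state HMM; all numbers here are made-up for illustration.
T = np.array([[0.7, 0.3],      # T[i, j] = P(x_t = j | x_{t-1} = i)
              [0.4, 0.6]])
O = np.array([[0.9, 0.1],      # O[i, k] = P(e_t = k | x_t = i)
              [0.2, 0.8]])
prior = np.array([0.5, 0.5])   # P(x_1)

def forward(evidence):
    """Filtering via the derived recursion:
    P(x_t, e_{1:t}) = P(e_t | x_t) * sum_{x_{t-1}} P(x_t | x_{t-1}) P(x_{t-1}, e_{1:t-1}).
    Returns the unnormalized message alpha[i] = P(x_t = i, e_{1:t})."""
    alpha = prior * O[:, evidence[0]]          # base case: P(x_1, e_1)
    for e in evidence[1:]:
        alpha = O[:, e] * (alpha @ T)          # predict with T, then weight by P(e_t | x_t)
    return alpha

def brute_force(evidence):
    """Sum P(x_1) P(e_1|x_1) * prod_t P(x_t|x_{t-1}) P(e_t|x_t) over all state paths."""
    n = len(evidence)
    total = np.zeros(2)
    for path in np.ndindex(*([2] * n)):
        p = prior[path[0]] * O[path[0], evidence[0]]
        for t in range(1, n):
            p *= T[path[t - 1], path[t]] * O[path[t], evidence[t]]
        total[path[-1]] += p                   # accumulate by the final state x_t
    return total

ev = [0, 1, 0]
print(np.allclose(forward(ev), brute_force(ev)))   # the two computations agree
```

The recursion works precisely because of the factorization above: summing the factored joint over $x_{t-1}$ lets you reuse the previous message $P(x_{t-1}, e_{1:t-1})$ instead of enumerating all paths.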