Let us consider the following picture.

I really did not understand how he calculated $P(\text{review us}~|~\text{spam})$. I am working through the example on paper to clarify it; that is why I drew a tree diagram. For simplicity, let us focus on the first example; here is my tree diagram.
So, on the basis of this tree diagram, how do I calculate the mixed probability? Please help me; the way he wrote the formula is a bit confusing to me.

I'm going to address the picture you quoted and not the tree diagram on the whiteboard.
There are six words, ordered as seen in the table on the right: {password, review, send, us, your, account}. If a word is in the sentence under examination, then it's a $1$; otherwise it's a $0$. This way we can encode the sentence as a binary string of $0$s and $1$s.
So we have a new email with the sentence "review us now". The last word "now" is ignored because it is not in the dictionary.
- The word password is not in it, so it's a $0$.
- The word review is in, so it's a $1$.
- send is not in, so $0$.
- us is in, so $1$.
- your is not in, so $0$.
- account is not in, so $0$.

This becomes $P(\text{review us}~|~\text{spam} ) = P(\{0,1,0,1,0,0\}~|~\text{spam} )$.
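To make the encoding step concrete, here is a minimal Python sketch; the names `dictionary` and `encode` are just illustrative, not anything from the video or the table.

```python
# Minimal sketch of the encoding step, assuming the six-word dictionary from the table.
dictionary = ["password", "review", "send", "us", "your", "account"]

def encode(sentence):
    # 1 if the dictionary word appears in the sentence, 0 otherwise;
    # words not in the dictionary (like "now") are simply ignored.
    words = set(sentence.lower().split())
    return [1 if w in words else 0 for w in dictionary]

print(encode("review us now"))  # -> [0, 1, 0, 1, 0, 0]
```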
For the ones we use $p_i$, and for the zeros we use $1 - p_i$, where $p_i$ is the corresponding conditional probability listed in the "spam" column of the same table: $$\{p_1,\, p_2,\, p_3,\, p_4,\, p_5,\, p_6\} = \{ \frac24, \, \frac14, \, \frac34, \, \frac34, \, \frac34, \, \frac14\}~.$$
- The first word in the dictionary (first row), password, is not in the sentence (encoded as $0$), and it has a conditional $p_1 = \frac24$. The contribution of this word to the likelihood is $1 - p_1 = (1 - \frac24)$.
- The second word in the dictionary (second row), review, is in (thus encoded as $1$) and has a conditional $p_2 = \frac14$. Its contribution to the likelihood is $p_2 = (\frac14)$, where the parentheses are just there to indicate the contributions from different words.
- The third word in the dictionary ($3$rd row), send, is NOT in, encoded as $0$, with a conditional $p_3 = \frac34$, and its contribution to the likelihood is $1 - p_3 = (1 - \frac34)$.
- So on and so forth.
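Putting the six contributions together, the likelihood is just their product (this is the naive Bayes assumption that the words occur independently given the class), so with the numbers above:
$$P(\{0,1,0,1,0,0\}~|~\text{spam} ) = \left(1 - \frac24\right)\cdot \frac14 \cdot \left(1 - \frac34\right)\cdot \frac34 \cdot \left(1 - \frac34\right)\cdot \left(1 - \frac14\right) = \frac12\cdot\frac14\cdot\frac14\cdot\frac34\cdot\frac14\cdot\frac34 = \frac{9}{2048}~.$$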
The same procedure goes for $P(\text{review us}~|~\text{ham} ) = P(\{0,1,0,1,0,0\}~|~\text{ham} )$, with the same encoding (same dictionary) but calculated using the conditional probabilities of the other column (given that it is ham): $\{ \frac12, \, \frac22, \, \frac12, \, \frac12, \, \frac12, \, \frac02 \}$
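Following the same rule for the ham column gives $\left(1 - \frac12\right)\cdot \frac22 \cdot \left(1 - \frac12\right)\cdot \frac12 \cdot \left(1 - \frac12\right)\cdot \left(1 - \frac02\right) = \frac{1}{16}$.

As a quick sanity check, here is a minimal Python sketch of the whole calculation; the helper name `likelihood` and the variable names are just illustrative:

```python
from fractions import Fraction as F

# Conditional word probabilities from the table (spam and ham columns)
# and the encoding of "review us now" computed above.
p_spam = [F(2, 4), F(1, 4), F(3, 4), F(3, 4), F(3, 4), F(1, 4)]
p_ham  = [F(1, 2), F(2, 2), F(1, 2), F(1, 2), F(1, 2), F(0, 2)]
x = [0, 1, 0, 1, 0, 0]

def likelihood(x, p):
    # Multiply p_i for words that are present (1) and (1 - p_i) for words that are absent (0).
    result = F(1)
    for xi, pi in zip(x, p):
        result *= pi if xi == 1 else 1 - pi
    return result

print(likelihood(x, p_spam))  # 9/2048
print(likelihood(x, p_ham))   # 1/16
```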