Let $X_i$ be a random variable drawn from a normal distribution with mean $\mu$ and variance $\sigma^2$.
Calculate $E[(X_i-\mu)^3]$.
Solution:
$E[(X_i-\mu)^3] = E[(X_i-\mu)^2 (X_i-\mu)] = \sigma^2 E[2(X_i-\mu)]$.
I am wondering why the second expression above equals the third.
Update:
$\theta_1 = E[X_i]$, $\theta_j = E[(X_i - \theta_1)^j]$ for $j = 2,3,4$.


$\newcommand{\E}{\operatorname{E}}$What is written in that book means that $\E((X_i-\mu)^3) = \E\big( (X_i-\mu)^2(X_i-\mu)\big).$
Stein's lemma then says $$ \E((X_i-\mu)^2(X_i-\mu)) = \E(g(X_i)(X_i-\mu)) = \sigma^2 \E(g'(X_i)) = \sigma^2\E(2(X_i-\mu)), $$ where $g(x) = (x-\mu)^2$. Since $\E(X_i-\mu)=0$, this gives $\E((X_i-\mu)^3)=0$.
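As a quick sanity check (not part of the original derivation), Stein's lemma with $g(x) = (x-\mu)^2$ can be verified by Monte Carlo simulation; the values of `mu`, `sigma`, and the sample size below are arbitrary choices for illustration.

```python
import numpy as np

# Monte Carlo check of Stein's lemma for g(x) = (x - mu)^2:
# E[g(X)(X - mu)] should equal sigma^2 * E[g'(X)], where g'(x) = 2(x - mu).
rng = np.random.default_rng(0)
mu, sigma, n = 1.0, 2.0, 1_000_000
x = rng.normal(mu, sigma, size=n)

lhs = np.mean((x - mu) ** 2 * (x - mu))   # estimates E[(X - mu)^3]
rhs = sigma ** 2 * np.mean(2 * (x - mu))  # estimates sigma^2 * E[2(X - mu)]

print(lhs, rhs)  # both should be close to 0
```

Both sides agree up to sampling noise, and both are near zero, consistent with the third central moment of a normal distribution vanishing.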