The following statement is from a text on Statistical Estimation.

I am trying to figure out how the likelihood function was arrived at. By definition of likelihood,
$p_{\mathbf{X}\mid\theta}(\mathbf{X}\mid\theta) = \displaystyle \prod_{l=0}^{L-1} p( X(l)\mid\theta) $
How did the author make the leap from here to writing the likelihood in terms of the distribution of $N(l)$? (Note: $N(l) = X(l) - \theta$.)
It is clear that $X(l) \sim N(\theta, \sigma^2_N)$, since it is the sum of $\theta$ and a zero-mean Gaussian RV. If the $N(l)$ are independent, then the $X(l)$ are as well, so the joint density of the $X(l)$ is the product of the individual densities. Moreover, since $\theta$ is a deterministic shift, the density of $X(l)$ is just the density of $N(l)$ evaluated at $X(l) - \theta$, i.e. $p(X(l)\mid\theta) = p_{N(l)}(X(l) - \theta)$. Explicitly,
$$p_{\mathbf{X}\mid\theta}(\mathbf{X}\mid\theta) = \prod_{l=0}^{L-1} \frac{1}{\sqrt{2\pi\sigma^2_N}} \exp\!\left(-\frac{(X(l)-\theta)^2}{2\sigma^2_N}\right) = \prod_{l=0}^{L-1} p_{N(l)}\bigl(X(l)-\theta\bigr).$$
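The shift identity can be checked numerically. A minimal sketch in plain Python (the helper `gauss_pdf` and the sample values are illustrative, not from the text): evaluating the Gaussian density of $X(l)$ at the data equals evaluating the zero-mean density of $N(l)$ at the residuals $X(l)-\theta$, so the two likelihood products agree.

```python
import math

def gauss_pdf(x, mean, var):
    # Density of a N(mean, var) random variable at x
    return math.exp(-(x - mean) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

theta, var = 2.0, 1.5          # illustrative parameter and noise variance
samples = [1.0, 2.5, 3.0]      # illustrative observations X(0..2)

lik_X = 1.0  # product of densities of X(l) ~ N(theta, var)
lik_N = 1.0  # product of densities of N(l) = X(l) - theta ~ N(0, var)
for x in samples:
    lik_X *= gauss_pdf(x, theta, var)
    lik_N *= gauss_pdf(x - theta, 0.0, var)

print(abs(lik_X - lik_N) < 1e-15)  # the two products coincide
```

The two loops compute term-by-term identical quantities, which is exactly the point: writing the likelihood in terms of the distribution of $N(l)$ is only a change of variables, not a different model.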