In the book *Pattern Recognition and Machine Learning* by C. M. Bishop, at one point we try to find the covariance $\boldsymbol\Sigma$ of a multivariate Gaussian distribution given by
$$\frac1{({2\pi})^{D/2}} \frac1{{|\boldsymbol\Sigma|}^{1/2}} \exp\{ -\tfrac12(\mathbf x-\boldsymbol\mu)^T \boldsymbol\Sigma^{-1} (\mathbf x-\boldsymbol\mu) \}$$
There we attempt to find $E[\mathbf x \mathbf x^T]$ and come across an integral of the form $\int \mathbf x \mathbf x^T \, p(\mathbf x) \, d\mathbf x$ (with the Gaussian density $p(\mathbf x)$ as the weight), to be evaluated over all of $\Bbb R^n$.
What does it mean to have the matrix $\mathbf x \mathbf x^T$ in the integral? How is the integral defined in terms of evaluating this? Is this actually a volume integral done over all the components of the matrix?
The expected value of a random vector or random matrix is defined componentwise. That is, for ${\bf X}$ a $\mathbb{R}^n$-valued random vector, the $i$th component of $E[{\bf X}]\in \mathbb{R}^n$ is $E[ X_i]$. Likewise the $(i,j)$ entry of $E[{\bf X}{\bf X}']\in\mathbb{R}^{n\times n}$ is $E[ X_iX_j].$
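A quick numerical sketch of this componentwise definition (my own illustration, not from Bishop): draw samples from a multivariate Gaussian, average the outer products $\mathbf x \mathbf x^T$ entry by entry, and compare against the closed form $E[{\bf X}{\bf X}'] = \boldsymbol\Sigma + \boldsymbol\mu\boldsymbol\mu^T$ that Bishop derives. The specific values of `mu` and `Sigma` below are arbitrary choices for the demo.

```python
import numpy as np

rng = np.random.default_rng(0)
mu = np.array([1.0, -2.0])
Sigma = np.array([[2.0, 0.5],
                  [0.5, 1.0]])

# Draw N samples; each row is one realization of the random vector X.
samples = rng.multivariate_normal(mu, Sigma, size=200_000)  # shape (N, 2)

# The (i, j) entry of E[X X^T] is E[X_i X_j], so the matrix expectation
# is just the entrywise average of the outer products x x^T.
outer_products = samples[:, :, None] * samples[:, None, :]   # shape (N, 2, 2)
E_xxT = outer_products.mean(axis=0)

# Closed form for a Gaussian: E[X X^T] = Sigma + mu mu^T.
closed_form = Sigma + np.outer(mu, mu)

print(E_xxT)
print(closed_form)
```

The Monte Carlo estimate converges to the closed form entry by entry, which is exactly what "the integral is defined componentwise" means: each of the $n^2$ entries is an ordinary scalar integral $\int x_i x_j \, p(\mathbf x)\, d\mathbf x$ over $\mathbb R^n$.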