The classical multiple regression model for $n$ iid observations is $$ \mathbf{Y} = \mathbf{X}\beta + \epsilon, $$ where $$ \mathbf{X} = \begin{bmatrix} \mathbf{X}_{1}^T \\ \mathbf{X}_{2}^T \\ \vdots \\ \mathbf{X}_{n}^T \end{bmatrix} = \begin{bmatrix} 1 & X_{11} & X_{12} & \dots & X_{1p} \\ 1 & X_{21} & X_{22} & \dots & X_{2p} \\ \vdots & \vdots & \vdots & \ddots & \vdots \\ 1 & X_{n1} & X_{n2} & \dots & X_{np} \end{bmatrix}. $$
Assuming that the matrix $E[\mathbf{X}_i\mathbf{X}_i^T]$ exists, the weak law of large numbers gives that $$ \frac{1}{n} \sum_{i=1}^n \mathbf{X}_i \mathbf{X}_i^T \stackrel{p}{\to} E[\mathbf{X}_i\mathbf{X}_i^T]. $$
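As a sanity check on this convergence, here is a small simulation sketch (the design $X_i = (1, Z_i)^T$ with bivariate normal $Z_i$, and the particular values of `mu` and `Sigma`, are my own illustrative choices, not part of the model above):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical design: X_i = (1, Z_i)^T with Z_i ~ N(mu, Sigma), so p = 2.
mu = np.array([1.0, -0.5])
Sigma = np.array([[2.0, 0.3], [0.3, 1.0]])

def sample_second_moment(n):
    """Compute (1/n) * sum_i X_i X_i^T from n simulated rows."""
    Z = rng.multivariate_normal(mu, Sigma, size=n)
    X = np.hstack([np.ones((n, 1)), Z])   # rows of X are the X_i^T
    return (X.T @ X) / n

# True E[X_i X_i^T] in block form, using E[Z] = mu and
# E[Z Z^T] = Sigma + mu mu^T.
true_M = np.block([
    [np.ones((1, 1)), mu[None, :]],
    [mu[:, None], Sigma + np.outer(mu, mu)],
])

for n in (10**2, 10**4, 10**6):
    err = np.max(np.abs(sample_second_moment(n) - true_M))
    print(n, err)
```

The entrywise error should shrink as $n$ grows, consistent with the weak law of large numbers (at roughly the usual $O(n^{-1/2})$ rate).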
So we have convergence in probability, but I am more interested in convergence in expectation (and also in the expectation for finite samples).
Is $\frac{1}{n} \sum_{i=1}^n \mathbf{X}_i \mathbf{X}_i^T$ a biased estimator of $E[\mathbf{X}_i\mathbf{X}_i^T]$?
If it is biased for finite $n$, what about the limit as $n \to \infty$, i.e., does $\lim_{n \to \infty} E[\frac{1}{n} \sum_{i=1}^n \mathbf{X}_i \mathbf{X}_i^T] = E[\mathbf{X}_i\mathbf{X}_i^T]$ hold? If we assume uniform integrability (UI) of the entries of the matrix $\frac{1}{n} \sum_{i=1}^n \mathbf{X}_i \mathbf{X}_i^T$, then convergence in probability upgrades to convergence in expectation. But do we need to assume that $\frac{1}{n} \sum_{i=1}^n \mathbf{X}_i \mathbf{X}_i^T$ is UI, or is it already UI without any further assumptions?
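The finite-sample expectation can also be probed empirically: average the estimator $\frac{1}{n}\sum_i X_i^2$ over many independent replications at a fixed small $n$, and compare with the known second moment. The scalar setup below (iid $X_i \sim \mathrm{Exp}(1)$, so $E[X_i^2] = 2$) is my own toy choice for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy scalar case: X_i ~ Exp(1), whose true second moment is E[X_i^2] = 2.
n, reps = 5, 200_000  # deliberately small n; many Monte Carlo replications

# Each row is one replication of the sample (X_1, ..., X_n).
X = rng.exponential(1.0, size=(reps, n))

# One value of (1/n) * sum_i X_i^2 per replication.
estimates = (X**2).mean(axis=1)

# Monte Carlo estimate of E[(1/n) * sum_i X_i^2] at this fixed n.
print(estimates.mean())
```

Comparing `estimates.mean()` with the true value 2 gives an empirical check of the finite-sample bias question at $n = 5$ (up to Monte Carlo error of order $1/\sqrt{\text{reps}}$).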