In my previous question I asked whether a sequence of asymptotically normal random variables $\{X_n\}_{n \ge 1}$ is uniformly integrable. In the accepted answer the poster showed that this does not hold in general.
Now, what about a specific case? Consider the standard ordinary least squares (OLS) regression coefficients $\hat{\beta}_n = \beta + (X_n^T X_n)^{-1} X_n^T \varepsilon$, where I have used the subscript $n$ to indicate that these are the OLS coefficients for a regression model with $n$ observations. Is $\{\hat{\beta}_n\}_{n \ge 1}$ uniformly integrable? Of course, $\hat{\beta}_n$ is a random vector, so I am asking whether the elements of this vector, which are all asymptotically normal, are uniformly integrable.
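For concreteness, here is a minimal simulation sketch of one draw of this estimator (the dimensions, seed, and Gaussian distributions below are my own illustrative assumptions, not part of the question):

```python
import numpy as np

# Minimal sketch: one draw of beta_hat_n = beta + (X^T X)^{-1} X^T eps
# with stochastic regressors. All distributional choices are illustrative.
rng = np.random.default_rng(0)

n, p = 10_000, 2                   # observations, regressors (assumed values)
beta = np.array([1.0, -0.5])       # true coefficients (assumed values)
X = rng.normal(size=(n, p))        # stochastic regressors
eps = rng.normal(size=n)           # errors, drawn independently of X

beta_hat = beta + np.linalg.solve(X.T @ X, X.T @ eps)
print(beta_hat)                    # close to beta for large n (consistency)
```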
Note that the $X_n$ above are stochastic regressors, not fixed.
Edit: attempt for the case of a single regressor with no intercept

Here is an attempt for the case of a single regressor with no intercept, i.e., the regression model is $$ Y = X \beta + \varepsilon, $$ where we have $n$ i.i.d. observations $Y = [Y_1,Y_2,\dots,Y_n]^T$, regressors $X = [X_1,X_2,\dots,X_n]^T$, and errors $\varepsilon = [\varepsilon_1,\varepsilon_2,\dots,\varepsilon_n]^T$. The errors are mutually independent with $E[\varepsilon_i|X] = 0$, and they all have the same finite variance $\sigma^2$.
Let $\hat \beta_n$ denote the OLS estimate for $\beta$ where the subscript $n$ indicates the number of observations in the regression model.
Note that because we only have a single regressor, $(X^T X)^{-1}$ reduces from a matrix to a scalar: $$ (X^T X)^{-1} = \bigg(\sum_{i=1}^n X_i^2\bigg)^{-1}. $$
From what I understand, to show that $\{\hat \beta_n\}_{n \ge 1}$ is uniformly integrable, it is enough to show that there exists a $\delta >0$ such that $\sup_n E[|\hat\beta_n|^{1 + \delta}] < \infty$.
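For completeness, this sufficient condition follows from a one-line bound: for any $K > 0$,
$$ E\big[|\hat\beta_n|\,\mathbf{1}\{|\hat\beta_n| > K\}\big] \le \frac{E[|\hat\beta_n|^{1+\delta}]}{K^{\delta}} \le \frac{\sup_n E[|\hat\beta_n|^{1+\delta}]}{K^{\delta}}, $$
and the right-hand side does not depend on $n$ and tends to $0$ as $K \to \infty$, which is exactly the definition of uniform integrability.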
First, I think that wlog we can consider $\hat\beta_n - \beta$ instead of $\hat\beta_n$ in the above expectation (shifting by the constant $\beta$ does not affect uniform integrability). Let's take $\delta = 1$ and define $A = (X^TX)^{-1}X^T$, so that $\hat\beta_n - \beta = A\varepsilon$. Then
\begin{align} E[|\hat\beta_n - \beta|^2] &= E[|A\varepsilon|^2] \\ &= E[A\varepsilon \varepsilon^T A^T] \\ &= E[A\, E[\varepsilon \varepsilon^T|X]\, A^T] \\ &= \sigma^2 E[AA^T] \\ &= \sigma^2 E[(X^TX)^{-1}X^TX(X^TX)^{-1}] \\ &= \sigma^2 E[(X^TX)^{-1}]. \end{align}
It is commonly assumed that $X^TX/n \stackrel{p}{\to} Q$ where $Q$ is a positive constant (usually a positive definite matrix, but since we only have a single regressor it is a scalar in our case). By the continuous mapping theorem, $$ (X^TX/n)^{-1} \stackrel{p}{\to} Q^{-1}. $$
Write $(X^TX)^{-1} = n^{-1}(X^TX/n)^{-1}$ and substitute into the expectation above to get \begin{align} E[|\hat\beta_n - \beta|^2] \ & = \sigma^2 E[(X^TX)^{-1}] \\ & = \sigma^2 n^{-1} E[(X^TX/n)^{-1}]. \end{align} Now we know $(X^TX/n)^{-1} \stackrel{p}{\to} Q^{-1}$, and if we can convert this convergence in probability into convergence of expectations, then $\sup_n E[|\hat\beta_n - \beta|^2] < \infty$ (in fact $E[|\hat\beta_n - \beta|^2] \to 0$ because of the $n^{-1}$ factor), and thus the OLS regression coefficients are uniformly integrable.
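As a sanity check on the identity $E[|\hat\beta_n - \beta|^2] = \sigma^2 E[(X^TX)^{-1}]$, here is a small Monte Carlo sketch for the single-regressor case under the (assumed) choice $X_i \sim N(0,1)$ and $\varepsilon_i \sim N(0,1)$; then $\sigma^2 = 1$, $X^TX \sim \chi^2_n$, and $E[(X^TX)^{-1}] = 1/(n-2)$ exactly:

```python
import numpy as np

# Monte Carlo check of E[(beta_hat_n - beta)^2] = sigma^2 E[(X^T X)^{-1}]
# for a single regressor, assuming X_i ~ N(0,1) and eps_i ~ N(0,1), so that
# sigma^2 = 1, X^T X ~ chi^2_n, and the exact value is 1/(n-2).
rng = np.random.default_rng(1)

n, reps, beta = 50, 20_000, 2.0
X = rng.normal(size=(reps, n))     # reps independent samples of the regressor
eps = rng.normal(size=(reps, n))   # matching independent error draws
beta_hat = beta + (X * eps).sum(axis=1) / (X**2).sum(axis=1)

mse = np.mean((beta_hat - beta) ** 2)
print(mse, 1 / (n - 2))            # the two numbers should be close
```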
So it seems the problem of uniform integrability of the OLS regression coefficients reduces to the uniform integrability of $(X^TX/n)^{-1}$.
So here are my questions now:
- Does my analysis seem correct?
- Is $(X^TX/n)^{-1}$ uniformly integrable?
- If $(X^TX/n)^{-1}$ is not uniformly integrable in general, is it reasonable to assume that it is? How strong an assumption would this be?
P.S. We have $(X^TX/n)^{-1} = \bigg(\frac{1}{n} \sum_{i=1}^n X_i^2\bigg)^{-1}$. So is the inverse of the second sample moment uniformly integrable?
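In the Gaussian case this last question can be checked directly: if $X_i \sim N(0,1)$ then $\sum_i X_i^2 \sim \chi^2_n$, so $(X^TX/n)^{-1} = n/\chi^2_n$ has mean $n/(n-2)$ for $n > 2$ (and a bounded second moment for $n \ge 5$), so the tail of the sequence is uniformly integrable; note that for $n \le 2$ the mean is actually infinite, so only the tail can be. A quick Monte Carlo sketch (seed and sample sizes are my own choices):

```python
import numpy as np

# For X_i ~ N(0,1) we have sum_i X_i^2 ~ chi^2_n, so the inverse second
# sample moment (X^T X / n)^{-1} = n / chi^2_n has exact mean n/(n-2)
# for n > 2. Check that the simulated means match and stay bounded.
rng = np.random.default_rng(2)
reps = 100_000

means = {}
for n in (10, 100, 1000):
    inv_m2 = n / rng.chisquare(df=n, size=reps)   # draws of (X^T X / n)^{-1}
    means[n] = inv_m2.mean()
    print(n, means[n], n / (n - 2))
```

This only covers the Gaussian case, of course; for regressors whose second sample moment can be close to zero with non-negligible probability, the inverse may fail to be integrable at all.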
Yes, the OLS regression coefficients are uniformly integrable. I refer you to this article, which proves it along the way: https://arxiv.org/pdf/1511.02962.pdf