I was talking to my teacher the other day about OLS and the linear regression model $Y = \beta X + \varepsilon$. If the regressors $X$ are fixed numbers and I don't assume normality of the errors, then in order to obtain the asymptotic distribution of the OLS estimator (via the CLT) I need the condition (which also appears in Thomas Ferguson's book) \begin{equation} \frac{\max_i (X_i - \bar{X})^2}{n} \longrightarrow 0, \quad n\rightarrow \infty. \end{equation}
But then in an e-mail I asked about the case where $X$ is random, and I got the response:
> If the errors are not normal and $X$ has a finite fourth moment, then since $X$ is random we automatically have $$\frac{\max_i (X_i - \bar{X})^2}{n} \longrightarrow 0, \quad n\rightarrow \infty.$$
I'd like to understand this response without having to ask in yet another e-mail. Why does this property follow from the randomness of $X$?
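For what it's worth, here is a quick numerical sanity check I wrote (the function name `max_ratio` and the choice of an Exponential(1) sample, which has all moments finite, are my own; this is just a simulation, not a proof). It draws i.i.d. samples and watches the ratio shrink as $n$ grows:

```python
import numpy as np

rng = np.random.default_rng(0)

def max_ratio(x):
    """Compute max_i (X_i - Xbar)^2 / n for a sample x."""
    n = len(x)
    return np.max((x - x.mean()) ** 2) / n

# Exponential(1) has a finite fourth moment (indeed all moments).
# The sample maximum grows only like log n, so the ratio should vanish.
for n in (10**2, 10**4, 10**6):
    x = rng.exponential(size=n)
    print(n, max_ratio(x))
```

In my runs the printed ratios decrease by orders of magnitude as $n$ grows, which is at least consistent with the claimed almost-sure convergence, but of course a simulation can't distinguish this from slower modes of convergence.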