I need your help to clarify some concepts. The OLS problem can be formulated as a linear combination of a set of vectors $X=[I_n, X_1, X_2, \dots, X_{k}]$, where $I_n$ is a vector of ones for the intercept, a vector of constants $\beta=[\beta_0, \beta_1, \beta_2, \dots, \beta_k]'$, a vector $y$ of responses, and a vector of disturbances with i.i.d. entries $u_i \sim {\rm Normal}(0,\sigma^2)$: $$ y=X\beta+u.$$ Suppose that the $n$ elements of each $X_k$ are random draws from a theoretical probability distribution (one distribution per regressor). Then each of the $n$ entries of $X_k$ shares the same expectation $E[X_k]$, so that for each regressor $k$: $$E[X_k]=E\begin{bmatrix}X_{1k} \\ X_{2k} \\ \vdots \\ X_{nk} \end{bmatrix}=\begin{bmatrix} E[X_k] \\ E[X_k] \\ \vdots \\ E[X_k] \end{bmatrix}$$ Now, in a random vector each entry is a random variable in its own right, so that for each of the $k$ random vectors the $i$-th element has its own expected value $E[X_{ik}]$:
$$E[X_k]=E\begin{bmatrix}X_{1k} \\ X_{2k} \\ \vdots \\ X_{nk} \end{bmatrix}=\begin{bmatrix} E[X_{1k}] \\ E[X_{2k}] \\ \vdots \\ E[X_{nk}] \end{bmatrix}$$
Can the regressors in OLS be considered random vectors?
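To make the setup concrete, here is a minimal simulation of my own (the distributions, sample size, and coefficient values are illustrative assumptions, not from any particular dataset): the columns of $X$ are drawn as random vectors, and OLS still recovers $\beta$.

```python
import numpy as np

# Illustrative sketch: regressors drawn as random vectors from two
# assumed distributions, then beta recovered by least squares.
rng = np.random.default_rng(0)
n = 10_000
x1 = rng.normal(loc=2.0, scale=1.0, size=n)   # X_1 ~ Normal(2, 1)
x2 = rng.uniform(low=0.0, high=4.0, size=n)   # X_2 ~ Uniform(0, 4)
X = np.column_stack([np.ones(n), x1, x2])     # prepend the intercept column I_n
beta = np.array([1.0, 0.5, -0.3])             # true coefficients (assumed)
u = rng.normal(scale=0.5, size=n)             # disturbances ~ Normal(0, 0.25)
y = X @ beta + u
beta_hat = np.linalg.lstsq(X, y, rcond=None)[0]
print(beta_hat)                               # close to [1.0, 0.5, -0.3]
```

Even though every entry of $X_1$ and $X_2$ is a random variable, conditioning on the realized draws gives the usual fixed-design OLS estimates.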
Sure. Take, for instance, the autoregressive process $$ X_t = \beta_0 + \beta_1 X_{t-1} + \epsilon_t, $$ in which the regressor $X_{t-1}$ is itself a random variable. Moreover, the coefficients themselves may be treated as random variables too (see random-effects models).
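A quick sketch of that AR(1) example (the coefficient values and series length are assumptions chosen for illustration): the regressor is a lagged copy of the series itself, so it is unambiguously random, yet OLS on $(X_{t-1}, X_t)$ pairs still estimates the coefficients.

```python
import numpy as np

# Illustrative AR(1) simulation with assumed coefficients beta0, beta1,
# followed by OLS of X_t on a constant and X_{t-1}.
rng = np.random.default_rng(1)
T = 5_000
beta0, beta1 = 0.5, 0.8                       # true coefficients (assumed)
x = np.empty(T)
x[0] = beta0 / (1 - beta1)                    # start at the unconditional mean
for t in range(1, T):
    x[t] = beta0 + beta1 * x[t - 1] + rng.normal(scale=1.0)
Z = np.column_stack([np.ones(T - 1), x[:-1]]) # lagged regressor: itself random
b_hat = np.linalg.lstsq(Z, x[1:], rcond=None)[0]
print(b_hat)                                  # roughly [0.5, 0.8]
```

Here strict exogeneity fails (the regressor is correlated with past disturbances), but under stationarity the OLS estimator remains consistent, which is the standard justification for treating stochastic regressors this way.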