$$ y_{i}=\beta_{0}+x_{i}^{\prime} \beta_{1}+\varepsilon_{i} $$
with $x_{i}=\left(x_{i 2}, \ldots, x_{i K}\right)^{\prime}$. We assume:
- $\varepsilon_{i}$ is independent and identically distributed (i.i.d.) with mean $0$ and variance $\sigma^{2}$,
- $x_{i}$ are i.i.d. with mean $\mu_{x}$ and covariance matrix $\Sigma_{x}$, where $\Sigma_{x}$ is finite and non-singular,
- $\mathrm{E}\left(\varepsilon_{i} x_{i}\right)=0$.
By partial regression, the OLS estimator of $\beta_{1}$ is $$ \hat{\beta}_{1 n}=\left(X^{\prime} M_{1} X\right)^{-1} X^{\prime} M_{1} y=\left(\sum_{i=1}^{n}\left(x_{i}-\bar{x}_{n}\right)\left(x_{i}-\bar{x}_{n}\right)^{\prime}\right)^{-1}\left(\sum_{i=1}^{n}\left(x_{i}-\bar{x}_{n}\right) y_{i}\right) $$ with $M_{1}=I-\frac{1}{n} \iota \iota^{\prime}$ and $\bar{x}_{n}$ the vector of sample means of the non-constant independent variables. We use the subscript $n$ to indicate that the OLS estimator is computed from a sample of size $n$.
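If it helps to see this formula in action, here is a minimal NumPy sketch (all names and parameter values are made up for illustration) checking that the demeaned cross-product formula above reproduces the slope coefficients of an ordinary OLS fit that includes an intercept:

```python
# Sketch: partial regression (demeaning via M1) vs. full OLS with an intercept.
# All values below are illustrative, not from the original post.
import numpy as np

rng = np.random.default_rng(0)
n, K = 500, 3                       # sample size and number of slopes
X = rng.normal(size=(n, K))         # non-constant regressors x_i'
beta0, beta1 = 1.0, np.array([0.5, -2.0, 3.0])
y = beta0 + X @ beta1 + rng.normal(size=n)

# Full OLS with an intercept column (the iota vector)
Z = np.column_stack([np.ones(n), X])
beta_hat_full = np.linalg.lstsq(Z, y, rcond=None)[0]

# Partial regression: demeaning implements M1 = I - (1/n) iota iota'
Xc = X - X.mean(axis=0)             # M1 X
yc = y - y.mean()                   # M1 y
beta1_hat = np.linalg.solve(Xc.T @ Xc, Xc.T @ yc)

print(np.allclose(beta_hat_full[1:], beta1_hat))  # True
```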
Upon substitution of $y_{i}=\beta_{0}+x_{i}^{\prime} \beta_{1}+\varepsilon_{i}$ we have $$ \hat{\beta}_{1 n}-\beta_{1}=\left(\sum_{i=1}^{n}\left(x_{i}-\bar{x}_{n}\right)\left(x_{i}-\bar{x}_{n}\right)^{\prime}\right)^{-1}\left(\sum_{i=1}^{n}\left(x_{i}-\bar{x}_{n}\right) \varepsilon_{i}\right) $$ Now $$ \frac{1}{n} \sum_{i=1}^{n}\left(x_{i}-\bar{x}_{n}\right)\left(x_{i}-\bar{x}_{n}\right)^{\prime}=\frac{1}{n} \sum_{i=1}^{n}\left(x_{i}-\mu_{x}\right)\left(x_{i}-\mu_{x}\right)^{\prime}-\left(\bar{x}_{n}-\mu_{x}\right)\left(\bar{x}_{n}-\mu_{x}\right)^{\prime} $$ The first term on the right-hand side converges in probability to $\mathrm{E}\left[\left(x_{i}-\mu_{x}\right)\left(x_{i}-\mu_{x}\right)^{\prime}\right]=\Sigma_{x}<\infty$ by the WLLN. The second term converges in probability to $0$: the WLLN gives $\bar{x}_{n}-\mu_{x} \stackrel{p}{\rightarrow} 0$ (jointly across components), and the continuous mapping theorem carries this through the outer product. Therefore the limit in probability is $\Sigma_{x}$.
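As a sanity check on that convergence claim, a small simulation (again with illustrative names and numbers) shows the demeaned second-moment matrix settling down to $\Sigma_x$ as $n$ grows:

```python
# Sketch: (1/n) sum (x_i - xbar)(x_i - xbar)' approaches Sigma_x as n grows.
# Sigma_x and mu_x below are arbitrary illustrative choices.
import numpy as np

rng = np.random.default_rng(1)
Sigma_x = np.array([[1.0, 0.3], [0.3, 2.0]])
mu_x = np.array([5.0, -1.0])
L = np.linalg.cholesky(Sigma_x)

for n in (100, 10_000, 1_000_000):
    X = mu_x + rng.normal(size=(n, 2)) @ L.T   # x_i i.i.d. with mean mu_x, cov Sigma_x
    Xc = X - X.mean(axis=0)
    S = Xc.T @ Xc / n                          # (1/n) sum (x_i - xbar)(x_i - xbar)'
    print(n, np.max(np.abs(S - Sigma_x)))      # max entrywise error shrinks with n
```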
Q: I can follow this loosely until the very end. I'm assuming he's taking the expectation, given that it's an average? But how is the left-hand side of the last equation equal to the right-hand side, which involves the population parameters? That is, how is $\frac{1}{n} \sum_{i=1}^{n}\left(x_{i}-\bar{x}_{n}\right)\left(x_{i}-\bar{x}_{n}\right)^{\prime}$ equal to that right-hand side?
Here is the proof for the univariate case; the same trick works in the multivariate case by using the vector $\mu_x$ and multivariate $x_i$ instead of scalars (and reading the squares as outer products).
\begin{align} \sum ( x_i - \bar{x}_n)^2 &= \sum (x_i - \mu + \mu - \bar{x}_n) ^ 2\\ &= \sum(x_i - \mu)^2 - 2\sum (x_i - \mu)(\bar{x}_n -\mu) +\sum ( \bar{x}_n - \mu)^2 \\ & = \sum(x_i - \mu)^2 - 2( \bar x _n - \mu)(n \bar{x}_n - n \mu) + n(\bar{x}_n - \mu )^2 \\ & = \sum(x_i - \mu)^2 - n(\bar x_n - \mu)\left[ 2(\bar x _n - \mu) - (\bar{x}_n - \mu ) \right] \\ & = \sum(x_i - \mu)^2 - n(\bar x_n - \mu)^2 \\ \end{align} Dividing both sides by $n$ gives exactly the identity in the last equation above.
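Note that this is pure algebra: nothing in the derivation uses that $\mu$ is the true mean, so the identity holds for any fixed constant $\mu$ (no expectation is being taken). A quick numerical check, with illustrative values:

```python
# Sketch: sum (x_i - xbar)^2 == sum (x_i - mu)^2 - n * (xbar - mu)^2
# for ANY constant mu, since the identity is purely algebraic.
import numpy as np

rng = np.random.default_rng(2)
x = rng.normal(size=1000)
n, xbar, mu = x.size, x.mean(), 0.7   # mu is an arbitrary fixed number

lhs = np.sum((x - xbar) ** 2)
rhs = np.sum((x - mu) ** 2) - n * (xbar - mu) ** 2
print(np.isclose(lhs, rhs))           # True
```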