There are several laws of large numbers, but in most versions I've seen the conclusion is the usual $\frac{1}{n}\sum_{t=1}^{n}u_t \xrightarrow{p} \mu$ under the usual condition $E(u_t)=\mu$. However, in the excerpt below, I don't seem to be able to apply the usual LLN.
Result (3.44) is $Var(\hat u_t)=E(\hat u_t^2)=(1-h_t)\sigma^2$ (the first equality holds because the residual has zero mean); this is also the result they refer to as being 'in the previous exercise'. Equation (3.46) is just the first equation, without the plims.
If I were to apply the usual LLN here, I would conclude that $\hat \sigma^2$ converges to $E(\hat u_t^2)=(1-h_t)\sigma^2$, but that expectation depends on $t$...
Any help would be appreciated.

There is no difference between your two conditions (assuming i.i.d. random variables). If the expected value of $X$ is $\mu$, then the expected value of $\bar{X}$ is $\mu$; and if the expected value of $\bar{X}$ is $\mu$, then the expected value of $X$ is $\mu$:
$$E\left(\frac{1}{n} \sum_{t=1}^{n}u_{t}\right) = \frac{1}{n} E\left(\sum_{t=1}^{n}u_{t}\right) \\ = \frac{1}{n} \sum_{t=1}^{n}E(u_{t}) \\ = \frac{1}{n}\, n\, E(u_{t}) \\ = E(u_{t}) = \mu $$
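A quick simulation can make this concrete: for i.i.d. draws with $E(u_t)=\mu$, the sample mean settles near $\mu$ as $n$ grows. (This is only an illustrative sketch; the choice of $\mu = 5$ and the exponential distribution are arbitrary assumptions, not from the question.)

```python
import numpy as np

# Illustrative LLN check: the sample mean of i.i.d. draws with
# E(u_t) = mu gets close to mu as n grows.
rng = np.random.default_rng(0)
mu = 5.0  # hypothetical true mean
for n in (10, 1_000, 100_000):
    u = rng.exponential(scale=mu, size=n)  # E(u_t) = mu
    print(f"n = {n:>7}: sample mean = {u.mean():.4f}")
```

For small $n$ the sample mean can be far from $\mu$; by $n = 100{,}000$ it is typically within a fraction of a percent.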