Gujarati notes that for the simple linear model $Y_i=\alpha +\beta X_i + u_i$, with fitted regression $\hat Y_i =\hat \alpha + \hat \beta X_i$, the OLS method gives us the following:
$$E[(\hat \beta - \beta)\sum_{i=1}^n (x_i - \bar x)(u_i - \bar u)] = \sigma^2$$
However, I can't understand why that equation is true if $E(\hat \beta) = E(\beta) = \beta$. Shouldn't we get $0$ there when we take expected values of the betas, since one can rewrite this as: $$\beta E[\sum_{i=1}^n (x_i - \bar x)(u_i - \bar u)] - \beta E[\sum_{i=1}^n (x_i - \bar x)(u_i - \bar u)]$$
What mistake am I making in this derivation? Can you please clarify?
P.S. This equation is used later to show that $E(\sum_{i=1}^n \hat u_i^2) = (n-2)\sigma^2$.
The problem is that you cannot pull the random variable $\hat \beta$ outside of the expectation as if it were a constant: $\hat \beta$ is a function of the errors $u_i$, so it is correlated with $\sum_{i=1}^n (x_i - \bar x)(u_i - \bar u)$, and the expectation of a product of dependent random variables does not factor into the product of expectations. Indeed, since $\hat \beta - \beta = \frac{\sum_{i=1}^n (x_i - \bar x)u_i}{\sum_{i=1}^n (x_i - \bar x)^2}$ and $\sum_{i=1}^n (x_i - \bar x)(u_i - \bar u) = \sum_{i=1}^n (x_i - \bar x)u_i$, the left-hand side equals $$E\left[\frac{\left(\sum_{i=1}^n (x_i - \bar x)u_i\right)^2}{\sum_{i=1}^n (x_i - \bar x)^2}\right] = \frac{\sigma^2 \sum_{i=1}^n (x_i - \bar x)^2}{\sum_{i=1}^n (x_i - \bar x)^2} = \sigma^2.$$
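You can see this dependence numerically with a quick Monte Carlo sketch (the sample size, coefficients, $\sigma$, and the uniform design for $x$ below are arbitrary choices, not from Gujarati): $\hat \beta$ averages to $\beta$ across replications, yet the product $(\hat \beta - \beta)\sum_{i=1}^n (x_i - \bar x)(u_i - \bar u)$ averages to $\sigma^2$, not $0$.

```python
import numpy as np

# Monte Carlo check of E[(beta_hat - beta) * sum (x_i - xbar)(u_i - ubar)] = sigma^2.
rng = np.random.default_rng(0)
n, alpha, beta, sigma = 50, 1.0, 2.0, 3.0
x = rng.uniform(0, 10, n)           # fixed regressors, held constant across replications
xc = x - x.mean()
Sxx = np.sum(xc ** 2)

reps = 200_000
u = rng.normal(0.0, sigma, size=(reps, n))      # errors with Var(u_i) = sigma^2
y = alpha + beta * x + u                        # one row = one simulated sample
beta_hat = (y @ xc) / Sxx                       # OLS slope for each replication
s = (u - u.mean(axis=1, keepdims=True)) @ xc    # sum (x_i - xbar)(u_i - ubar)
vals = (beta_hat - beta) * s

print(beta_hat.mean())   # close to beta = 2 (unbiasedness)
print(vals.mean())       # close to sigma^2 = 9, not 0
```

If $\hat \beta$ were independent of the sum, `vals.mean()` would hover near $0$; the fact that it sits at $\sigma^2$ is exactly the correlation the factorization argument ignores.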