In the book I am reading, the expectation of the estimated slope in simple linear regression is derived as follows:
$$ \begin{align*} \hat{\beta}_1 ~ &= ~ \frac{\frac{1}{n}\sum_{i=1}^n (x_i - \bar{x})(Y_i - \bar{Y})}{\frac{1}{n}\sum_{i=1}^n (x_i - \bar{x})^2} \\ E(\hat{\beta}_1) ~ &= ~ \frac{\frac{1}{n}\sum_{i=1}^n (x_i - \bar{x})\boldsymbol{E(Y_i - \bar{Y})}}{\frac{1}{n}\sum_{i=1}^n (x_i - \bar{x})^2} \\ \dots \end{align*} $$
The rest of the derivation is straightforward, but I don't understand why the following part is treated as a constant:
$$ ~ \frac{\frac{1}{n}\sum_{i=1}^n (x_i - \bar{x})}{\frac{1}{n}\sum_{i=1}^n (x_i - \bar{x})^2} $$
so that the expectation operator is applied only to $Y_i - \bar{Y}$. Isn't $x_i$ a random variable too?
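For what it's worth, I tried checking the claim numerically. This is just a sketch under the fixed-design assumption the book seems to make: the $x_i$ are held fixed across replications and only the noise in $Y_i$ is resampled; the model $Y_i = \beta_0 + \beta_1 x_i + \varepsilon_i$ and the particular values of $\beta_0$, $\beta_1$, and $\sigma$ below are ones I made up for the simulation, not from the book.

```python
import numpy as np

rng = np.random.default_rng(0)
beta0, beta1 = 2.0, 3.0               # made-up true coefficients
x = np.linspace(0.0, 10.0, 50)        # design points, held fixed across replications
xc = x - x.mean()                     # centered x, i.e. (x_i - x̄)

slopes = []
for _ in range(10_000):
    # Resample only the noise; x stays the same every time
    y = beta0 + beta1 * x + rng.normal(0.0, 1.0, size=x.size)
    # The estimator from the derivation above
    slopes.append(np.sum(xc * (y - y.mean())) / np.sum(xc**2))

# Averaging over replications approximates E(beta1_hat),
# which should be close to the true slope beta1 = 3
print(np.mean(slopes))
```

With the $x_i$ fixed, the average of the estimated slopes comes out very close to the true $\beta_1$, which at least matches the book's conclusion that $\hat{\beta}_1$ is unbiased when the expectation is taken over $Y$ only.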