I define $$Y_t:=\mu+X+\epsilon_t,\quad X\sim\mathcal N(0,\sigma_x^2),\quad \epsilon_t\sim\mathcal N(0,\sigma_e^2),\quad \mu\in\mathbb R,$$ and $$S_n:=n^{-1}\sum_{t=1}^n Y_t.$$
I want to compute the variance of $S_n$, and it seems natural to me to write:
$$ Var[S_n]=n^{-2}\,E\left[\Big(\sum_{t=1}^n Y_t\Big)^2-n^2\mu^2\right]=n^{-2}(n\sigma_x^2+n\sigma_e^2)=n^{-1}(\sigma_x^2+\sigma_e^2).$$
However, in the handout I am reading it is said to be $\sigma_x^2+n^{-1}\sigma_e^2$. What am I missing? Thank you, and merry Christmas.
I assume the random variables $X$ and $\epsilon_t$ are uncorrelated. Note that
$$ S_n = \frac {1} {n} \sum_{t=1}^n Y_t = \frac {1} {n} \sum_{t=1}^n (\mu + X + \epsilon_t) = \mu + X + \frac {1} {n} \sum_{t=1}^n\epsilon_t $$
Therefore
$$ Var[S_n] = Var[X] + \frac {1} {n^2} \sum_{t=1}^n Var[\epsilon_t] = \sigma^2_x + \frac {\sigma^2_e} {n}$$
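This can be checked with a quick Monte Carlo sketch (the values of $n$, $\sigma_x$, $\sigma_e$ and the choice $\mu=0$ are my own illustrative assumptions): draw one $X$ per replication, shared by all $n$ observations, and $n$ fresh errors per replication.

```python
import numpy as np

rng = np.random.default_rng(0)
n, sigma_x, sigma_e = 50, 2.0, 3.0   # arbitrary illustrative values
reps = 200_000

# One X per replication (common to all n observations in that replication),
# and n independent errors per replication; take mu = 0 w.l.o.g.
X = rng.normal(0.0, sigma_x, size=reps)
eps = rng.normal(0.0, sigma_e, size=(reps, n))
S = X + eps.mean(axis=1)

print(S.var())                       # empirical Var[S_n]
print(sigma_x**2 + sigma_e**2 / n)   # theoretical sigma_x^2 + sigma_e^2 / n
```

The empirical variance should land near $\sigma_x^2 + \sigma_e^2/n = 4.18$, not near $(\sigma_x^2+\sigma_e^2)/n = 0.26$, because $X$ does not average out across the $n$ observations.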
P.S. You will need to note the difference between $\sum_{t=1}^n\epsilon_t$ and $n\epsilon_1$:
If you are summing uncorrelated random variables: $$ Var\left[\sum_{t=1}^n\epsilon_t\right] = \sum_{t=1}^n Var[\epsilon_t] = \sum_{t=1}^n \sigma_e^2 = n\sigma_e^2 $$
but if you are summing a single random variable $n$ times (which is perfectly correlated with itself):
$$ Var\left[\sum_{t=1}^n\epsilon_1\right] = Var[n\epsilon_1] = n^2Var[\epsilon_1] = n^2\sigma_e^2$$
In general you need to deal with the covariance terms when computing the variance of a sum. Only when the summands are uncorrelated do the covariance terms vanish. Otherwise, as in the last example,
$$ Var\left[\sum_{t=1}^n\epsilon_1\right] = \sum_{t=1}^n Var[\epsilon_1] + \sum_{i\neq j}^n Cov[\epsilon_1, \epsilon_1] = n\sigma_e^2 + (n^2 - n)\sigma_e^2 = n^2\sigma_e^2$$
Intuitively speaking, uncorrelated random variables can move in different directions and partially offset each other, so the variance of their sum is smaller than in the case where you multiply a single random variable by $n$.
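The contrast between the two cases can also be seen numerically (a minimal sketch; the values of $n$ and $\sigma_e$ are my own assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)
n, sigma_e, reps = 10, 1.0, 200_000

eps = rng.normal(0.0, sigma_e, size=(reps, n))
sum_iid = eps.sum(axis=1)   # sum of n uncorrelated draws
scaled = n * eps[:, 0]      # one draw multiplied by n

print(sum_iid.var())        # close to n * sigma_e^2   = 10
print(scaled.var())         # close to n^2 * sigma_e^2 = 100
```

The factor-of-$n$ gap between the two empirical variances is exactly the $(n^2-n)\sigma_e^2$ of covariance terms above.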