Finding the variance of a function


How can I show that the variance of

$$y_t=\sum_{j=0}^\infty \varphi_j\varepsilon_{t-j}$$

where $\operatorname{E}(\varepsilon_t)=0$, $\operatorname{E}(\varepsilon^2_t) = \sigma^2$, $\operatorname{E}(\varepsilon_t\varepsilon_s)=0$ for $s\ne t$, and $\sum_{j=0}^\infty \varphi^2_j<\infty$, is the following:

$$\operatorname{Var}(y_t)=\sigma^2\sum_{j=0}^\infty \varphi^2_j$$

I am struggling with summation signs.


Accepted answer:

Since $\operatorname{E}(\varepsilon_t)=0,$ you have $\operatorname{var}(\varepsilon_t) = \operatorname{E}(\varepsilon_t^2)$ and $\operatorname{cov}(\varepsilon_t,\varepsilon_s) = \operatorname{E}(\varepsilon_t \varepsilon_s).$ And since $$ \operatorname{E}\left( \sum_{j=0}^N \varphi_j\varepsilon_{t-j} \right) = 0, $$ you have $$ \operatorname{var} \left( \sum_{j=0}^N \varphi_j\varepsilon_{t-j} \right) = \operatorname{E} \left( \left( \sum_{j=0}^N \varphi_j\varepsilon_{t-j} \right)^2 \right). $$

So you have $$ \left( \sum_{j=0}^N \varphi_j\varepsilon_{t-j} \right)^2 = \sum_{j=0}^N \varphi_j^2 \varepsilon_{t-j}^2 + 2 \sum_{k,\ell\,:\, k\,<\,\ell} \varphi_k\varphi_\ell \varepsilon_{t-k}\varepsilon_{t-\ell}. $$ Taking expected values of both sides and applying linearity of expectation, the cross terms vanish because $\operatorname{E}(\varepsilon_{t-k}\varepsilon_{t-\ell}) = 0$ for $k \ne \ell,$ and you get $$ \operatorname{E}\left(\left( \sum_{j=0}^N \varphi_j\varepsilon_{t-j} \right)^2\right) = \sum_{j=0}^N \varphi_j^2 \operatorname{E}(\varepsilon_{t-j}^2) = \sigma^2 \sum_{j=0}^N \varphi_j^2. $$ That the limit of the right side as $N\to\infty$ exists is given. Therefore the limit on the left exists and is the same. But next we have the question of whether the limit of the expression on the left equals the expected value of the square of the limit of the sum inside, i.e., can we say that $$ \lim_{N\to\infty} \operatorname{E}\left(\left( \sum_{j=0}^N \varphi_j \varepsilon_{t-j} \right)^2\right) = \operatorname{E}\left(\left( \lim_{N\to\infty} \sum_{j=0}^N \varphi_j\varepsilon_{t-j} \right)^2\right) \text{ ?} $$ I think the dominated convergence theorem will handle that.
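The finite-$N$ identity above is easy to check numerically. This is not part of the proof, just a sanity check: with the assumed example coefficients $\varphi_j = 0.5^j$ (square summable) and i.i.d. Gaussian noise, the sample variance of the truncated sum should be close to $\sigma^2 \sum_{j=0}^N \varphi_j^2$.

```python
import numpy as np

# Sanity check of Var(sum_j phi_j * eps_{t-j}) = sigma^2 * sum_j phi_j^2
# for a truncated sum, with assumed example coefficients phi_j = 0.5**j.
rng = np.random.default_rng(0)
sigma = 2.0
N = 60                                     # truncation order
phi = 0.5 ** np.arange(N + 1)              # square-summable coefficients

n_samples = 200_000
eps = rng.normal(0.0, sigma, size=(n_samples, N + 1))  # iid, mean 0, var sigma^2
y = eps @ phi                              # each entry is one realization of the sum

theoretical = sigma**2 * np.sum(phi**2)
empirical = y.var()
print(theoretical, empirical)              # the two should agree closely
```

With 200,000 samples the Monte Carlo error in the sample variance is well under one percent of the theoretical value.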

Another answer:

$$E[\lvert y_t\rvert] \leq E \left[ \sum_{j=0}^\infty \lvert\varphi_j\varepsilon_{t-j}\rvert \right] = \sum_{j=0}^\infty \lvert\varphi_j \rvert E[\lvert\varepsilon_{t-j}\rvert] \leq \sigma \sum_{j=0}^\infty \lvert\varphi_j\rvert$$

The first equality above is Tonelli (or monotone convergence). The last inequality comes from Cauchy-Schwarz: $E[\lvert\varepsilon_{t-j}\rvert] \leq (E[\varepsilon_{t-j}^2])^{1/2} = \sigma$.

I am fairly certain that you need absolute summability of $(\varphi_j)$, not just square summability. Assuming that you have absolute summability, $y_t$ is integrable by the chain of relations above. So then by Fubini (or dominated convergence),

$$E[y_t] = \sum_{j=0}^\infty \varphi_j E[\varepsilon_{t-j}] = 0$$

We also need $E[y_t^2]$ to compute the variance. Just use the same arguments as above to justify interchanging expectation with the double infinite sum: $$E[y_t^2] = \sum_{j=0}^{\infty}\sum_{k=0}^\infty\varphi_j\varphi_k\underbrace{E[\varepsilon_{t-j}\varepsilon_{t-k}]}_{\sigma^2 \text{ if } j = k, \ 0 \text{ otherwise} }$$ Hence $E[y_t^2] = \sigma^2\sum_{j=0}^{\infty}\varphi_j^2$, and since $E[y_t] = 0$, this is exactly $\operatorname{Var}(y_t)$.
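The collapse of the double sum has a convenient matrix reading, sketched below for a truncated sum with the assumed example coefficients $\varphi_j = 0.5^j$: since the noise terms are uncorrelated with common variance $\sigma^2$, their covariance matrix is $\sigma^2 I$, so the double sum is the quadratic form $\varphi^\top (\sigma^2 I) \varphi = \sigma^2 \sum_j \varphi_j^2$.

```python
import numpy as np

# The double sum  sum_j sum_k phi_j phi_k E[eps_{t-j} eps_{t-k}]  written as a
# quadratic form: E[eps_{t-j} eps_{t-k}] = sigma^2 * delta_{jk}, i.e. the
# covariance matrix of the noise vector is sigma^2 * I.
sigma = 2.0
phi = 0.5 ** np.arange(61)            # assumed example coefficients (truncated)
cov = sigma**2 * np.eye(phi.size)     # E[eps_{t-j} eps_{t-k}] for iid noise
double_sum = phi @ cov @ phi          # phi^T (sigma^2 I) phi
closed_form = sigma**2 * np.sum(phi**2)
print(double_sum, closed_form)        # identical up to floating-point rounding
```

The off-diagonal zeros of `cov` are what kill every $j \ne k$ term, which is the matrix version of the underbrace in the formula above.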