I've seen the following written on some lecture slides:
Suppose $$\Delta x_{t+1} = x_{t+1} - x_{t} = \varepsilon_{t} \tag{1}$$ where the $\varepsilon_t$ are iid noise terms. Then $$x_t = \sum_{j=0}^\infty \varepsilon_{t-j}. \tag{2}$$
I understand that we can symbolically arrive at $(2)$ by writing
$$x_t = \varepsilon_{t} + x_{t-1} = \sum_{j=0}^T \varepsilon_{t-j} + x_{t-T-1}$$
and so on. But:
- How do we know that $x_{t-T-1}\to0$ almost surely as $T\to\infty$ (I assume this is what we need)?
- What does the expression $\sum_{j=0}^\infty \varepsilon_{t-j}$ mean exactly?
Note that if the $\varepsilon_{t}$ are i.i.d. standard Gaussian, for example, then $P(\varepsilon_{t} > 1 \text{ infinitely often})=1$ and $P(\varepsilon_{t} < -1 \text{ infinitely often})=1$, by the second Borel-Cantelli lemma (the events are independent with constant positive probability). In particular the summands do not tend to $0$, so $\sum_{j=0}^\infty \varepsilon_{t-j}$ should be almost surely divergent!
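To make the divergence concrete, here is a quick numerical sketch (my own illustration, not from the slides): simulate many paths of the partial sums $S_T = \sum_{j=0}^T \varepsilon_{t-j}$ for i.i.d. standard Gaussian noise and check that their variance grows like $T+1$ rather than settling down.

```python
import numpy as np

# Sketch: partial sums S_T of iid standard Gaussian shocks.
# If the series converged, Var(S_T) would have to stabilize;
# instead Var(S_T) = T + 1 grows without bound.
rng = np.random.default_rng(0)
n_paths, T = 10_000, 1_000
eps = rng.standard_normal((n_paths, T))

# Row k of partial_sums holds S_0, S_1, ..., S_{T-1} for path k.
partial_sums = eps.cumsum(axis=1)

print(partial_sums[:, 99].var())   # roughly 100  (= T + 1 at T = 99)
print(partial_sums[:, 999].var())  # roughly 1000 (= T + 1 at T = 999)
```

So the partial sums spread out indefinitely, consistent with the Borel-Cantelli argument: the series cannot converge in $L^2$, let alone almost surely.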