Variance of random walk model?


I'm taking my second term of statistics, and I find myself obsessed with an unnecessary detail...again.

As follows:

$$Y_t=\rho Y_{t-1}+u_t$$

That is to say, we are working with a time series. We assume our present value depends on our previous value. $\rho$ is assumed to satisfy $|\rho|<1$, so the effects of values close in time are greater than the effects of values in the distant past.

$u_t$ is assumed to be a purely random term. The $u_t$s are all assumed to be normally distributed, with zero mean and unit variance.

$Y_0$ (our starting value) is assumed to be 0.

My textbook then claims (out of nowhere) that the variance of this expression is equal to:

$$\frac {1}{1-\rho^2}$$

That's what my question is all about: I am trying to figure out how to reach this expression.
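As a sanity check (not part of the textbook's argument), the claimed stationary variance can be verified by simulation. The sketch below assumes $u_t \sim N(0,1)$ and $Y_0 = 0$ as stated above; the choice $\rho = 0.5$ and the sample sizes are arbitrary illustrative values:

```python
import random

def simulate_ar1(rho, t_max, seed=0):
    """Simulate Y_t = rho * Y_{t-1} + u_t with Y_0 = 0 and u_t ~ N(0, 1)."""
    rng = random.Random(seed)
    y = 0.0
    for _ in range(t_max):
        y = rho * y + rng.gauss(0.0, 1.0)
    return y

rho = 0.5
# Draw many independent realisations of Y_t for large t, then compare
# the sample variance with the claimed value 1 / (1 - rho**2).
samples = [simulate_ar1(rho, 200, seed=s) for s in range(20000)]
mean = sum(samples) / len(samples)
var = sum((y - mean) ** 2 for y in samples) / len(samples)
print(var)  # close to 1 / (1 - 0.25) = 1.333...
```

The sample variance lands near $1/(1-\rho^2)$, not near $1/(1-\rho)^2$, which already suggests the textbook's formula is the right one.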

Below, I will describe my (so far unsuccessful) attempt at doing so.


Our first formula is

$$Y_t=\rho Y_{t-1}+u_t$$

This can be rewritten as:

$$Y_t=\sum {\rho^{n-t}u_t}$$

This way, $\rho$ is raised to the power of $0$ at $u_n$, $1$ at $u_{n-1}$, $2$ at $u_{n-2}$, and so forth; $\rho$ will be raised to the power of $n-1$ at our starting value.

Using the formula for geometric sums, this can be rewritten as:

$$Y_t=\frac{u_t-\rho^n u_t}{1-\rho}$$

Assuming that $n$ approaches infinity (knowing that $|\rho|<1$), we can rewrite this as:

$$Y_t=\frac{u_t}{1-\rho}$$

We take the variance of this expression:

$$V(Y_t)=\frac{\sigma^2}{(1-\rho)^2}$$

Finally, according to the text we have unit variance. I think this means that the variance of each $u_t$ should equal 1.

This would leave us with:

$$V(Y_t)=\frac{1}{(1-\rho)^2}$$

This still isn't equal to $\frac {1}{1-\rho^2}$!

Could someone please tell me what I'm doing wrong?


Best answer:

You are looking at an AR(1) time series model. By repeated substitution, you get

$$Y_t = \sum_{i=0}^{n-1} \rho^i u_{t-i} + \rho^n Y_{t-n}$$

The first term on the RHS is a geometric sum; the second goes to zero for $|\rho| < 1$ as $n \to \infty$. So take the limit, then take the variance to get the required result.

Note that you use the fact that $u_t$ is "purely random", i.e. the $u_t$ being independent for different $t$'s allows you to take the variance of the sum without any covariance terms popping out (that is, the property $\mathrm{Var}(X + Y) = \mathrm{Var}(X) + \mathrm{Var}(Y)$ applies).

To be explicit, your second-last line of computation should be

$$ \mathrm{Var} (Y_t) = \sum_{i=0}^{\infty} \rho^{2i} \mathrm{Var}(u_{t-i})$$

Edit: Here's the line-by-line:

$$ \begin{align}Y_t &= \rho Y_{t-1} + u_t\\ &= \rho (\rho Y_{t-2} + u_{t-1}) + u_t\\ &= \rho^2 Y_{t-2} + \rho u_{t-1} + u_t\\ & ... \\ &= \sum_{i=0}^{n-1} \rho^i u_{t-i} + \rho^n Y_{t-n} \\ &= \sum_{i=0}^{\infty} \rho^i u_{t-i}\\ \mathrm{Var}(Y_t) &= \mathrm{Var} \left (\sum_{i=0}^{\infty} \rho^i u_{t-i} \right)\\ &=\ \sum_{i=0}^{\infty} \mathrm{Var}\left( \rho^i u_{t-i} \right) \\ &=\ \sum_{i=0}^{\infty} \rho^{2i}\mathrm{Var}\left( u_{t-i} \right) \\ &=\ \sum_{i=0}^{\infty} \rho^{2i}\\ &= \frac{1}{1-\rho^2} \end{align} $$
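The final geometric-sum step above can be checked numerically: for $|\rho|<1$, the partial sums of $\sum_{i} \rho^{2i}$ converge rapidly to $1/(1-\rho^2)$. A minimal sketch (the value $\rho = 0.7$ is just an illustrative choice):

```python
rho = 0.7
# Partial sums of sum_{i=0}^{n-1} rho^(2i) converge to 1 / (1 - rho^2)
# since this is a geometric series in rho^2 with |rho^2| < 1.
partial = sum(rho ** (2 * i) for i in range(200))
closed_form = 1.0 / (1.0 - rho ** 2)
print(partial, closed_form)  # the two values agree to floating-point precision
```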

If you're not happy with writing the last line of the expression for $Y_t$ (since you're dealing with infinity, you might not be convinced that "$\rho^{\infty}Y_{t-\infty}$" is zero), a heuristic argument is that the decay from $\rho$ is geometric, which is faster than any growth that might possibly happen (which would be linear); in particular, $\rho^n Y_{t-n} \to 0$ almost surely. For a more rigorous treatment, refer to Batman's answer.

For a more intuitive (but still informal) argument justifying writing $Y_t$ as an infinite sum (when $|\rho|<1$), think about what happens to the deterministic recursion $Y_t = \rho Y_{t-1} + 1$ when $|\rho|<1$.
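To see that intuition concretely, the following sketch iterates the deterministic recursion and watches it settle at the geometric-series limit $1/(1-\rho)$ (here with the illustrative value $\rho = 0.5$):

```python
rho = 0.5
# The deterministic recursion y_t = rho * y_{t-1} + 1 settles at 1 / (1 - rho):
# each step shrinks the distance to the fixed point by a factor of |rho|,
# which illustrates why the infinite-sum representation is legitimate.
y = 0.0
for _ in range(200):
    y = rho * y + 1.0
print(y)  # 2.0 (= 1 / (1 - 0.5)) to floating-point precision
```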

Second answer:

Let's say $u_t \sim N(0, \sigma^2)$.

You have $E[Y_t] = E[\rho Y_{t-1} + u_t] = \rho E[Y_{t-1}] + E[u_t] = \rho E[Y_{t-1}] = 0$ (by induction from the initial condition $Y_0=0$).

You also have $E[Y_t^2] = E[(\rho Y_{t-1} + u_t)^2] = E[\rho^2 Y_{t-1}^2 + 2 \rho Y_{t-1} u_t + u_t^2] = \rho^2 E[Y_{t-1}^2] + 2 \rho E[Y_{t-1}]E[u_t] + E[u_t^2] = \rho^2 E[Y_{t-1}^2] + E[u_t^2] = \rho^2 E[Y_{t-1}^2] + \sigma^2$.

Solve these two recurrences (they are both linear recurrences, so use generating functions or guess and verify), and then use $\mathrm{Var}(Y_t) = E[Y_t^2] - (E[Y_t])^2$.
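An alternative to solving the second-moment recurrence in closed form is to iterate it numerically. The sketch below (with illustrative values $\rho = 0.6$, $\sigma^2 = 1$) shows $E[Y_t^2]$ converging to $\sigma^2/(1-\rho^2)$:

```python
rho, sigma2 = 0.6, 1.0
# Iterate the recurrence E[Y_t^2] = rho^2 * E[Y_{t-1}^2] + sigma^2,
# starting from E[Y_0^2] = 0. Since E[Y_t] = 0 for all t, this second
# moment is also the variance of Y_t.
m2 = 0.0
for _ in range(500):
    m2 = rho ** 2 * m2 + sigma2
print(m2)  # approaches sigma^2 / (1 - rho^2) = 1 / 0.64 = 1.5625
```

The fixed point of the recurrence, $m = \rho^2 m + \sigma^2$, is exactly $m = \sigma^2/(1-\rho^2)$, matching the textbook's formula when $\sigma^2 = 1$.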